Friday, July 16, 2010

[G] Q2'10 spam & virus trends from Postini


Official Google Enterprise Blog: Q2'10 spam & virus trends from Postini

Editor's note: The spam data cited in this post is drawn from the network of Google email security and archiving services, powered by Postini, which processes more than 3 billion email messages per day in the course of providing email security to more than 50,000 businesses and 18 million business users.

Spam and virus volumes this year have continued their upward trend. Q2’10 saw a sharp 16% increase in spam volume over Q1’10. Virus traffic increased a moderate 3% this quarter; however, Q2’10 virus volume was 260% higher than in Q2’09. These trends tell us that spammers are still extremely active, and their botnets produce high levels of spam and virus traffic.

By the numbers
Spam volume shot up 16% from Q1’10 to Q2’10. Overall, however, spam levels are down 15% from Q2’09.

Virus volume grew quickly at the beginning of the quarter, shooting up 90% from March to April, but then quickly dropped off. We saw only a modest 3% uptick from Q1’10 to Q2’10 at the aggregate level. Compared to Q2’09, this represents a 260% increase.

One interesting trend we noticed is that the size of individual spam messages rose 35% from Q1’10. This suggests that spammers are sending more image-based spam, as well as viruses as attachments.

New methods of attack
We have also seen a recent surge in obfuscated (hidden) JavaScript attacks. These messages are a hybrid of virus and spam. They are designed to look like Non-Delivery Report (NDR) messages, which are legitimate, but they contain hidden JavaScript that in some cases tries to do things the user may not be aware of.

In some cases, the message forwarded the user's browser to a pharma site or tried to download something unexpected, which is more virus-like. Because the messages contain JavaScript that generates code, they can change themselves and take multiple forms, making them challenging to identify.

Fortunately, our spam traps were receiving these messages early, providing our engineers with advance warning that allowed us to write manual filters and escalate to our anti-virus partners quickly. In addition, we updated our Postini Anti-Spam Engine (PASE) to recognize the obfuscated JavaScript and capture the messages based on the underlying code to ensure accuracy.
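To make the idea of filtering on the underlying code more concrete, here is a minimal, purely illustrative sketch of that kind of heuristic. The signal patterns and threshold are invented for this example and are not the actual PASE rules.

```python
import re

# Hypothetical signals often associated with obfuscated JavaScript in HTML
# email bodies; the list and threshold below are illustrative only.
OBFUSCATION_SIGNALS = [
    r"eval\s*\(",                # executing generated strings as code
    r"unescape\s*\(",            # decoding percent-encoded payloads
    r"String\.fromCharCode",     # assembling strings from character codes
    r"document\.write\s*\(",     # injecting generated markup
    r"(%[0-9a-fA-F]{2}){20,}",   # long runs of percent-encoded data
]

def obfuscation_score(html_body: str) -> int:
    """Count how many obfuscation signals appear in a message body."""
    return sum(1 for pattern in OBFUSCATION_SIGNALS if re.search(pattern, html_body))

def looks_like_obfuscated_js(html_body: str, threshold: int = 2) -> bool:
    """Flag a message when several signals fire at once."""
    return obfuscation_score(html_body) >= threshold
```

Because a check like this looks at the structure of the embedded script rather than its exact text, it is less sensitive to messages that change themselves from one copy to the next.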

The classics
Although they’ve added a few new tricks to their bag, spammers continue to exploit tried and true techniques, including:

• False Social Networking Messages
Social networking sites continue to be among the most frequently spoofed domains for the purpose of spreading phishing scams and virus downloaders. These messages do not actually come from social networks but look similar to legitimate social networking messages. Such messages often contain links to external websites which contain malicious content and/or attempt to harvest user login information. The Postini Anti-Spam Engine is very good at detecting such messages, but users should always be cautious when handling messages that appear to come from popular social networking sites.

• Current events
As always, spammers continue to spoof major news stories, and this quarter, we saw an increase in spam involving the World Cup. Here is one example of a virus downloader that our spam filters caught:

• Shipping scams
The shipping scam is a favorite of spammers. This quarter we saw a more widespread outbreak of messages claiming to be from major shipping companies, because spammers get a higher success rate with this type of scam. The subject line made the message look like an invoice, and the message body contained random text, such as news stories, that did not look particularly "spammy." Each message had an attached zip file that presumably was intended to contain some sort of virus payload; however, the data was corrupt and did not pose any actual threat.

Stay safe from phishing scams
With the global economy continuing to lag, we have seen a continued upswing in “friend-in-need” phishing attempts, where hackers break into the email account of unsuspecting users and then hand-type a message to send to the victim’s email contacts.

The most common message told the story of the sender being mugged while traveling abroad and requested that money be sent to help them get home. The hacker is preying on the generosity of the victim's friends in the hope that one or more of them will send money. These messages can be difficult for spam filters to identify since they are hand-typed and not sent in bulk. It goes without saying, but be wary of emails requesting money – regardless of the sender.

In response to these outbreaks, our engineers have released several updated filters to combat new spam waves.

Conclusion
Spam volume fluctuates in the short term, but overall, spam volume has been relatively flat for the last three quarters. Spammers continue to exploit techniques with proven results, but as we have seen with the obfuscated JavaScript attacks, they are always experimenting with new techniques to stay ahead of security measures. Google Postini Services customers are protected from the brunt of these increases in spam volume.

For more information on how Google’s security and archiving services can help your business stay safe and compliant, please visit www.google.com/postini.

Posted by Adam Hollman and Gopal Shah, Google Postini Services team

URL: http://googleenterprise.blogspot.com/2010/07/q210-spam-virus-trends-from-postini.html

[G] Deeper understanding with Metaweb


Official Google Blog: Deeper understanding with Metaweb

Over time we’ve improved search by deepening our understanding of queries and web pages. The web isn’t merely words—it’s information about things in the real world, and understanding the relationships between real-world entities can help us deliver relevant information more quickly. Today, we’ve acquired Metaweb, a company that maintains an open database of things in the world. Working together we want to improve search and make the web richer and more meaningful for everyone.

With efforts like rich snippets and the search answers feature, we’re just beginning to apply our understanding of the web to make search better. Type [barack obama birthday] in the search box and see the answer right at the top of the page. Or search for [events in San Jose] and see a list of specific events and dates. We can offer this kind of experience because we understand facts about real people and real events out in the world. But what about [colleges on the west coast with tuition under $30,000] or [actors over 40 who have won at least one oscar]? These are hard questions, and we’ve acquired Metaweb because we believe working together we’ll be able to provide better answers.

In addition to our ideas for search, we’re also excited about the possibilities for Freebase, Metaweb’s free and open database of over 12 million things, including movies, books, TV shows, celebrities, locations, companies and more. Google and Metaweb plan to maintain Freebase as a free and open database for the world. Better yet, we plan to contribute to and further develop Freebase and would be delighted if other web companies use and contribute to the data. We believe that as Freebase improves, it will become a tremendous resource for making the web richer for everyone. And to the extent the web becomes a better place, this is good for webmasters and good for users.

We look forward to working with the talented Metaweb team. We’ll be sure to share details on our progress in the coming months. In the meantime, if you’re interested to learn more about Metaweb’s technology, we encourage you to check out a helpful video they’ve posted on their blog.

Posted by Jack Menzel, Director of Product Management
URL: http://googleblog.blogspot.com/2010/07/deeper-understanding-with-metaweb.html

[G] Google Apps highlights – 7/16/2010


Official Google Blog: Google Apps highlights – 7/16/2010

This is part of a regular series of Google Apps updates that we post every couple of weeks. Look for the label “Google Apps highlights” and subscribe to the series. - Ed.

Over the last couple of weeks we rolled out some nice updates in Gmail, improved on Google forms, added new mobile device security features and celebrated many new applications recently added to the Apps Marketplace. Enjoy!

Rich text signatures in Gmail
You’ve been able to add plain text signatures to your messages in Gmail for some time, but last Thursday we stepped it up a notch by adding rich text signatures, one of our most requested features. Now you can create signatures with different fonts, font sizes, font colors, links and images. The feature also supports different signatures for different custom “From:” addresses that you’ve configured. Head over to the “Settings” page in Gmail to get started.


HTML5 features in Gmail on Safari
Gmail has recently added some new interactive features, like drag-and-drop attachments and images, and new windows that “outlive” your original Gmail window. These features are possible thanks to HTML5, but until this week, Safari users have been left out. All of that changed on Monday, and users of Safari 5 can now enjoy these helpful HTML5 features, too.

Simpler page navigation in Google forms
With Google forms (part of Google Docs), you can quickly create and send surveys to your contacts or publish surveys on the web. We started out offering simple one-page forms, but last week we made some big improvements to our logic branching capabilities. Now you can easily create multi-page surveys that adapt depending on how people answer your questions. Try it out for yourself in the form-based choose your own adventure game that we built.


More security controls for mobile devices
Businesses and schools using Google Apps often want the ability to centrally manage mobile devices that their users connect to Google Apps, and on Tuesday we rolled out several new device management capabilities. Organizations can now require devices to use data encryption, auto-wipe devices after a certain number of failed password attempts, require device passwords to be changed periodically and more.


Apps Tuesday: 10 new additions to the Apps Marketplace
Some technology companies burden IT departments with software patches and fixes every month, but our cloud computing approach means that customers get improvements automatically with Google Apps. In addition to all the new features built by Google, this month we added 10 new applications from third-party software companies to the Apps Marketplace. Third-party apps integrate seamlessly with Google Apps and can be activated by administrators with just a couple clicks.

Who’s gone Google?
More and more organizations are getting with the times and switching to Google Apps. Today we welcome Vektrex, Rypple, XAOP, Limbach Facility Services, Riley Chartered Accounts and tens of thousands of other businesses worldwide that have moved to the cloud with Google since my last update here.

More universities are preparing to reopen their doors in the fall with new campus technology tools, too. We’re excited to have University of Minnesota College of Liberal Arts, Universitat de Girona in Spain and The College of St. Scholastica join us!

I hope you're making the most of these new features, whether you're using Google Apps with friends, family, coworkers or classmates. For more details and updates from the Apps team, head on over to the Google Apps Blog.

Posted by Jeremy Milo, Google Apps Marketing Manager
URL: http://googleblog.blogspot.com/2010/07/google-apps-highlights-7162010.html

[G] A Shout Out About Annotations


Google Analytics Blog: A Shout Out About Annotations

A few months ago at the Google I/O conference, we were approached by Zach Steindler, a co-founder at Olark (a way to gain customer insight and sell better through live chat), who was raving about Google Analytics Annotations. He had such a great business case, we decided to let him rave here. Enjoy, and thanks Zach.

Making good business decisions is hard, and making the right one is even harder. At Google I/O I realized many people use Google Analytics but they aren’t familiar with the recent annotations feature that has helped us make smarter business decisions.

When we look at our Google Analytics, we don’t really care if our numbers are up or down; what we really want to know is why. This means asking a lot of questions, particularly questions about what happened when, like:

“How long has that ad trial been running?”
“When did we release that update to the website?”
“What happened after that last blog post?”

To answer these questions I might have to dig through e-mails, commit logs, and probably end up pestering my teammates for an hour while we try to figure out what happened when. But this is serious stuff; if our numbers went up 50% in a week, you better believe we want to know why so we can do more of it!

Annotations are exactly the tool we needed to answer these questions without having to pester teammates and dig through the past. If you’re not familiar with them, annotations let you add notes about events that happened on a particular day. These notes are then visible across the different views in Google Analytics, so you can see how those events impacted your page views, goals, or whatever else you are tracking.

You can annotate whatever you want; we annotate things like external publicity, major updates to our site, blog posts, even service issues, to see how all these events are impacting our business.

We’re big believers in the power of open data; everyone on the team has access to Google Analytics and can contribute events they think are important. This has been incredibly useful for us. Now I can answer many why questions for myself, just by looking at the data other people have contributed. When I do need to interrupt the team, it’s because I have big-picture questions, not because I need them to help me track down dates. Also, you start to notice a rhythm of events and how changes in that rhythm impact your business. As a bonus, we now have a cool timeline of events the team thought were important, which is useful for retrospectives and end-of-period reports.

We’re far from being able to make perfect decisions with perfect knowledge, but annotations have made it much easier to answer the why questions so we can make good business decisions.

Posted by Patricia Boswell, Google Analytics Team
URL: http://analytics.blogspot.com/2010/07/shout-out-about-annotations.html

[G] The most World Cup-crazy countries


Official Google Blog: The most World Cup-crazy countries

Last weekend, Spain won the 2010 World Cup. For the month leading up to the final, Googlers joined the world in cheering for their favorite teams. Around our campus, games were watched on computer screens and on cafe video screens. Code went unwritten. Emails went unanswered.

Throughout the world, real life also slowed during World Cup matches. Which teams had the most loyal fans? Which game captured the attention of the world the most? To answer these questions, we looked at counts of queries using Google. People search using Google day and night—except for football fans when a game is on.

These graphs show the volume of Google queries for some of the World Cup matches:


On June 15, as Brazil played its first game against North Korea, the volume of queries from Brazil, shown using a red line, plummeted when the match began, spiked during halftime, fell again and then quickly rose after the match finished.


Queries from Spain during its June 25 game against Chile also decreased during the game, except during halftime. After some post-game querying, Spaniards went to sleep and queries dropped again.

To measure which country has the most loyal fans, we computed the proportional drop in queries from each country during its team’s matches, compared with normal query volume. Brazil topped the charts, with queries from that country dropping by half during its football games. Football powerhouse and third-place winner Germany came in second, followed by the Netherlands and South Korea.
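As a rough sketch of how such a figure can be computed, the "loyalty" of a country's fans can be expressed as the fractional drop in query rate during matches relative to a normal baseline. The numbers below are made up for illustration; this is not Google's actual methodology or data.

```python
def proportional_drop(match_counts: list[float], baseline_counts: list[float]) -> float:
    """Fraction by which query volume fell during matches versus normal days.

    match_counts:    query counts observed while the country's team was playing
    baseline_counts: query counts for the same times of day on ordinary days
    """
    match_rate = sum(match_counts) / len(match_counts)
    baseline_rate = sum(baseline_counts) / len(baseline_counts)
    return 1.0 - match_rate / baseline_rate

# Toy example: a country that normally averages 100 (arbitrary units) but
# averages 50 while its team plays has a proportional drop of 0.5 --
# queries fell by half, as described for Brazil above.
print(proportional_drop([50, 48, 52], [100, 99, 101]))  # ~0.5
```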


In fourth place, South Koreans were remarkably loyal even though some games began at 3:30am Seoul time. Japan, Australia and New Zealand, also affected by time-zone differences, expressed much less interest. A few countries searched more, not less. But only Honduras and North Korea increased significantly.

During the knockout rounds, each match’s losing team is eliminated from the tournament. As fewer and fewer teams remain, we expected increased worldwide interest in each remaining game. Unsurprisingly, worldwide queries slowed the most during the final game between the Netherlands and Spain, but the round-of-16 Germany v. England game had the second largest query decrease. Semi-finals and quarter-finals were all popular except for semi-final Uruguay v. Netherlands, during which queries actually increased.


In Latin American countries, search volume dropped more steeply leading into and out of matches, while in Europe searches ramped down and up more gradually. Of course, for games that went into extra time and penalty shootouts, the drops deepened the longer the match went on, including Paraguay v. Japan, Netherlands v. Spain, and Uruguay v. Ghana, as seen here:


Finally, no blog post about the World Cup would be complete without a look at what did drive people to search—after the final match, of course. Although he won neither the Golden Boot (for the most World Cup goals) nor the Golden Ball (for best player) last weekend, Spain’s David Villa is winning in search compared to the recipients of those two honors—Germany’s Thomas Müller and Uruguay’s Diego Forlán—and Dutch midfielder Wesley Sneijder. All of these men competed for the Golden Boot with five goals apiece.

Similar to when Carles Puyol headed in the single goal that put Spain in the final, people flocked to the web to search for information on Andres Iniesta, the “quiet man” who scored the one goal that led his country to its first World Cup championship. They were also interested in Dani Jarque, a Spanish footballer who died last fall and whose name was emblazoned on Iniesta’s undershirt, which he displayed after his goal. And after the match, searches for keeper Iker Casillas skyrocketed to a higher peak than any other popular footballer—including household names like Ronaldo, Villa and Messi—reached during the Cup. Sometimes, it seems, goalies get the last word.

We hope you enjoyed our series of posts on World Cup search trends and we’ll see you in Brazil in 2014!

Posted by Jeffrey D. Oldham, Software Engineer and Robert Snedegar, Technical Solutions Engineer
URL: http://googleblog.blogspot.com/2010/07/most-world-cup-crazy-countries.html

[G] Will Using Google Analytics Have A Negative Effect On My Ranking?


Google Analytics Blog: Will Using Google Analytics Have A Negative Effect On My Ranking?

Good news from the horse’s mouth. We don’t mean to call Matt Cutts a horse, but, well, if you know him, you know what we mean. Matt heads the webspam team here at Google and also speaks on behalf of Google answering questions about ranking and results on Google’s search engine. When people have questions about things Google-search-related, Matt is the one who answers.

He posts regular video blogs to the Google Webmaster Help channel answering your questions. So we were very pleased when he recently answered the question, “Will using Google Analytics have a negative effect on my ranking?” In short, the answer is no, especially now that we’ve launched the asynchronous tracking code. Take a look at the short video:



Thanks Matt!

Posted by Jeff Gillis, Google Analytics Team
URL: http://analytics.blogspot.com/2010/07/will-using-google-analytics-have.html

[G] Imagery Update - Week of July 12


Google LatLong: Imagery Update - Week of July 12


Hot on the heels of early July's imagery update, we've got another batch of new images ready for people to dive into this summer. Have fun exploring the world, from Queens, NY to Queen Hatshepsut's temple in Egypt.


Queens, New York


Queen Hatshepsut's temple in Egypt

High Resolution Aerial Updates:
USA: Salt Lake City, Fort Worth, Abilene, Cleveland, New Jersey, and the New York boroughs of Staten Island, Brooklyn, Queens, and the Bronx
New Zealand: Otaki

Countries receiving High Resolution Satellite Updates:
Mexico, Cuba, Honduras, Jamaica, Haiti, Dominican Republic, Venezuela, Colombia, Peru, Brazil, Chile, Uruguay, Argentina, Norway, Morocco, Algeria, Libya, Egypt, Nigeria, Democratic Republic of the Congo, Mozambique, Russia, Iran, Uzbekistan, Pakistan, India, Nepal, Bangladesh, China, Vietnam, The Philippines, and Australia

Countries receiving Medium Resolution Satellite Updates:
Brazil, Sweden, Ethiopia, Tanzania, Tajikistan, and Laos

For a complete picture of where we updated imagery, download this KML for viewing in Google Earth.

Posted by Matt Manolides, Senior Geo Data Strategist
URL: http://google-latlong.blogspot.com/2010/07/imagery-update-week-of-july-12.html

Thursday, July 15, 2010

[G] Tailored and Effective “Third Way”


Google Public Policy Blog: Tailored and Effective “Third Way”

Posted by Richard Whitt, Washington Telecom and Media Counsel

Today we submitted comments supporting the FCC’s proposed Third Way. In a letter to the agency two months ago, Google, along with other technology companies, expressed the view that the Third Way framework “will create a legally sound, light-touch regulatory framework that benefits consumers, technology companies and broadband Internet access providers.” We still believe this is a true statement.

The recent Comcast decision re-opened some fundamental questions about the FCC’s jurisdiction over broadband Internet services. On balance, the Third Way framework, which would apply in a limited manner to only the transmission component of broadband Internet service, presents a predictable, effective and tailored approach.
URL: http://googlepublicpolicy.blogspot.com/2010/07/tailored-and-effective-third-way.html

[G] Our op-ed: Regulating what is “best” in search?


Google Public Policy Blog: Our op-ed: Regulating what is “best” in search?

Posted by Adam Kovacevich, Senior Manager, Public Policy Communications

Google’s Marissa Mayer wrote in the Financial Times today about the impact for consumers of governments potentially regulating search results. Because the article is behind the FT’s paywall, we thought we’d share the complete text here (also, check out search analyst Danny Sullivan’s take on this issue).

Do not neutralise the web’s endless search
By Marissa Mayer

Published: July 14 2010

Think about the word “jaguar” – what comes to mind? The animal? The car? A sports team? Now ask yourself: what is the best piece of literature ever written about jaguars? What about the best piece of literature ever written containing the word jaguar?

How do you define what is best? What characteristics and attributes should be taken into account? Which should not? There is a debate brewing, reported in the Financial Times this week, about whether standards are needed to ensure fairness – or what is “best” – in internet search results.

Search engines use algorithms and equations to produce order and organisation online where manual effort cannot. These algorithms embody rules that decide which information is “best”, and how to measure it. Clearly defining which of any product or service is best is subjective. Yet in our view, the notion of “search neutrality” threatens innovation, competition and, fundamentally, your ability as a user to improve how you find information.

When Google was launched in 1998, its fundamental innovation was the PageRank algorithm. It was a new and helpful tool in helping users decide which was the best information available – and one of many hundreds that have since been deployed by search engines to improve the ranking and relevance of their results.

Yet searching the web has never been more complex. Type “World Cup” into Google today and you will see millions of returns, ranging from recent news articles to images of players. Often the answer is not a web page: sports scores, news, pictures and tweets about matches are included. Such results stem from an upgrade in Google’s technology launched in 2007, which made it possible to include media such as maps, books, or videos on a results page. Our goal is to provide our users with the best and most effective answer. Consider the search “how to tie a bowtie”. Answers to these types of searches benefit from the inclusion of different media (diagrams, videos), sometimes from a Google service (books, maps).

To make matters more difficult, a quarter of all daily searches on Google have never been seen before. Each presents a new challenge, so our engineers need constantly to improve and update our algorithms. On average, we make one or two changes every day. But even then they sometimes require a more hands-on approach. For example, we occasionally have to flag malicious programmes manually, removing links to child pornography or spam sites.

The world of search has developed a system in which each search engine uses different algorithms, and with many search engines to choose from users elect to use the engine whose algorithm approximates to their personal notion of what is best for the task at hand. The proponents of “search neutrality” want to put an end to this system, introducing a new set of rules in which governments would regulate search results to ensure they are fair or neutral.

Here the practical challenges would be formidable. What is fair in terms of ordering? An alphabetical listing? Equally, new results will need to be incorporated – new web pages, but also new media types such as tweets or audio streams. Without competition and experimentation between companies, how could the rules keep up? There is no doubt that this will stifle the advance of the science around search engines.

Abuse would be a further problem. If search engines were forced to disclose their algorithms and not just the signals they use, or, worse, if they had to use a standardised algorithm, spammers would certainly use that knowledge to game the system, making the results suspect.

But the strongest argument against rules for “neutral search” is that they would make the ranking of results on each search engine similar, creating a strong disincentive for each company to find new, innovative ways to seek out the best answers on an increasingly complex web. What if a better answer for your search, say, on the World Cup or “jaguar” were to appear on the web tomorrow? Also, what if a new technology were to be developed as powerful as PageRank that transforms the way search engines work? Neutrality, by forcing standardised results, removes the potential for innovation and turns search into a commodity.

We know that Google plays an important role in accessing information. We also welcome scrutiny and want to ensure everyone understands how we work. Yet we believe the best answer for a particular search changes constantly. It changes because the web changes, because users’ expectations and tastes evolve and because the media never stay still. Yet proponents of search neutrality are effectively saying that they know what is “best” for you. We think consumers should be able to decide for themselves – with an array of internet search engines to choose from, each providing their very best.

The writer is vice-president of search product and user experience at Google.
URL: http://googlepublicpolicy.blogspot.com/2010/07/our-op-ed-regulating-what-is-best-in.html

[G] Google News changes reflect your feedback


Google News Blog: Google News changes reflect your feedback

Posted by Chris Beckmann, Product Manager

Two weeks ago we gave the Google News homepage a new look and feel with enhanced customization, discovery and sharing. This redesign was our biggest since Google News launched in beta in 2002.

Some of you told us that you really liked it, especially how the "News for you" section lets you see a stream of articles tailored to the interests you specify. The positive usage data we saw during our months-long tests of the redesign has continued since we introduced it to all users of the U.S. English edition, and hundreds of thousands of you have already customized your Google News homepages. But some of you wrote in to say you missed certain aspects of the previous design, such as the ability to see results grouped by section (U.S., Business, etc.) in two columns.

At Google, we’re all about launching and iterating, so we've been making improvements to the design in response to your feedback. For example, we're now showing the entire cluster of articles for each story, rather than expanding the cluster when you hover your mouse over it. We've given you the ability to hide the weather forecast from your local news section. We made the option to switch between List view and Section view more obvious. And today we’re adding a third option in "News for you": Two-column view, which shows the three top stories from each section and looks like this:



A key goal of the redesign was to give you more ways to personalize your Google News, and these changes add even more choices. A heartfelt thanks to all of you who have shared your thoughts with us. Please keep letting us know what you think, and we’ll keep working to make Google News even better.
URL: http://googlenewsblog.blogspot.com/2010/07/google-news-changes-reflect-your.html

[G] Google PhD Fellowships go international


Official Google Research Blog: Google PhD Fellowships go international

Posted by Alfred Spector, VP of Research and Special Initiatives

(Cross-posted from the Official Google Blog)

We introduced the Google Fellowship program last year in the United States to broaden our support of university research. The students who were awarded the 2009 fellowships were a truly impressive group, many having high profile internships this past summer and even a few with faculty appointments in the upcoming year.

Universities continue to be the source of some of the most innovative research in computer science, and in particular it’s the students that they foster who are the future of our field. This year, we’re going global and extending the fellowship program to Europe, Israel, China and Canada. We’re very happy to be continuing our support of excellence in graduate studies and offer our sincere congratulations to the following PhD students for receiving Google Fellowships in 2010:

Google European Doctoral Fellowships
  • Roland Angst, Google Europe Fellowship in Computer Vision (Swiss Federal Institute of Technology Zurich, Switzerland)
  • Arnar Birgisson, Google Europe Fellowship in Computer Security (Chalmers University of Technology, Sweden)
  • Omar Choudary, Google Europe Fellowship in Mobile Security (University of Cambridge, U.K.)
  • Michele Coscia, Google Europe Fellowship in Social Computing (University of Pisa, Italy)
  • Moran Feldman, Google Europe Fellowship in Market Algorithms (Technion - Israel Institute of Technology, Israel)
  • Neil Houlsby, Google Europe Fellowship in Statistical Machine Learning (University of Cambridge, U.K.)
  • Kasper Dalgaard Larsen, Google Europe Fellowship in Search and Information Retrieval (Aarhus University, Denmark)
  • Florian Laws, Google Europe Fellowship in Natural Language Processing (University of Stuttgart, Germany)
  • Cynthia Liem, Google Europe Fellowship in Multimedia (Delft University of Technology, Netherlands)
  • Ofer Meshi, Google Europe Fellowship in Machine Learning (The Hebrew University of Jerusalem, Israel)
  • Dora Spenza, Google Europe Fellowship in Wireless Networking (Sapienza University of Rome, Italy)
  • Carola Winzen, Google Europe Fellowship in Randomized Algorithms (Saarland University / Max Planck Institute for Computer Science, Germany)
  • Marek Zawirski, Google Europe Fellowship in Distributed Computing (University Pierre and Marie Curie / INRIA, France)
  • Lukas Zich, Google Europe Fellowship in Video Analysis (Czech Technical University, Czech Republic)
Google China PhD Fellowships
  • Fangtao Li, Google China Fellowship in Natural Language Processing (Tsinghua University)
  • Ming-Ming Cheng, Google China Fellowship in Computer Vision (Tsinghua University)
Google United States/Canada PhD Fellowships
  • Chong Wang, Google U.S./Canada Fellowship in Machine Learning (Princeton University)
  • Tyler McCormick, Google U.S./Canada Fellowship in Statistics (Columbia University)
  • Ashok Anand, Google U.S./Canada Fellowship in Computer Networking (University of Wisconsin)
  • Ramesh Chandra, Google U.S./Canada Fellowship in Web Application Security (Massachusetts Institute of Technology)
  • Adam Pauls, Google U.S./Canada Fellowship in Machine Translation (University of California, Berkeley)
  • Nguyen Dinh Tran, Google U.S./Canada Fellowship in Distributed Systems (New York University)
  • Moira Burke, Google U.S./Canada Fellowship in Human Computer Interaction (Carnegie Mellon University)
  • Ankur Taly, Google U.S./Canada Fellowship in Language Security (Stanford University)
  • Ilya Sutskever, Google U.S./Canada Fellowship in Neural Networks (University of Toronto)
  • Keenan Crane, Google U.S./Canada Fellowship in Computer Graphics (California Institute of Technology)
  • Boris Babenko, Google U.S./Canada Fellowship in Computer Vision (University of California, San Diego)
  • Jason Mars, Google U.S./Canada Fellowship in Compiler Technology (University of Virginia)
  • Joseph Reisinger, Google U.S./Canada Fellowship in Natural Language Processing (University of Texas, Austin)
  • Maryam Karimzadehgan, Google U.S./Canada Fellowship in Search and Information Retrieval (University of Illinois, Urbana-Champaign)
  • Carolina Parada, Google U.S./Canada Fellowship in Speech (Johns Hopkins University)
The students will receive fellowships consisting of full coverage of tuition, fees and stipend for up to three years. These students have been exemplary thus far in their careers, and we’re looking forward to seeing them build upon their already impressive accomplishments. Congratulations to all of you!
URL: http://googleresearch.blogspot.com/2010/07/google-phd-fellowships-go-international.html

[G] Preparing for emergencies with Google Earth Enterprise


Official Google Enterprise Blog: Preparing for emergencies with Google Earth Enterprise

Editor’s Note: Brant Mitchell is Associate Deputy Director of the State of Louisiana Department of Homeland Security. The State of Louisiana Department of Homeland Security is a Google Earth Enterprise customer that leverages Google Earth Enterprise for emergency preparedness, and it is now the first customer to create a Google Earth Enterprise globe specifically for the public.

For the last three years, the State of Louisiana has provided our first responder community secure access to Federal, State and local geospatial data and high resolution imagery of Louisiana through a Google Earth Enterprise client. In preparation for hurricane season, Louisiana is pleased to announce that we have launched the first public version of a Google Earth Enterprise platform.

Louisiana Earth was released as part of the state's "Get a Game Plan" campaign to assist citizens in creating evacuation plans. It provides access to all of the state's evacuation routes, sheltering points, historical hazard data and other information that is essential during an evacuation, such as the locations and available occupancy of hotels, gas stations, pharmacies, grocery stores, veterinary clinics and banks.

Louisiana Earth will also serve as a mechanism to relay critical data during disasters to help inform the public on the status of response and recovery efforts. Using the Deepwater Horizon Rig incident as an example, Louisiana is making available existing data that includes the latest oil sightings from aerial observations, oyster bed closures, and critical environmental data such as bird nesting areas.

During hurricanes and other natural disasters, Louisiana will be able to provide information such as the locations of points of distribution (PODs), food stamp offices, unemployment claims offices and disaster recovery centers, as well as the status of power outages by parish.


Finally, while the primary purpose is to utilize Louisiana Earth as a mechanism to provide critical data during emergencies, it will also be utilized to promote Louisiana. We will constantly be adding data that the public can use to take advantage of the many activities and events, such as festivals, that are available in Louisiana.

Louisiana Earth already has information on all of the state parks, including lodging accommodations, hiking trails and camping sites. Historical data and cultural events will also be included and will continue to be updated.

To access Louisiana Earth, go to laearth.la.gov.

Posted by Natasha Wyatt, Google Earth and Maps team
URL: http://googleenterprise.blogspot.com/2010/07/preparing-for-emergencies-with-google.html

[G] Use Chrome like a pro


Official Google Blog: Use Chrome like a pro

This week I sent a note to Googlers about some of the Chrome team's favorite extensions. So many of them asked if they could share the note with people outside the company that I thought I would just do it for them, so here it is.

We're proud of the Chrome browser and the great extensions that its developer community has created, and we hope you enjoy them! They can all be found at chrome.google.com/extensions.
  • Opinion Cloud: Summarizes comments on YouTube videos and Flickr photos to provide an overview of the crowd’s overall opinion.
  • Google Voice: All sorts of helpful Voice features directly from the browser. See how many messages you have, initiate calls and texts, or call numbers on a site by clicking on them.
  • AutoPager: Automatically loads the next page of a site. You can just scroll down instead of having to click to the next page.
  • Turn Off the Lights: Fades the page to improve the video-watching experience.
  • Google Dictionary: Double-click any word to see its definition, or click on the icon in the address bar to look up any word.
  • After the Deadline: Checks spelling, style, and grammar on your emails, blog, tweets, etc.
  • Invisible Hand: Does a quick price check and lets you know if the product you are looking at is available at a lower price elsewhere.
  • Secbrowsing: Checks that your plug-ins (e.g. Java, Flash) are up to date.
  • Tineye: Image search utility to find exact matches (including cropped, edited, or re-sized images).
  • Slideshow: Turns photo sites such as Flickr, Picasa, Facebook, and Google Images into slideshows.
  • Google Docs/PDF Viewer: Automatically previews PDFs, PowerPoint presentations, and other documents in Google Docs Viewer.
  • Readability: Reformats the page into a single column of text.
  • Chromed Bird: A nice Twitter viewing extension.
  • Feedsquares: Cool way of viewing your feeds via Google Reader.
  • ScribeFire: Full-featured blog editor that lets you easily post to any of your blogs.
  • Note Anywhere: Digital post-it notes that can be pasted and saved on any webpage.
  • Instant Messaging Notifier: IM on multiple clients.
  • Remember the Milk: The popular to-do app.
  • Extension.fm: Turns the web into a music library.
Posted by Jonathan Rosenberg, Senior Vice President, Product Management
URL: http://googleblog.blogspot.com/2010/07/use-chrome-like-pro.html

[G] Our 2010 EMEA CS4HS Awardees


Official Google Blog: Our 2010 EMEA CS4HS Awardees

We recently told you about CS4HS, our workshop program for high school and middle school computer science teachers in the U.S. We now have some additional news to share: our 2010 EMEA (Europe, the Middle East and Africa) CS4HS awardees have been selected!

The CS4HS program provides funding to European, Middle Eastern and African universities which work in tandem with local high schools and middle schools to engage pre-university students in computer science. Awardees meet strict requirements: the projects must be scalable, impact a wide cross-section of students from all backgrounds, conform to a “train the trainer” model and, most importantly, interest and inspire the next generation of computer scientists.

The application review team said that many of the projects receiving funding directly address the training of computer science teachers in secondary schools. They were particularly excited by the Makerere University and University of Cape Town projects, both of which propose to spread best practice amongst educators in Africa—a new region for CS4HS.

You can find a list of all 14 awardees and their projects on the EMEA section of the CS4HS site.

Posted by Caitlin Pantos, University Programmes Specialist
URL: http://googleblog.blogspot.com/2010/07/our-2010-emea-cs4hs-awardees.html

[G] Google PhD Fellowships go international


Official Google Blog: Google PhD Fellowships go international

We introduced the Google Fellowship program last year in the United States to broaden our support of university research. The students who were awarded the 2009 fellowships were a truly impressive group, many having high profile internships this past summer and even a few with faculty appointments in the upcoming year.

Universities continue to be the source of some of the most innovative research in computer science, and in particular it’s the students that they foster who are the future of our field. This year, we’re going global and extending the fellowship program to Europe, Israel, China and Canada. We’re very happy to be continuing our support of excellence in graduate studies and offer our sincere congratulations to the following PhD students for receiving Google Fellowships in 2010:

Google European Doctoral Fellowships
  • Roland Angst, Google Europe Fellowship in Computer Vision (Swiss Federal Institute of Technology Zurich, Switzerland)
  • Arnar Birgisson, Google Europe Fellowship in Computer Security (Chalmers University of Technology, Sweden)
  • Omar Choudary, Google Europe Fellowship in Mobile Security (University of Cambridge, U.K.)
  • Michele Coscia, Google Europe Fellowship in Social Computing (University of Pisa, Italy)
  • Moran Feldman, Google Europe Fellowship in Market Algorithms (Technion - Israel Institute of Technology, Israel)
  • Neil Houlsby, Google Europe Fellowship in Statistical Machine Learning (University of Cambridge, U.K.)
  • Kasper Dalgaard Larsen, Google Europe Fellowship in Search and Information Retrieval (Aarhus University, Denmark)
  • Florian Laws, Google Europe Fellowship in Natural Language Processing (University of Stuttgart, Germany)
  • Cynthia Liem, Google Europe Fellowship in Multimedia (Delft University of Technology, Netherlands)
  • Ofer Meshi, Google Europe Fellowship in Machine Learning (The Hebrew University of Jerusalem, Israel)
  • Dora Spenza, Google Europe Fellowship in Wireless Networking (Sapienza University of Rome, Italy)
  • Carola Winzen, Google Europe Fellowship in Randomized Algorithms (Saarland University / Max Planck Institute for Computer Science, Germany)
  • Marek Zawirski, Google Europe Fellowship in Distributed Computing (University Pierre and Marie Curie / INRIA, France)
  • Lukas Zich, Google Europe Fellowship in Video Analysis (Czech Technical University, Czech Republic)
Google China PhD Fellowships
  • Fangtao Li, Google China Fellowship in Natural Language Processing (Tsinghua University)
  • Ming-Ming Cheng, Google China Fellowship in Computer Vision (Tsinghua University)
Google United States/Canada PhD Fellowships
  • Chong Wang, Google U.S./Canada Fellowship in Machine Learning (Princeton University)
  • Tyler McCormick, Google U.S./Canada Fellowship in Statistics (Columbia University)
  • Ashok Anand, Google U.S./Canada Fellowship in Computer Networking (University of Wisconsin)
  • Ramesh Chandra, Google U.S./Canada Fellowship in Web Application Security (Massachusetts Institute of Technology)
  • Adam Pauls, Google U.S./Canada Fellowship in Machine Translation (University of California, Berkeley)
  • Nguyen Dinh Tran, Google U.S./Canada Fellowship in Distributed Systems (New York University)
  • Moira Burke, Google U.S./Canada Fellowship in Human Computer Interaction (Carnegie Mellon University)
  • Ankur Taly, Google U.S./Canada Fellowship in Language Security (Stanford University)
  • Ilya Sutskever, Google U.S./Canada Fellowship in Neural Networks (University of Toronto)
  • Keenan Crane, Google U.S./Canada Fellowship in Computer Graphics (California Institute of Technology)
  • Boris Babenko, Google U.S./Canada Fellowship in Computer Vision (University of California, San Diego)
  • Jason Mars, Google U.S./Canada Fellowship in Compiler Technology (University of Virginia)
  • Joseph Reisinger, Google U.S./Canada Fellowship in Natural Language Processing (University of Texas, Austin)
  • Maryam Karimzadehgan, Google U.S./Canada Fellowship in Search and Information Retrieval (University of Illinois, Urbana-Champaign)
  • Carolina Parada, Google U.S./Canada Fellowship in Speech (Johns Hopkins University)
The students will receive fellowships consisting of full coverage of tuition, fees and stipend for up to three years. These students have been exemplary thus far in their careers, and we’re looking forward to seeing them build upon their already impressive accomplishments. Congratulations to all of you!

Posted by Alfred Spector, VP of Research and Special Initiatives
URL: http://googleblog.blogspot.com/2010/07/google-phd-fellowships-go-international.html

[G] Preparing for emergencies with Google Earth Enterprise


Google LatLong: Preparing for emergencies with Google Earth Enterprise

(Cross-posted from the Google Enterprise Blog)

Editor’s Note: Brant Mitchell is Associate Deputy Director of the State of Louisiana Department of Homeland Security. The State of Louisiana Department of Homeland Security is a Google Earth Enterprise customer that leverages Google Earth Enterprise for emergency preparedness, and it is now the first customer to create a Google Earth Enterprise globe specifically for the public.

For the last three years, the State of Louisiana has provided our first responder community secure access to Federal, State and local geospatial data and high resolution imagery of Louisiana through a Google Earth Enterprise client. In preparation for hurricane season, Louisiana is pleased to announce that we have launched the first public version of a Google Earth Enterprise platform.

Louisiana Earth was released as part of the state's "Get a Game Plan" campaign to assist citizens in creating evacuation plans. It provides access to all of the state's evacuation routes, sheltering points, historical hazard data and other information that is essential during an evacuation, such as the locations and available occupancy of hotels, gas stations, pharmacies, grocery stores, veterinary clinics and banks.

Louisiana Earth will also serve as a mechanism to relay critical data during disasters to help inform the public on the status of response and recovery efforts. Using the Deepwater Horizon Rig incident as an example, Louisiana is making available existing data that includes the latest oil sightings from aerial observations, oyster bed closures, and critical environmental data such as bird nesting areas.

During hurricanes and other natural disasters, Louisiana will be able to provide information such as the locations of points of distribution (PODs), food stamp offices, unemployment claims offices and disaster recovery centers, as well as the status of power outages by parish.


Finally, while the primary purpose is to utilize Louisiana Earth as a mechanism to provide critical data during emergencies, it will also be utilized to promote Louisiana. We will constantly be adding data that the public can use to take advantage of the many activities and events, such as festivals, that are available in Louisiana.

Louisiana Earth already has information on all of the state parks, including lodging accommodations, hiking trails and camping sites. Historical data and cultural events will also be included and will continue to be updated.

To access Louisiana Earth, go to laearth.la.gov.

Posted by Natasha Wyatt, Google Earth and Maps team
URL: http://google-latlong.blogspot.com/2010/07/cross-posted-from-google-enterprise.html

[G] Translating Wikipedia


Official Google Blog: Translating Wikipedia

(Cross-posted from the Google Translate Blog)

We believe that translation is key to our mission of making information useful to everyone. For example, Wikipedia is a phenomenal source of knowledge, especially for speakers of common languages such as English, German and French where there are hundreds of thousands—or millions—of articles available. For many smaller languages, however, Wikipedia doesn’t yet have anywhere near the same amount of content available.

To help Wikipedia become more helpful to speakers of smaller languages, we’re working with volunteers, translators and Wikipedians across India, the Middle East and Africa to translate more than 16 million words for Wikipedia into Arabic, Gujarati, Hindi, Kannada, Swahili, Tamil and Telugu. We began these efforts in 2008, starting with translating Wikipedia articles into Hindi, a language spoken by tens of millions of Internet users. At that time the Hindi Wikipedia had only 3.4 million words across 21,000 articles—while in contrast, the English Wikipedia had 1.3 billion words across 2.5 million articles.

We selected the Wikipedia articles using a couple of different sets of criteria. First, we used Google search data to determine the most popular English Wikipedia articles read in India. Using Google Trends, we found the articles that were consistently read over time—and not just temporarily popular. Finally we used Translator Toolkit to translate articles that either did not exist or were placeholder articles or “stubs” in Hindi Wikipedia. In three months, we used a combination of human and machine translation tools to translate 600,000 words from more than 100 articles in English Wikipedia, growing Hindi Wikipedia by almost 20 percent. We’ve since repeated this process for other languages, to bring our total number of words translated to 16 million.
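A hypothetical sketch of that selection logic is shown below; the data structures, thresholds and helper names are invented for illustration and are not the actual tools or criteria used in the project.

```python
def select_candidates(weekly_reads, target_word_counts, stub_limit=500):
    """Pick English articles that are consistently popular and are missing or
    stub-length in the target-language Wikipedia.

    weekly_reads:       {article_title: [weekly query counts from the region]}
    target_word_counts: {article_title: word count of the target-language
                         article, if one exists}
    """
    candidates = []
    for title, counts in weekly_reads.items():
        if not counts:
            continue
        mean = sum(counts) / len(counts)
        consistently_read = all(c > 0.5 * mean for c in counts)  # not a one-off spike
        target_length = target_word_counts.get(title, 0)         # 0 = no article yet
        if consistently_read and target_length < stub_limit:
            candidates.append(title)
    return candidates
```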

We’re off to a good start but, as you can see in the graph below, we have a lot more work to do to bring the information in Wikipedia to people worldwide:

Number of non-stub Wikipedia articles by Internet users, normalized (English = 1)

We’ve also found that there are many Internet users who have used our tools to translate more than 100 million words of Wikipedia content into various languages worldwide. If you do speak another language we hope you’ll join us in bringing Wikipedia content to other languages and cultures with Translator Toolkit.

We presented these results last Saturday, July 10, at Wikimania 2010 in Gdańsk, Poland. We look forward to continuing to support the creation of the world’s largest encyclopedia and we can’t wait to work with Wikipedians and volunteers to create more content worldwide.

Posted by Michael Galvez, Product Manager
URL: http://googleblog.blogspot.com/2010/07/translating-wikipedia.html

[G] Focused on Creativity and Innovation - Imagination Group goes Google


Official Google Enterprise Blog: Focused on Creativity and Innovation - Imagination Group goes Google

Editor's note: Continuing our “Going Google Everywhere” series, we’ve invited Matt Ballantine, CIO of Imagination Group, to share his experience. Imagination is a global communications agency whose work with world famous brands spans all aspects of integrated, experiential and digital marketing. It is an independent agency, with 12 offices around the world, and the full complement of specialists in-house, from brand consultants to architects, advertising specialists to interior designers, retail specialists and event producers to direct marketers and digital experts. Imagination’s clients include Aston Martin, Guinness, oneworld Alliance, Disney, Ford, Johnson & Johnson, Goldman Sachs, Shell and Samsung. Learn more about other organizations that have gone Google on our community map.

Throughout my career I've been bemused by how, despite best intentions, most IT projects have failed to deliver any real depth of business change. Technology issues inevitably crop up through the lifetime of the project, and the first contingencies to be cut are in the plans for communication, training and business change.

We've probably all seen it - server issues, network issues, compatibility of operating systems, patching, software release bugs... the list goes on. And all the while, the business engagement work gets squeezed (if it was ever planned in depth in the first place).

The Cloud is offering an opportunity for IT departments to fundamentally change their approach to delivering services into organisations. In the two years that I've been leading IT transformation at global communications agency Imagination, I've been describing a vision where our IT team is here to help the business exploit the technology we procure, and where we leave most of the deep technical work to experts at our partners. In-house expertise in how Imagination uses its technology to become more collaborative, more global and more creative is of real value. Understanding how to patch together operating systems on servers just isn't.

As of April 2010, Imagination has Gone Google. My team moved 600 user accounts and 2TB of legacy email data spread across 14 locations in nine countries. We worked with partners to help manage the transition, and my in-house team, led by project manager Sue Chick, was able to complete the technical migration work with a minimum of fuss and effort. In turn, this meant that we could focus on helping the business start to exploit new possibilities.

Only last week, I received an excited email from our Creative Director in Sydney, Australia, who had just watched the final rehearsal of a product launch being run for a client in Hong Kong via video chat at his desk. We're only just starting to see how our teams can take the tools that we have made available to them to change how we work for the better. The Imagination IT team is now aligned to help those processes happen.



Posted by Dave Armstrong, Google Apps Team
URL: http://googleenterprise.blogspot.com/2010/07/focused-on-creativity-and-innovation_14.html

[G] BlueSpace and Google Earth Enterprise: taking visualization to the skies


Official Google Enterprise Blog: BlueSpace and Google Earth Enterprise: taking visualization to the skies

Editor's Note: Justin Marston is the CEO of BlueSpace, an enterprise software company focused on the defense and intelligence communities. BlueSpace has built a next generation command and control application using their security middleware and Google Earth Enterprise. They are currently showcasing it as part of the Coalition Warrior Interoperability Demonstration (an international "war game" exercise) with support from a government agency. The BlueSpace app showcases just how far you can take Google Earth Enterprise as a visualization environment.

Geospatial visualization of multiple streams of data has been critical to the defense and intelligence communities for a long time. Whether it’s showing aircraft flying around, soldiers taking a hill or different types of intelligence – seeing it on a map has been key to understanding a conflict.

In the Second World War, the Allies used maps with little models to show units, and moved them with poles to update their locations. With modern radar and GPS systems, things are a bit more sophisticated, but much of the mapping functionality has lagged behind. Many of the currently deployed command and control (C2) systems use flat, two-color vector maps with triangles showing units.

Visualization of AWACS plane in Google Earth

BlueSpace and AWACS
Before BlueSpace engaged them, AWACS was already actively working with 3D visualization. AWACS is the US Air Force Airborne Warning and Control System: a forward deployed radar platform (the planes with big spinning discs on top). The vision of the AWACS program has been to move away from a black screen with green triangles on it, and move towards a more visually rich C2 environment for operators that can show the terrain in which they are working.

How has BlueSpace helped? Well, we have focused on two problems – high quality, real-time visualizations and creating a Unified Operating Picture.

High Quality, Real-time Visualizations
The first problem is creating a much more "real" view of the battle theater, with 3D models moving around in real-time based on input data feeds giving latitude and longitude references for units. Our design goal was to create something more like a real-time video game using Google Earth's richness of graphics and capabilities.

BlueSpace is demonstrating its Multi-Level Security Command and Control (MLS C2) application at five different locations for the Coalition Warrior Interoperability Demonstration (CWID), a joint exercise between the US, UK, Canada, Australia and NATO (among others) to help find and prove technologies and systems that can help better orchestrate coalition warfare. For the exercise, BlueSpace worked with its partners to model around 100 units, including aircraft, ground units and boats, all of which move around in real time based on data feeds being fed to the application.

You can take a look at some of the interface, captured from Google Earth in this unclassified video: http://www.bluespace.com/mlsc2.html
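
As a rough illustration of how a position feed like this can drive the display, here is a minimal Python sketch that turns per-unit latitude/longitude updates into KML placemarks that a Google Earth client can render. The feed format and unit names are invented for this example and are not BlueSpace's actual data model:

    # Hypothetical feed records: name, latitude, longitude, altitude in metres.
    UNIT_FEED = [
        ("AWACS-1",       34.05, -117.60, 9100.0),
        ("Ground-Alpha",  34.12, -117.45,    0.0),
        ("Patrol-Boat-7", 33.71, -118.27,    0.0),
    ]

    def feed_to_kml(units):
        """Build a KML document with one Placemark per unit position."""
        placemarks = []
        for name, lat, lon, alt in units:
            placemarks.append(
                "  <Placemark>\n"
                f"    <name>{name}</name>\n"
                "    <Point>\n"
                "      <altitudeMode>absolute</altitudeMode>\n"
                # KML coordinates are listed longitude,latitude,altitude.
                f"      <coordinates>{lon},{lat},{alt}</coordinates>\n"
                "    </Point>\n"
                "  </Placemark>"
            )
        return (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
            "<Document>\n" + "\n".join(placemarks) + "\n</Document>\n</kml>"
        )

    if __name__ == "__main__":
        print(feed_to_kml(UNIT_FEED))

In a live setting, KML like this would typically be served behind a NetworkLink so the client refreshes unit positions on an interval rather than loading a static file.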

A Unified Operating Picture
Wars used to be fought by relatively small numbers of allies, with each nation focused on a particular theater. As warfare has evolved over the last two decades, the reach of aircraft, missiles, satellites and so on has blurred the lines between the different services and often between nations.

MLS C2 User Interface using Google Earth Enterprise
for geospatial visualization of ground, air and sea units

Right now, the NATO configuration of the AWACS fleet can have up to 14 different screens on each aircraft – one for the US aircraft, one for the British, one for the Canadian, one for the German, etc. So when something new comes up on radar, operators may have to look at up to 14 screens to figure out what is going on.

BlueSpace has taken these separate pictures and consolidated them into a single Unified Operating Picture (UOP) that spans all the different networks, providing one Google Earth environment, with all the units in that environment, no matter which nation or service they serve. This means an operator on an AWACS plane only has to look at one screen to see what is happening – a vast improvement.

Google Earth's extensive capabilities allow an operator to fully utilize this unified operating picture to see terrain, roads, etc. in their relation to the plotted units. In addition, Google Earth's full camera controls provide the viewing flexibility necessary to interact with those units.

BlueSpace and Google
We see a great future for Google Earth Enterprise in our C2 system. Being able to see the helicopter, visually recognize its type immediately and see which mountains are next to it when the pilot calls in, “I’m taking fire from the ridge on the left” makes a big difference in a real fight. Doing all of that across many different security domains in a Unified Operating Picture that spans multiple networks – that’s a game changing capability.

Posted by Natasha Wyatt, Google Earth Enterprise team
URL: http://googleenterprise.blogspot.com/2010/07/bluespace-and-google-earth-enterprise.html

Wednesday, July 14, 2010

[G] New keyword targeting feature rolling out globally


Inside AdWords: New keyword targeting feature rolling out globally

After a successful open beta test in the UK and Canada, we're pleased to announce that the broad match modifier is now rolling out globally in most languages*. To recap the original broad match modifier beta launch announcement:
The broad match modifier is a new AdWords targeting feature that lets you create keywords which have greater reach than phrase match and more control than broad match. Adding modified broad match keywords to your campaign can help you get more clicks and conversions at an attractive ROI, especially if you mainly use exact and phrase match keywords today.

To implement the modifier, just put a plus symbol (+) directly in front of one or more words** in a broad match keyword. Each word preceded by a + has to appear in your potential customer's search exactly or as a close variant. Close variants include misspellings, singular/plural forms, abbreviations and acronyms, and stemmings (like “floor” and “flooring”). Synonyms (like “quick” and “fast”) and related searches (like “flowers” and “tulips”) aren't considered close variants.

The graphic below illustrates the relative reach of different keyword match type strategies.


Be sure there are no spaces between the + and modified words, but do leave spaces between words. Correct usage: +formal +shoes. Incorrect usage: +formal+shoes.
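
For readers who build keyword lists programmatically, for example for bulk uploads or AdWords API requests, here is a minimal sketch of that formatting rule. The helper below is hypothetical and is not part of any Google library:

    def modified_broad_match(words):
        """Return a modified broad match keyword string, e.g. '+formal +shoes'."""
        cleaned = []
        for word in words:
            word = word.strip()
            # No space between '+' and the word it modifies; avoid doubling '+'.
            if not word.startswith("+"):
                word = "+" + word
            cleaned.append(word)
        # Words themselves stay separated by single spaces.
        return " ".join(cleaned)

    print(modified_broad_match(["formal", "shoes"]))   # -> +formal +shoes
    print(modified_broad_match(["+running", "shoes"])) # -> +running +shoes

This helper requires every word in the keyword; to modify only some words, you would simply leave the '+' off the words that may be matched more loosely.
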
Here’s what one major UK retail company said about their experience using the feature:
We're always interested in ways to increase our volumes while keeping our CPA down. As a result, we've added broad match modified keywords to several campaigns where previously we only had phrase and exact match keywords. After a few weeks of testing, we're pleased to see these campaigns showed significant increases in conversion and volume, whilst keeping the CPA down. Therefore, we will be looking to scale our use of modified broad match keywords in all our campaigns to take full advantage of these great results.
If you mainly use broad match keywords in your account, you should know that switching your existing broad match keywords to modified broad match will likely lead to a significant decline in your click and conversion volumes and will not directly improve Quality Score. To maintain volume, keep existing broad match keywords active, add new modified broad match keywords, and adjust bids to achieve your target ROI based on the results you see.

You can begin using the feature by logging into your AdWords account, through the AdWords Editor, or through the AdWords API. For more details, guidelines on usage, and answers to common questions, check out the original blog post and the AdWords help center.

Posted by Dan Friedman, Inside AdWords crew

*Except Chinese, Japanese, Thai, Arabic and Hebrew languages, which are coming soon. We’ll update this post when the feature becomes available in those languages.
URL: http://adwords.blogspot.com/2010/07/new-keyword-targeting-feature-rolling.html

[G] Google Books goes Dutch


Inside Google Books: Google Books goes Dutch

Posted by Philippe Colombet, Strategic Partnership Development Manager

In recent months, I've got to know a group of people in The Hague who are working on an ambitious project to make the rich fabric of Dutch cultural and political history as widely accessible as possible via the Internet.

That team is from the National Library of the Netherlands, the Koninklijke Bibliotheek (KB), and as of today, we'll be working in partnership to add to the library's own extensive digitisation efforts. We'll be scanning more than 160,000 of its public domain books, and making this collection available globally via Google Books. The library will receive copies of the scans so that they can also be viewed via the library's website. And significantly for Europe, the library also plans to make the digitised works available via Europeana, Europe's cultural portal.

The books we'll be scanning constitute nearly the library's entire collection of out-of-copyright books, written during the 18th and 19th centuries. The collection covers a tumultuous period of Dutch history, which saw the establishment of the country's constitution and its parliamentary democracy. Anyone interested in Dutch history will be able to access and view a fascinating range of works by prominent Dutch thinkers, statesmen, poets and academics and gain new insights into the development of the Netherlands as a nation state.

This is the third agreement we've announced in Europe this year, following our projects with the Italian Ministry of Cultural Heritage and the Austrian National Library. The Dutch national library is already well underway with its own ambitious scanning programme, which will eventually see all of its Dutch books, newspapers and periodicals from 1470 onwards being made available online. By any measure, this is a huge task, requiring significant resources, and we're pleased to be able to help the library accelerate towards its goal of making all Dutch books accessible anywhere in the world, at the click of a mouse.

It's exciting to note just how many libraries and cultural ministries are now looking to preserve and improve access to their collections by bringing them online. Much of humanity's cultural, historical, scientific and religious knowledge, collected and curated over centuries, sits in Europe's libraries, and it's great to see that we are all striving towards the same goal of improving access to knowledge for all.

Google and other technology companies have an important role to play in achieving this goal, and we hope that by partnering with major European cultural institutions such as the Dutch national library, we will be able to accelerate the rapid growth of Europe's digital library.

(Cross-posted from the European Public Policy Blog)
URL: http://booksearch.blogspot.com/2010/07/google-books-goes-dutch.html

[G] Our commitment to the digital humanities


Official Google Research Blog: Our commitment to the digital humanities

Posted by Jon Orwant, Engineering Manager for Google Books, Magazines and Patents

(Cross-posted from the Official Google Blog)

It can’t have been very long after people started writing that they started to organize and comment on what was written. Look at the 10th century Venetus A manuscript, which contains scholia written fifteen centuries earlier about texts written five centuries before that. Almost since computers were invented, people have envisioned using them to expose the interconnections of the world’s knowledge. That vision is finally becoming real with the flowering of the web, but in a notably limited way: very little of the world’s culture predating the web is accessible online. Much of that information is available only in printed books.

A wide range of digitization efforts have been pursued with increasing success over the past decade. We’re proud of our own Google Books digitization effort, having scanned over 12 million books in more than 400 languages, comprising over five billion pages and two trillion words. But digitization is just the starting point: it will take a vast amount of work by scholars and computer scientists to analyze these digitized texts. In particular, humanities scholars are starting to apply quantitative research techniques for answering questions that require examining thousands or millions of books. This style of research complements the methods of many contemporary humanities scholars, who have individually achieved great insights through in-depth reading and painstaking analysis of dozens or hundreds of texts. We believe both approaches have merit, and that each is good for answering different types of questions.
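
As a heavily simplified illustration of that quantitative style, the sketch below compares two per-era word histograms, the kind of derived data a digitised corpus can yield, and ranks the words whose relative usage grew most between eras. The eras, words and counts are invented for illustration:

    def relative_frequencies(histogram):
        """Convert raw word counts into each word's share of all tokens."""
        total = sum(histogram.values())
        return {word: count / total for word, count in histogram.items()}

    def vocabulary_shift(era_a, era_b, min_count=2):
        """Rank words in era_b by how much their relative usage grew since era_a."""
        freq_a = relative_frequencies(era_a)
        freq_b = relative_frequencies(era_b)
        shifts = {
            word: freq_b[word] - freq_a.get(word, 0.0)
            for word, count in era_b.items()
            if count >= min_count
        }
        return sorted(shifts.items(), key=lambda item: item[1], reverse=True)

    if __name__ == "__main__":
        # Toy histograms standing in for counts over thousands of scanned books.
        georgian  = {"railway": 2,  "steam": 10, "empire": 50, "novel": 40}
        victorian = {"railway": 60, "steam": 45, "empire": 55, "novel": 50}
        for word, delta in vocabulary_shift(georgian, victorian)[:2]:
            print(f"{word}: {delta:+.1%} of all tokens")

Real corpora would of course involve millions of distinct words and controls for corpus composition, but the basic move is the same: reduce many books to counts, then compare the counts across time.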

Here are a few examples of inquiries that benefit from a computational approach. Shouldn’t we be able to characterize Victorian society by quantifying shifts in vocabulary—not just of a few leading writers, but of every book written during the era? Shouldn’t it be easy to locate electronic copies of the English and Latin editions of Hobbes’ Leviathan, compare them and annotate the differences? Shouldn’t a Spanish reader be able to locate every Spanish translation of “The Iliad”? Shouldn’t there be an electronic dictionary and grammar for the Yao language?

We think so. Funding agencies have been supporting this field of research, known as the digital humanities, for years. In particular, the National Endowment for the Humanities has taken a leadership role, having established an Office of Digital Humanities in 2007. NEH chairman Jim Leach says: "In the modern world, access to knowledge is becoming as central to advancing equal opportunity as access to the ballot box has proven to be the key to advancing political rights. Few revolutions in human history can match the democratizing consequences of the development of the web and the accompanying advancement of digital technologies to tap this accumulation of human knowledge."

Likewise, we’d like to see the field blossom and take advantage of resources such as Google Books that are becoming increasingly available. We’re pleased to announce that Google has committed nearly a million dollars to support digital humanities research over the next two years.

Google’s Digital Humanities Research Awards will support 12 university research groups with unrestricted grants for one year, with the possibility of renewal for an additional year. The recipients will receive some access to Google tools, technologies and expertise. Over the next year, we’ll provide selected subsets of the Google Books corpus—scans, text and derived data such as word histograms—to both the researchers and the rest of the world as laws permit. (Our collection of ancient Greek and Latin books is a taste of corpora to come.)

We've given awards to 12 projects led by 23 researchers at 15 universities:
  • Steven Abney and Terry Szymanski, University of Michigan. Automatic Identification and Extraction of Structured Linguistic Passages in Texts.
  • Elton Barker, The Open University, Eric C. Kansa, University of California-Berkeley, Leif Isaksen, University of Southampton, United Kingdom. Google Ancient Places (GAP): Discovering historic geographical entities in the Google Books corpus.
  • Dan Cohen and Fred Gibbs, George Mason University. Reframing the Victorians.
  • Gregory R. Crane, Tufts University. Classics in Google Books.
  • Miles Efron, Graduate School of Library and Information Science, University of Illinois. Meeting the Challenge of Language Change in Text Retrieval with Machine Translation Techniques.
  • Brian Geiger, University of California-Riverside, Benjamin Pauley, Eastern Connecticut State University. Early Modern Books Metadata in Google Books.
  • David Mimno and David Blei, Princeton University. The Open Encyclopedia of Classical Sites.
  • Alfonso Moreno, Magdalen College, University of Oxford. Bibliotheca Academica Translationum: link to Google Books.
  • Todd Presner, David Shepard, Chris Johanson, James Lee, University of California-Los Angeles. Hypercities Geo-Scribe.
  • Amelia del Rosario Sanz-Cabrerizo and José Luis Sierra-Rodríguez, Universidad Complutense de Madrid. Collaborative Annotation of Digitalized Literary Texts.
  • Andrew Stauffer, University of Virginia. JUXTA Collation Tool for the Web.
  • Timothy R. Tangherlini, University of California-Los Angeles, Peter Leonard, University of Washington. Northern Insights: Tools & Techniques for Automated Literary Analysis, Based on the Scandinavian Corpus in Google Books.
We have selected these proposals in part because the resulting techniques, tools and data will be broadly useful: they’ll help entire communities of scholars, not just the applicants. We look forward to working with them, and hope that over time the field of digital humanities will fulfill its promise of transforming the ways in which we understand human culture.
URL: http://googleresearch.blogspot.com/2010/07/our-commitment-to-digital-humanities.html