Google set up a press event at the Computer History Museum in Mountain View today to discuss new search technologies Google has been working on.
Marissa Mayer, VP of search products and user experience, introduced prominent leads from the company to discuss mobile search and search technologies.
Vic Gundotra, VP of engineering, and Amit Singhal, a Google Fellow, went on stage to show off some new releases.
Marissa Mayer takes the stage first to explain Google’s vision of the future of search. Marissa explains that search today centres on four main challenges: modes, as in how and where people enter a search query; media, as in managing and consuming all kinds of media; language, as in conducting searches without communication barriers; and finally personalisation, as in customising searches based on a person’s tastes, preferences, and location.
With that in mind, Google is going full steam ahead. The company now operates 173 localised domains around the world and is releasing products at a breakneck pace: in the past 67 days alone, Google launched 33 new product features and technologies.
Vic Gundotra takes the stage next to discuss Google’s mobile search products. Vic cites three major trends in mobile phone development that allow Google to innovate: Moore’s Law shrinking devices and making them cheaper, increasing connectivity, and finally the emergence of cloud computing, harnessing computing power from data centres the size of football fields.
To demonstrate, Vic is on stage with an Android phone and an iPhone showing off Voice Search. Using only voice commands spoken to his mobile devices, Vic is able to bring up search results in English and Mandarin Chinese. The newest language Google is able to understand is Japanese, and Vic brings a native speaker on stage to demonstrate an Android phone displaying maps and search results. To further showcase Google’s voice recognition technology, Vic demonstrates an experimental project in voice translation: sentences spoken into his Android phone are echoed back as direct Spanish translations.
Google Goggles is an experimental product that puts image recognition and search in a mobile user’s hands. Vic demonstrates Goggles with a Motorola Droid by snapping a picture of a Japanese Shinto temple projected onto the screen; within seconds, a list of related search results appears. Vic re-emphasises the technology behind the product: the power of a mobile device that can “see”, and the connectivity in place to send the image to a cloud data centre where it can be analysed. For now Google Goggles is only available to Android users. A new version of Google Maps is also available for download to Android users.
Amit Singhal ran Google’s search unit for nine years and now works in a research capacity as a Google Fellow. Today, Amit announces Google’s Real Time Search, a new way to keep track of the latest updates on the Internet, whether tweets, blog posts, news stories, or web pages. Amit emphasises that Google’s core strengths are collecting information, developing algorithms, and working with large, complex systems to sort that information. Real Time Search, as Amit explains it, does not exclude any source; the goal is to be comprehensive and let Google’s algorithms float relevant and recent information to the top for users.
Google’s partnerships with Apple, MySpace, and Facebook are helping the company keep abreast of all this information. There are also systems in place to prevent gaming by spammers; Amit credits Matt Cutts and his anti-spam team for developing them. Amit also notes that Google’s experience with web search gives it insight into counteracting spamming techniques before they become prevalent.
Right now Real Time Search is available in English-speaking locales such as Canada and Britain. Early next year, Google expects to roll it out in more languages.