- Google’s Project Soli motion-sensing technology will make its first appearance in the Pixel 4, the company announced on Monday.
- Project Soli, which detects fine movements using radar, has been under development at Google for five years.
- The company unveiled the technology in 2015, but hadn’t specified how it would be incorporated into any products until now.
Google’s motion-sensing technology, known as Project Soli, appears to be one of the headlining features on the Pixel 4, the company’s next flagship smartphone that’s expected to launch later this year.
Google provided a preview of the new features coming to the Pixel 4 on Monday, which included capabilities powered by its Project Soli technology. The new phone will include motion sensors located near the top of the device that can work with algorithms to understand when you’re nearby. These motion sensors will make it possible to perform tasks such as skipping a song, dismissing phone calls, and snoozing alarms by waving your hand.
Google’s Soli technology should also make it easier to unlock your phone more quickly using facial recognition, since the company says the sensors can detect when you’re lifting the device. The phone should unlock as you pick it up, according to Google.
The Pixel 4 is the first device to incorporate technology from Project Soli, which Google has been developing under its Advanced Technology and Projects team (ATAP) for the past five years and demonstrated four years ago. While the company has explained how the technology works, this is the first time Google has revealed how it will appear in a consumer product.
Project Soli works by using radar to track the human hand, as Google’s Ivan Poupyrev explained in an introductory video from 2015. That video also showed how it was possible to scroll through menus on devices like smartwatches and speakers simply by rubbing your fingers together. That type of functionality could be especially useful for devices with small screens, like wearables, that might be difficult to operate using traditional touch input.
Progress on Project Soli had been largely quiet since its announcement until this year, when the Federal Communications Commission gave Google permission to operate its sensors at higher power levels than previously allowed. This was a sign that the company was moving forward with plans to develop the technology following its flashy unveiling in 2015.
Google is far from the first company to build gesture recognition into smartphones. Samsung added motion-control tech to the Galaxy S4 in 2013, which let you scroll and flip through photos by waving your hand. But the phone was widely panned for having too many features, and the gesture controls never really caught on. Motion sensors on the LG G8 also make it possible to take a screenshot or open an app using gestures such as pinching or swiping in the air.
Based on Google’s past Project Soli demonstrations, however, its radar-based motion recognition may take this a step further by identifying more granular movements.
Google typically releases its new Pixel phones in October, so we’re expecting to learn more about the device then.