Google trained Pixel 4’s Soli AI on millions of gestures from volunteers

Google today penned an explainer on the Soli radar-based technology that ships inside its Pixel 4 smartphones. While many of the hardware details were previously known, the company for the first time peeled back the curtain on Soli’s AI models, which are trained to detect and recognize motion gestures with low latency. While it’s early days (the Pixel 4 and the Pixel 4 XL are the first consumer devices to feature Soli), Google claims the tech could enable new forms of context and gesture awareness on devices like smartwatches, paving the way for experiences that better accommodate users with disabilities.

The Soli module within the Pixel 4, which was a collaborative effort among Google’s Advanced Technology and Projects (ATAP) group and the Pixel and Android product teams, contains a 60GHz radar and antenna receivers with a combined 180-degree field of view that record positional information in addition to things like range and velocity. (Over a window of multiple transmissions, displacements in an object’s position cause a timing shift that manifests as a Doppler frequency proportional to the object’s velocity.) Electromagnetic waves reflect information back to the antennas, and custom filters (including one that accounts for audio vibrations caused by music) boost the signal-to-noise ratio while attenuating unwanted interference and differentiating reflections from noise and clutter.
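The Doppler relationship described above can be sketched in a few lines: across a window of transmissions, a moving target's echo accumulates phase at a rate proportional to its radial velocity, and a Fourier transform over that slow-time window recovers the Doppler frequency. The snippet below is a minimal illustration of the principle only; the pulse rate, window length, and signal model are assumptions for the demo, not Soli's actual (unpublished) configuration.

```python
import numpy as np

# Assumed demo parameters; only the 60 GHz carrier comes from the article.
C = 3e8                       # speed of light (m/s)
F_CARRIER = 60e9              # 60 GHz radar carrier
WAVELENGTH = C / F_CARRIER    # 5 mm wavelength
PRF = 2000.0                  # pulse repetition frequency (Hz), assumed
N_PULSES = 100                # slow-time window length, assumed

def doppler_velocity(samples: np.ndarray) -> float:
    """Estimate radial velocity from the phase progression of one
    range bin across a window of transmissions (slow time)."""
    spectrum = np.fft.fftshift(np.fft.fft(samples))
    freqs = np.fft.fftshift(np.fft.fftfreq(len(samples), d=1.0 / PRF))
    f_doppler = freqs[np.argmax(np.abs(spectrum))]
    # A Doppler shift f_d corresponds to radial velocity v = f_d * wavelength / 2.
    return f_doppler * WAVELENGTH / 2.0

# Simulate a target receding at 0.5 m/s: each pulse-to-pulse displacement
# adds phase 4*pi*v*T / wavelength to the echo.
v_true = 0.5
t = np.arange(N_PULSES) / PRF
echo = np.exp(1j * 4 * np.pi * v_true * t / WAVELENGTH)
print(round(doppler_velocity(echo), 2))  # → 0.5 (m/s)
```

With a 5 mm wavelength, even millimeter-scale hand motions shift the echo's phase by a substantial fraction of a cycle between transmissions, which is why a 60GHz radar can resolve subtle finger gestures.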
