Google's Project Soli secures FCC approval
It seems Google is always looking to the future of electronics and how people will use the next generation of personal devices such as smartphones. To this end, the company has sought FCC approval for a new technology called Project Soli. It appears to consist of sensors that turn ordinary hand motions or gestures into commands for hardware in the immediate vicinity. For example, miming actions such as swiping, turning a key, or pinching the air between thumb and forefinger should produce responses on a screen or circuit board.
These sensors measure just 8 by 10 millimeters, and as such could be integrated into all kinds of devices. This suggests they could also work with Internet of Things (IoT), augmented reality, or virtual reality hardware, and could even make touchscreens unnecessary. 'Soli sensors' detect the necessary gestures via Direct-Sequence Spread Spectrum (DSSS) radar, which operates in the 60-gigahertz band reserved for industrial, scientific and medical (ISM) applications. It is somewhat impressive that Google has managed to cram this capability into a small-footprint, solid-state module over the project's four-year history.
On the other hand, many people may remember that Samsung tried something very similar with its Air Gesture suite of features. This form of contactless control was part of the TouchWiz era and depended on a dedicated proximity sensor to work. Air Gestures proved less than popular and were eventually phased out as the company moved on to the Samsung Experience UI for its smartphones.
However, Google appears confident that Project Soli's implementation will fare better. The rumor mill had also linked Apple to a mid-air gesture system of its own, with speculation that it would appear in the iPhone X's successors. Needless to say, no such feature has made it to the XS, XS Max or XR.