Google is working on new technology for devices to read human body language
Google has revealed more insights into its Soli radar technology for non-verbal interactions between humans and computers. Non-verbal interactions include controlling a device by waving your hand or turning your head. The technology, which the company is still developing, combines motion sensors with machine learning algorithms.
In a recent video, Google explains how the sensors and algorithms work and describes possible future use cases. The company suggests the technology could reduce the sense of being overwhelmed by devices and make gadgets more helpful and less intrusive.
Movements such as 'approach and leave' are recognised by deep learning algorithms that determine whether someone is in a device's 'personal space'. According to Google, overlapping personal space is a good indicator of whether people intend to interact with a device or are simply passing by.
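Google has not published how Soli implements this, but the idea can be illustrated with a minimal sketch: flag an 'approach' when a stream of distance readings crosses into an assumed 'personal space' radius, and a 'leave' when it crosses back out. The radius and the sample readings below are assumptions made purely for illustration.

```python
# Hypothetical sketch of an 'approach and leave' detector.
# PERSONAL_SPACE_M and the readings are assumed values, not Soli's actual parameters.

PERSONAL_SPACE_M = 1.2  # assumed radius of the device's 'personal space', in metres

def detect_approach_leave(range_readings_m):
    """Yield 'approach' / 'leave' events from successive distance readings."""
    inside = False
    for distance in range_readings_m:
        if not inside and distance <= PERSONAL_SPACE_M:
            inside = True
            yield "approach"
        elif inside and distance > PERSONAL_SPACE_M:
            inside = False
            yield "leave"

# Example: a person walks up to the device, pauses, then walks away.
readings = [2.5, 1.8, 1.1, 0.9, 0.9, 1.4, 2.2]
print(list(detect_approach_leave(readings)))  # ['approach', 'leave']
```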
Actions such as 'turning away/towards' and 'glance' are recognised by machine learning algorithms that can interpret more nuanced body language. For example, the technology can gauge the degree to which your head is turned and use that to predict how likely you are to engage with the device.
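As a rough illustration of that last point, a head-turn angle could be mapped to an engagement likelihood that is high when you face the device and falls off as you turn away. The logistic mapping and its parameters below are assumptions for the sake of the example; Google only states that its models can gauge the degree of a head turn.

```python
# Hypothetical mapping from head-turn angle (0 deg = facing the device) to an
# engagement likelihood in (0, 1). The steepness and midpoint are assumed values.
import math

def engagement_likelihood(head_turn_deg, steepness=0.1, midpoint_deg=45.0):
    """Higher when facing the device, lower as the head turns further away."""
    return 1.0 / (1.0 + math.exp(steepness * (abs(head_turn_deg) - midpoint_deg)))

for angle in (0, 30, 60, 90):
    print(f"{angle:>3} deg: {engagement_likelihood(angle):.2f}")
# 0 deg ~0.99, 30 deg ~0.82, 60 deg ~0.18, 90 deg ~0.01
```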
The Soli radar sensor was first unveiled in 2015 and has since appeared in several Google devices. In the Pixel 4, it powered Motion Sense, which detected hand gestures so users could pause music or dismiss alarms without touching the phone. The sensor also drives the Sleep Sensing feature on the second-generation Nest Hub smart display, which tracks sleep quality by detecting breathing patterns and movement.