According to Google, device control could in future be placed literally at our fingertips. First announced back in 2015, Google’s Project Soli is an ambitious plan to replace physical controls such as buttons and touchscreens with a new method of control and communication based on simple hand gestures made in thin air. The US Federal Communications Commission (FCC) recently concluded that the developments explored by Project Soli ‘serve the public interest’, and it has therefore granted permission for work on the technology to move forward.
A radar-driven interface
The proposed functionality will be achieved, Google believes, by interpreting delicate radar readings gathered via a system dubbed RadarCat. Just like the radar used to locate ships and aeroplanes, the system’s sensors aim innocuous electromagnetic pulses at an object. The pulses which bounce back are each influenced in some way by that object’s features and attributes – for example, its shape, thickness, density, surface texture and more. Measuring, plotting and collating these readings builds a unique radar fingerprint for that object.
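To make the fingerprinting idea concrete, here is a minimal sketch of how a stored set of radar fingerprints could be matched against a new reading. Everything here is an assumption for illustration only – the feature names, the reference values and the nearest-neighbour matcher are invented, not Soli’s or RadarCat’s actual pipeline:

```python
import math

# Hypothetical radar "fingerprints": each object reduced to a small
# feature vector (e.g. reflection strength, pulse spread, phase shift).
# All names and numbers below are invented for this sketch.
REFERENCE_FINGERPRINTS = {
    "glass tumbler": [0.82, 0.31, 0.55],
    "steel mug":     [0.95, 0.12, 0.71],
    "playing card":  [0.14, 0.68, 0.22],
}

def classify(reading, references=REFERENCE_FINGERPRINTS):
    """Match a new reading to the closest stored fingerprint
    using plain Euclidean distance (the simplest possible matcher)."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(references, key=lambda name: distance(reading, references[name]))

print(classify([0.90, 0.15, 0.70]))  # closest to "steel mug"
```

A real system would use far richer signal processing and a trained classifier rather than a three-value lookup, but the principle – compare a new return against known fingerprints and pick the best match – is the same.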
Simple object recognition was achieved as long ago as 2016, but researchers based at the University of St Andrews, Fife, Scotland, now want to use a development of this technology, which they are terming Solinteraction, to achieve far more nuanced levels of sensing. Using a similar sensing technique, researcher Hui-Shyong Yeo reports it has proved possible to conduct ‘a vast exploration into […] the counting, ordering, stacking, movement, and orientation of different objects, such as cards and Lego blocks.’
Potential for sophistication
Such manipulation can also be achieved by other means, but the advantage of radar is that, unlike RFID tagging, for example, the objects of interest don’t have to be altered in any way. Likewise, unlike camera-based systems, Solinteraction has no privacy implications, nor does it demand a clear line of sight – radar works equally well in the dark as in the light.
Some of the present limitations of this technology actually hint at the sophisticated analysis the method could eventually achieve. Researchers have found that, on a task like counting playing cards, a slightly bent card is enough to trigger false classifications, such is the sensitivity of the radar measurements.
One of the project’s aims is to slim down the required components so that they fit on a chip measuring just 8 x 10mm. Because the radar can sense and track hand movements to millimetre accuracy, the Soli system is seen as a way to exercise a significant range of personal control over devices such as a TV, smartwatch, speaker, media player or smartphone.
As a concept control system for micro-devices like a smartwatch, whose small surface area lacks the room required for conventional buttons and touchscreens, the idea of using hand gestures is an attractive proposition. Gestures presently envisaged include tapping thumb and index finger together to simulate a button press, and rubbing two fingers together to simulate actions such as scrolling or turning a dial. Many observers have also noted this technology’s potential to help those with impairments or limited mobility to interact with devices.
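The gesture vocabulary described above can be pictured as a simple dispatch from recognised gestures to device actions. This is a hypothetical sketch – the gesture names, the state dictionary and the mapping are assumptions for illustration, not Google’s API:

```python
# Hypothetical mapping of recognised Soli-style gestures to actions on a
# toy media-player state. Gesture names and behaviour are invented here.
def handle_gesture(gesture, state):
    """Apply one recognised gesture to a simple device state dict."""
    if gesture == "thumb_index_tap":        # simulated button press: play/pause
        state["playing"] = not state["playing"]
    elif gesture == "finger_rub_forward":   # simulated dial turn: volume up
        state["volume"] = min(state["volume"] + 1, 10)
    elif gesture == "finger_rub_backward":  # simulated dial turn: volume down
        state["volume"] = max(state["volume"] - 1, 0)
    return state

state = {"playing": False, "volume": 5}
for g in ["thumb_index_tap", "finger_rub_forward", "finger_rub_forward"]:
    state = handle_gesture(g, state)
print(state)  # {'playing': True, 'volume': 7}
```

In practice the hard part is the recognition step feeding such a dispatcher – classifying millimetre-scale finger motion from radar returns – rather than the mapping itself.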
Google’s ATAP performance
Some critics would say that the real story here is that Project Soli is still alive at all. Soli is the responsibility of Google’s ATAP (Advanced Technology and Projects) division, whose track record is not impressive. In fact, once ATAP becomes involved, most projects are kicked into the long grass within a couple of years.
Though supposedly ‘unafraid of failure’, ATAP has a roster of projects which have disappeared without trace, and it makes grim reading. Project Ara ran for three years but never delivered its promised modular smartphone. Project Abacus set out to create a smartphone authentication protocol that used every kind of phone sensor – camera, microphone, GPS, touchscreen activity and so on – to build and continuously update a user’s ‘trust score’ which could replace password authentication; these days Abacus is never mentioned. The same fate befell Project Vault, which offered a secure computing environment housed inside an SD card and accessed via separate authentication.
Some ATAP projects have actually been rolled out. Project Jacquard, a collaboration between Google and Levi’s, produced a jean jacket with a touch panel woven into the sleeve, retailing at $350. Project Tango has arguably been ATAP’s most successful venture: essentially a 3D-sensing smartphone with a further array of smart sensors. Like most ATAP schemes, Tango was cancelled, but a limited version of its technology now forms part of ARCore, Google’s platform for building AR experiences.
So, with Soli now one of the few remaining ATAP projects, the real question is: will it ever get beyond the drawing board?