Google Soli-Powered Solinteraction Demonstrates Radar-Based Interaction

Computer scientists at the University of St Andrews Computer Human Interaction research group (SACHI) have unveiled Solinteraction, which demonstrates the potential of radar-based interaction. The team has worked with Google on Project Soli, a radar-based sensor technology that can sense the subtle micro-motions of human fingers. The FCC recently granted Google a waiver allowing Project Soli sensors to use frequencies between 57 and 64 GHz, higher than those typically permitted in consumer gadgets. Back in 2015, Google’s Advanced Technology and Projects group (ATAP) showed off tiny radar-based sensors that let users control gadgets simply by tapping their fingers together. Read more for a Solinteraction video and additional information.

“In this paper, we explore two research questions with radar as a platform for sensing tangible interaction with the counting, ordering, identification of objects and tracking the orientation, movement and distance of these objects. We detail the design space and practical use cases for such interaction, which allows us to identify a series of design patterns, beyond static interaction, that are continuous and dynamic. This exploration is grounded in both a characterisation of the radar sensing and our rigorous experiments, which show that such sensing is accurate with minimal training,” said Professor Aaron Quigley, Chair of Human Computer Interaction in the School of Computer Science at the University of St Andrews.
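The quote describes counting and identifying objects from their radar response "with minimal training", which in practice amounts to classifying radar signatures. The sketch below is a minimal illustration of that general idea in Python, not Solinteraction's actual pipeline: the 64-bin "signature" features, the object labels, and the k-nearest-neighbour model are all assumptions made for illustration, and synthetic data stands in for real Soli output.

```python
# Illustrative sketch only: the article does not describe Solinteraction's
# actual pipeline, so the features, model, and data here are assumptions
# meant to show how objects could be identified from radar signatures.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend each object placed on the sensor produces a characteristic
# 64-bin radar "signature" (synthetic data standing in for real Soli output).
OBJECTS = ["empty", "mug", "phone", "keyring"]
signatures = np.vstack([
    rng.normal(loc=i, scale=0.5, size=(50, 64)) for i in range(len(OBJECTS))
])
labels = np.repeat(np.arange(len(OBJECTS)), 50)

X_train, X_test, y_train, y_test = train_test_split(
    signatures, labels, test_size=0.25, random_state=0, stratify=labels
)

# A simple nearest-neighbour classifier is enough to show that distinct
# signatures can be recognised with minimal training data.
clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
print(f"Held-out identification accuracy: {clf.score(X_test, y_test):.2f}")
```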
