Google is continuously revolutionizing the internet market and the whole IT sector with its fascinating, out-of-the-box innovations, whether it be its global search engine, Google Maps, Android, Google Glass, Loon internet balloons, Allo & Duo, Chromebook, Google Fiber, or its driverless cars. Project Soli is another impressive project from Google that recognizes and acts upon touchless gestures made by users. Google and its team have been working to create practical, helpful products that we can all actually use, and their continuous experiments with new ideas and concepts result in inventions that leave us amazed. The company always aims to make our lives easier and faster, and its latest innovation promises to make our hands more useful than ever before. How? You'll find out soon.


 
So what actually is Project Soli?
Project Soli is a new gesture-recognition technology based on radar. It senses and recognizes the fine motions of your hands and fingers and turns them into commands for a computer or other device. Project Soli is a pioneering experiment from Google's Advanced Technology and Projects group (ATAP) in the field of touchless motion and gesture control, and a new advancement in the world of wearables. But it is not merely a wearable: it is a small radar-based sensor chip that can be embedded in almost any device, such as smartwatches, smartphones, and tablets. Through it, ATAP has made it possible for a single user interface, your hand movements, to play games, click icons, and do everything you want without ever touching the screen. Your hands are the only user interface you'll ever need, which means no more fiddling with tiny touchscreens.
 
How does it work?
 
 
Project Soli is based on the Doppler effect, which describes how the frequency of a wave changes depending on the relative motion of the sender and the receiver. Here the hand acts as the sender, reflecting the signal back as it makes movements or gestures in predefined patterns, while the radar system built into the chip acts as the receiver, detecting those gestures and how fast they are made. Moving your hand toward or away from the radar changes the received signal and its amplitude. Fine motions, such as sliding your thumb against your finger or tapping them together, produce rich, high-bandwidth signals that help the radar detect the position, movement, and speed of your hand in space with fine accuracy. That means you can control your smart gadgets without touching their surfaces. Google has collected a variety of measurements and patterns of hand movements that its machine learning algorithms can interpret to recognize the gestures being made.
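To get a feel for the physics involved, here is a minimal sketch of the Doppler shift a radar would observe from a moving hand. The 60 GHz carrier is an assumption based on the band Soli has been reported to use, and the velocities are purely illustrative; this is not code from the Soli development kit.

# Minimal sketch: the Doppler shift a radar would see from a moving hand.
# Assumes a 60 GHz carrier (the band Soli is reported to use); the factor of 2
# accounts for the round trip of the reflected wave.

C = 3.0e8          # speed of light, m/s
CARRIER_HZ = 60e9  # assumed radar carrier frequency

def doppler_shift_hz(hand_velocity_m_s: float) -> float:
    """Frequency shift of the echo for a hand moving toward (+) or away (-) from the sensor."""
    return 2.0 * hand_velocity_m_s * CARRIER_HZ / C

# A slow finger slide (~0.05 m/s) vs. a quick tap (~0.5 m/s)
for v in (0.05, 0.5):
    print(f"{v:4.2f} m/s  ->  {doppler_shift_hz(v):8.1f} Hz")

Even these small shifts of tens to hundreds of hertz are enough for the chip's signal processing and machine learning stages to tell one fine finger motion from another.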
 
Project Soli was started over a year ago, when it was announced at the Google I/O conference, and is headed by Ivan Poupyrev, whose small yet powerful gadget has brought new excitement to the IT industry. The first version of Soli was shown at last year's conference, but it was not very efficient and the radar consumed a lot of battery. So the ATAP team redesigned the chip for lower power consumption and greater efficiency. The team is also planning to launch a new beta version of the development kit for developers to use and test for further improvement.
 
How is it different from the others?
Project Soli is not an entirely new concept; some other projects with similar goals are already on the market. The Aria clip for smartwatches, for example, also uses hand gestures instead of touch, and Apple's Watch crown is another example, but Soli takes a fundamentally different approach. First, it uses radar for motion tracking rather than the camera-based approach of other systems, and it tracks even very precise motions with great accuracy and speed.
 
 
Second, Google has made it as small as possible, just a tiny chip, so that it can fit into almost any device or object, and the radar sensor can sense any type of object. It is not merely a wearable; it will change the way we deal with our day-to-day electronic devices such as music systems, cars, and fitness bands.
 
Next Steps
Soli is a great innovation in the field of IoT devices, and it can have fascinating applications when combined with existing products on the market. Soli could be embedded into automotive control systems, allowing drivers to operate their vehicles with hand movements: simple finger motions could sound the horn, control the windshield wipers from the steering wheel, and so on, as sketched below. In the near future, Google may also combine facial or fingerprint recognition with Android Auto.
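As a purely hypothetical illustration of such an integration, the sketch below maps recognized gesture labels to vehicle actions. The gesture names and the dispatch function are assumptions made for the example, not part of any published Soli API.

# Hypothetical sketch only: how an in-car integration might route Soli gesture
# labels to vehicle actions. Names are illustrative assumptions.

GESTURE_ACTIONS = {
    "finger_tap": "sound_horn",
    "swipe_left": "wipers_off",
    "swipe_right": "wipers_on",
    "virtual_dial": "adjust_volume",
}

def dispatch(gesture: str) -> str:
    """Return the vehicle action for a recognized gesture; ignore unknown ones."""
    return GESTURE_ACTIONS.get(gesture, "no_op")

print(dispatch("finger_tap"))   # -> sound_horn
print(dispatch("shake"))        # -> no_op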
Car manufacturers such as BMW and Volvo, which already support smart devices and apps in their vehicles, could take advantage of these gesture controls through this radar-based chip, Project Soli. By introducing such projects, Google plans to enter your home, appliances, cars, and more, gathering ever more information for its Knowledge Graph to enhance its search capabilities even further.
 
 
ATAP has announced that it will open the project to developers, releasing a development kit soon so they can build applications and test its usability, efficiency, and stability. When the kit will actually become available has not yet been made clear, but it is expected to expose some developer APIs for further use and advancement. Let's wait for its next step!
 
