In 2015, Google introduced us to Project Soli. Led by Ivan Poupyrev, the initiative builds tiny radars capable of recognizing the movements we make with our hands. A chipset then interprets those movements and triggers actions without anyone physically touching a screen or surface.
At the time, Google showed a small prototype that emitted a signal and detected movement at a rate of 10,000 readings per second. Several years after that demonstration, the project has received approval from the US Federal Communications Commission (FCC) for its implementation.
The green light came after Google and Facebook agreed on the range of frequencies that these small radars will use.
Google Gesture Technology Moves To The Next Phase
Describing Project Soli is complicated, since there are few similar projects. The presentation video walks through the different possibilities: a tiny radar that could be embedded in almost any device, from a wearable to a mobile phone to a car.
Thanks to this radar, a device can recognize nearby gestures and movements with great precision and interpret them as on-"screen" movements or taps without the user physically touching anything.
The FCC will allow the Soli sensors to operate at higher power levels than currently permitted, at frequencies that will also allow them to be used aboard aircraft. The reasoning given by the FCC, as reported by Reuters, is that granting the waiver "would serve the public interest by providing for innovative device control features using touchless hand gesture technology."
Project Soli sensors use radar to map space in three dimensions. The technology, as described by Google itself, lets you press virtual buttons and perform complex gestures such as a "pinch". More interesting still, despite being a virtual control, the interactions feel physical, since your own fingers touching each other provide haptic feedback.
The Origin of Project Soli
Like something out of the Marvel movie universe, Project Soli promises subtle interaction technology. Gestures of your hand could adjust the volume of your equipment, set the time on your smartwatch, or adjust the lights or temperature, all without touching anything, Iron Man style.
A recent decision by the US telecommunications regulator enabled Google to continue developing this innovative technology.
In the video presentation, Ivan Poupyrev, leader of Project Soli at Google, explained that the sensor captures movements in three-dimensional space, using spectrum frequencies to detect hand movements.
The system can register sub-millimetre movements with great precision: it can detect you pressing an invisible button between thumb and forefinger, or turning a virtual dial by rubbing the thumb against the index finger.
This development raises ample possibilities for interaction, which could simplify the relationship of users with multiple smart devices, such as IoT or wearable systems.
Imagine, as the project's presentation video shows, setting the time on your watch with nothing but a movement of the fingers at your wrist. You could change the music or answer a call while driving without taking your hands off the wheel, or even play the piano by doing little more than sliding your fingers through the air.
The development also poses opportunities to benefit people with reduced mobility or speech difficulties by simplifying ways to interact with smart devices.
As the project leader put it: "Soli has no moving parts, fits on a chip and consumes little energy. It is not affected by light conditions and works through most materials. Just imagine the possibilities…"
How does it Work?
Soli is a sensor that uses radar to capture the movement of a human hand. More technically, the system emits electromagnetic waves in a broad beam; objects within the beam scatter this energy, reflecting a portion of it back toward the radar antenna.
Properties of the reflected signal, such as its energy, its time delay and its change in frequency, carry information about the object and its behavior.
From these, the radar can infer things such as the size, material and shape of the object, the orientation of the hand, the distance and the speed at which the movement occurs. It can even distinguish whether one hand is bigger than the other or moves differently.
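The two core relations behind this are textbook radar physics: distance follows from the round-trip time of the wave, and radial velocity from the Doppler shift of the reflected frequency. A minimal sketch in Python, assuming a hypothetical 60 GHz carrier (within the band Soli operates in):

```python
# Constants
SPEED_OF_LIGHT = 299_792_458.0  # m/s
CARRIER_HZ = 60e9  # hypothetical carrier frequency in the 57-64 GHz band

def range_from_delay(round_trip_s):
    """Distance to the reflecting object from the round-trip time of the wave."""
    return SPEED_OF_LIGHT * round_trip_s / 2

def velocity_from_doppler(doppler_hz, carrier_hz=CARRIER_HZ):
    """Radial velocity of the object from the Doppler frequency shift."""
    return doppler_hz * SPEED_OF_LIGHT / (2 * carrier_hz)

# A hand ~15 cm from the sensor reflects the wave after about 1 nanosecond.
print(range_from_delay(1e-9))  # ~0.15 m

# A finger moving at 0.1 m/s shifts a 60 GHz carrier by roughly 40 Hz.
print(2 * 0.1 * CARRIER_HZ / SPEED_OF_LIGHT)  # ~40 Hz
```

The numbers hint at why millimetre-wave frequencies suit fine gestures: at 60 GHz the wavelength is about 5 mm, so even small finger motions produce measurable phase and Doppler changes.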
After mapping the object, Soli can track and recognize hand gestures, including fine finger movements. To achieve this, Google created a new radar detection paradigm with custom hardware, software and algorithms.
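To give a feel for how tracked readings might become gestures, here is a deliberately simplified toy classifier. It is not Google's algorithm, which relies on custom hardware and learned models over rich radar signatures; it is only a heuristic sketch over hypothetical (range, velocity) samples:

```python
def classify(readings):
    """readings: list of (range_m, radial_velocity_m_s) samples over time.
    Returns a coarse gesture label based on simple heuristics."""
    ranges = [r for r, _ in readings]
    velocities = [v for _, v in readings]
    span = max(ranges) - min(ranges)
    if span < 0.005 and max(abs(v) for v in velocities) < 0.05:
        return "hold"        # hand nearly static in front of the sensor
    if span >= 0.02 and ranges[-1] < ranges[0]:
        return "approach"    # hand moving toward the sensor
    if span >= 0.02 and ranges[-1] > ranges[0]:
        return "retreat"     # hand moving away from the sensor
    return "micro-gesture"   # small, fast motion such as a finger rub

print(classify([(0.10, 0.00), (0.10, 0.01), (0.10, -0.01)]))  # hold
print(classify([(0.15, -0.5), (0.10, -0.5), (0.05, -0.5)]))   # approach
```

The real pipeline replaces these hand-written thresholds with machine-learned recognition, which is what makes sub-millimetre gestures like a thumb-and-forefinger "pinch" feasible.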
The project is also working on the creation of a “universal gesture interaction language” that allows users, regardless of the type of device, to control functionalities with the same set of gestures.
Project Soli on Pixel 4
Over the past few days, we have witnessed the arrival of the first Soli leaks, with the Google Pixel 4 as protagonist.
We have already received not one but two separate confirmations of the design of this pair of Google phones, from PriceBaba and Unbox Therapy. Now, new information suggests that the phone will be responsible for debuting Project Soli, one of the most innovative technologies Google has worked on over the past few years.
Those not too familiar with Google's "Soli" project should know that it is a technology that would enable control of electronic devices without physical contact, entirely through gestures. The approach is similar to the one LG uses in its G8 series models.
Pixel 4 Leaks with Project Soli
9to5Google, a source close to the company's plans for the Pixel 4, states that the phone will arrive with a chip based on Soli technology, although the purpose of this addition is still unknown.
Meanwhile, the XDA-Developers portal has uncovered information in the code of the latest Android Q beta that points to the inclusion of the Soli-based chip in the Pixel 4.
Apparently, both the third and fourth beta of Android Q include text strings that refer to the “Aware” function. This function, currently unavailable, would allow gestures to be performed to control the device without touching it – such as silencing it or changing the song.
In addition, to function properly, these gestures would require a specific sensor that is not present in any current Pixel model. The aforementioned chip may therefore be what powers this system, through the very "Aware" sensor referenced in the Android Q code.
Furthermore, an alleged leak published by GSMArena claimed to show the new Pixel 4 with a hole-punch screen and reduced bezels on all four sides. That information completely contradicts what 9to5Google and XDA-Developers report today.
From what we have learned so far, the Google Pixel 4 would not add a hole in the screen, as those images suggest, for the sake of fashion; rather, the front would house an imaging system of up to five sensors, including the "Aware" sensor mentioned above. Even so, the phones would have a somewhat more pronounced top bezel than most current flagships.
To top it off, the renowned leaker OnLeaks, who has accumulated accurate leaks on new devices from the main manufacturers and was also responsible for bringing the first renders of the Pixel 4 series to light, said on Twitter that the model shared by GSMArena does not correspond to the design of the new Google phones.
When is Project Soli Available on the Market?
We do not know how long it will take for Project Soli to consolidate into a commercial product, and it is still too early to know exactly how it will work. However, it is an interesting project that could bring numerous advantages to people with low mobility or poor vision. Among the tests based on Soli is SoliType.
Among the reasons its FCC approval was delayed is a dispute over the specific frequency range. In March of last year, Google asked whether Soli could operate between 57 and 64 GHz, a range consistent with European standards. At the time, Facebook raised concerns that this change could cause interference with other existing technologies.
In the end, the two companies reached an agreement: Project Soli will be able to operate at a higher frequency than previously established for this type of technology, but lower than the one Google initially proposed.
Unless the company changes its usual launch schedule, Project Soli on the Pixel 4 should arrive sometime next October. So, are you curious about Project Soli? Follow us for the latest news on the project!