Google is working on new technology that can read your body language without using cameras

There’s no denying it: automation is the future. Imagine a world where your TV pauses the movie you’re watching when it detects you’ve gotten up to get another bowl of popcorn, then resumes playback when you return. Or a computer that detects you’re stressed at work and starts playing soft, relaxing music.

As futuristic as these ideas sound, much of this is already possible today. One of the main reasons it hasn’t taken off, however, is that these systems rely on cameras to record and analyze user behavior, and that raises serious privacy concerns. Many people are understandably wary of their computers and smartphones keeping tabs on them.

Google is currently working on a new system that records and analyzes user movements and behavior without using cameras. Instead, the technology uses radar to read your body’s movements, infer your mood and intentions, and act on them.

The basic idea is that a device uses radar to build spatial awareness of the room, monitors that space for changes, and then issues instructions matching what the user would likely want the system to do.

This isn’t the first time Google has toyed with the idea of using spatial awareness as an input for its devices. In 2015, Google unveiled the Soli sensor, which uses radar electromagnetic waves to pick up precise gestures and movements. Google first shipped the sensor in the Pixel 4, where it enabled simple hand gestures for inputs like snoozing alarms, pausing music, and taking screenshots. Google has also used the radar-based sensor in the Nest Hub smart display to track the movements and breathing patterns of someone sleeping nearby.

Research building on the Soli sensor now allows computers to recognize our everyday movements and respond to them in new ways.

The new study focuses on proxemics, the study of how people use the space around them to mediate social interactions. The research treats devices such as computers and phones as if they had a personal space of their own.

When something changes within that personal space, the radar detects it and triggers an action. A computer could wake as you approach it, for example, without you having to press a button.
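Google hasn’t published an API for this system, but the behavior described above can be sketched in a few lines. The snippet below is purely illustrative: it assumes a hypothetical stream of radar distance readings (in meters) and applies a simple hysteresis rule, waking the device when a person enters its "personal space" and sleeping it only once they have clearly left, so the state doesn’t flicker at the boundary.

```python
# Illustrative sketch only -- the thresholds and the distance stream are
# assumptions, not part of Google's actual system.
ENTER_M = 1.0   # entering the device's "personal space" wakes it
EXIT_M = 2.0    # a wider exit threshold avoids flicker at the boundary

def device_states(distances, awake=False):
    """Map a stream of radar distance readings to awake/asleep states."""
    states = []
    for d in distances:
        if not awake and d <= ENTER_M:
            awake = True    # user approached: wake the screen
        elif awake and d >= EXIT_M:
            awake = False   # user walked away: pause / sleep
        states.append(awake)
    return states

# Someone walks up to the device, lingers, then leaves again.
readings = [3.0, 2.4, 0.8, 0.9, 1.6, 2.5]
print(device_states(readings))  # → [False, False, True, True, True, False]
```

The two-threshold design is the key detail: with a single cutoff, someone hovering right at the edge of the zone would toggle the device on and off repeatedly.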


The final frontier for large-scale automation has been the private sphere: end users and households. If Google can finalize this technology and bring it mainstream, it will be a massive victory for automation.

Originally published by 45secondes.
