Nonverbal Machine Language

Shervin Pishevar
1 min read · Mar 11, 2022

Machines that read body language have been around for a while and are being developed to help people with autism, dyslexia, and depression. These machines have even more potential for entrepreneurs who can visualize new uses for them.

Back in 2017, Carnegie Mellon University published a story about machines that read nonverbal cues. Researchers at the university’s Robotics Institute enabled a computer to understand the body poses and movements of multiple people from video in real time using Panoptic Studio, a two-story dome embedded with 500 video cameras.

The idea behind reading body language (a.k.a. “interaction language”) is that, through AI, machines would become perceptive of the people around them by understanding nonverbal cues. Human beings do this all the time, with varying levels of skill. Some scientists call this “emotional intelligence”: the subconscious noticing and analysis of nonverbal cues, including facial expressions and head and hand movements, which humans regularly use to determine the emotional states of others.

Some people seem to have a knack for reading nonverbal cues. Others, for a variety of reasons (genetic or environmental), seem to have a harder time.

Can you imagine a car that knows when you need to hear your favorite song? A thermostat that recognizes you are in a room with people who are upset and need to cool down a little? A pair of glasses that lets you know when someone is paying attention to your business pitch?

Join the conversation with me on Twitter @shervin.


Shervin Pishevar

Co-founder Sofreh Capital, Virgin Hyperloop, Sherpa, Webs, JamCity. VC in Uber, Airbnb, PillPack, Slack, Dollar Shave Club, Warby Parker, MZ, Tumblr, Robinhood.