MIT’s AI Uses Radio Signals To See People Through Walls

Researchers at the Massachusetts Institute of Technology have developed software that uses wifi signals to monitor the movements, breathing, and heartbeats of people on the other side of walls. While the researchers say this new tech could be used in areas like remote healthcare, it could in theory be put to more dystopian uses. Inverse reports: “We actually are tracking 14 different joints on the body [...] the head, the neck, the shoulders, the elbows, the wrists, the hips, the knees, and the feet,” said Dina Katabi, a professor of electrical engineering and computer science at MIT. “So you can get the full stick-figure that is dynamically moving with the individuals that are obstructed from you — and that’s something new that was not possible before.” The technology works a little like radar, but to teach their neural network to interpret these granular bits of human activity, the team at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) had to create two separate A.I.s: a student and a teacher.

[T]he team developed one A.I. program that monitored human movements with a camera on one side of a wall and fed that information to their wifi X-ray A.I., called RF-Pose, as it struggled to make sense of the radio waves passing through the wall from the other side. The research builds on a longstanding project at CSAIL led by Katabi, which aims to use this wifi tracking to passively monitor the elderly and automatically alert EMTs and medical professionals if a person were to fall or suffer some other injury. For more information, a press release and video about the software are available.
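The training setup described above is a form of cross-modal teacher-student supervision: a camera-based "teacher" produces pose labels, and the radio-based "student" learns to reproduce them from RF input alone. The sketch below illustrates that idea only in miniature, with entirely synthetic data and a simple linear student trained by gradient descent; the variable names, dimensions, and model are illustrative assumptions, not the actual RF-Pose architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: 14 tracked joints, each with (x, y) coordinates.
n_samples, rf_dim, n_joints = 200, 32, 14
out_dim = n_joints * 2

# Synthetic "radio features" and a hidden mapping standing in for real physics.
rf_features = rng.normal(size=(n_samples, rf_dim))
true_map = rng.normal(size=(rf_dim, out_dim))

# The camera-based teacher's pose estimates serve as training labels.
teacher_poses = rf_features @ true_map + 0.01 * rng.normal(size=(n_samples, out_dim))

def mse(weights):
    """Mean squared error between student predictions and teacher labels."""
    return float(np.mean((rf_features @ weights - teacher_poses) ** 2))

# Student: a linear model fitted to the teacher's labels by gradient descent.
weights = np.zeros((rf_dim, out_dim))
initial_loss = mse(weights)
lr = 0.01
for _ in range(500):
    grad = 2 * rf_features.T @ (rf_features @ weights - teacher_poses) / n_samples
    weights -= lr * grad
final_loss = mse(weights)

print(f"teacher-supervised student loss: {initial_loss:.3f} -> {final_loss:.5f}")
```

At deployment time the teacher is no longer needed: once trained, the student predicts poses from radio input alone, which is what lets the real system work through walls where no camera can see.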
