Artificial intelligence can sense through walls too
MIT researchers used artificial intelligence to teach wireless devices to sense people’s postures and movement, even from behind a wall
A team of Massachusetts Institute of Technology (MIT) researchers has used artificial intelligence (AI) to teach wireless devices to sense people’s postures and movement, even from behind a wall.
In their project, christened ‘RF-Pose’, Professor Dina Katabi of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and her team used an artificial neural network (which imitates neurons in the human brain) to analyse radio signals that bounce off people’s bodies, and then generate a dynamic stick figure that walks, stops, sits and moves its limbs as the person performs those actions.
The team believes the system could be used to monitor diseases like Parkinson’s and multiple sclerosis (MS), providing a better understanding of disease progression and allowing doctors to adjust medications accordingly. It could also help elderly people live more independently, while providing the added security of monitoring for falls, injuries and changes in activity patterns.
The team is working with doctors to explore multiple applications in healthcare.
“We’ve seen that monitoring patients’ walking speed and ability to do basic activities on their own gives healthcare providers a window into their lives that they didn’t have before, which could be meaningful for a whole range of diseases,” said Katabi, who co-wrote a new paper about the project. The team says that RF-Pose could also be used for new classes of video games where players move around the house, or even in search-and-rescue missions to help locate survivors, according to a 12 June press statement.
One challenge the researchers had to address is that most neural networks are trained using data labelled by hand. Radio signals, however, cannot be easily labelled by humans. To address this, the researchers collected examples using both their wireless device and a camera. They gathered thousands of images of people doing activities like walking, talking, sitting, opening doors and waiting for elevators. They then used these camera images to extract stick figures, which they showed to the neural network along with the corresponding radio signal. This combination of examples enabled the system to learn the association between the radio signal and the stick figures of the people in the scene.
Post-training, RF-Pose was able to estimate a person’s posture and movements without cameras, using only the wireless reflections that bounce off people’s bodies.
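The training scheme described above, sometimes called cross-modal supervision, can be sketched in a few lines. The snippet below is a minimal illustration, not the MIT team’s actual model: it uses synthetic random data in place of real radio reflections, camera-derived joint coordinates as the training labels, and a simple least-squares fit in place of a deep neural network. All array shapes and variable names are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)

N_FRAMES, RF_DIM, N_JOINTS = 500, 64, 14  # illustrative sizes

# Synthetic stand-ins: in the real project, `rf` would come from the
# wireless device and `keypoints` from stick figures extracted from
# synchronised camera images.
hidden_map = rng.normal(size=(RF_DIM, N_JOINTS * 2))
rf = rng.normal(size=(N_FRAMES, RF_DIM))
keypoints = rf @ hidden_map + 0.01 * rng.normal(size=(N_FRAMES, N_JOINTS * 2))

# "Training": learn to predict the camera-derived keypoints from the
# radio features alone (the camera supplies labels, not input).
W, *_ = np.linalg.lstsq(rf, keypoints, rcond=None)

# "Inference": estimate a stick figure from a new radio frame, with no
# camera involved at all.
rf_new = rng.normal(size=(1, RF_DIM))
pose = (rf_new @ W).reshape(N_JOINTS, 2)  # (x, y) per joint
print(pose.shape)
```

The key point the sketch captures is that the camera is only needed during training; once the mapping from radio features to joint positions is learnt, posture can be estimated from the wireless reflections alone, including through walls.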
Katabi and her colleagues have been working on ways to use radio waves, like those emitted by wireless Internet routers, to track people inside buildings since these radio waves can travel through walls and bounce off people and objects in an adjacent room. In August 2015, Katabi had even demonstrated a low-powered radio signal device called Emerald at the White House.
The device, which emits radio signals much weaker than a cellphone, uses algorithms to recognise reflected radio energies coming from human bodies.