Intelligent carpet gives insight into human poses
The sentient Magic Carpet from “Aladdin” might have a new competitor. While it can’t fly or speak, a new tactile sensing carpet from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) can estimate human poses without using cameras, in a step toward improving self-powered personalized health care, smart homes, and gaming.
Many of our daily activities involve physical contact with the ground: walking, exercising, or resting. These embedded interactions contain a wealth of information that helps us better understand people’s movements.
Previous research has leveraged single RGB cameras (think Microsoft Kinect), wearable omnidirectional cameras, and even plain old off-the-shelf webcams, but with the inevitable byproducts of camera occlusions and privacy concerns.