Thanks to another innovation straight out of MIT's Media Lab, cuddly interactive robots should soon be roaming the halls of hospitals, simultaneously creeping people out and helping the sick get better.
MIT's Huggable robot, even in its current unfinished state, is a technical marvel of communication and data gathering.
The bear is lined with more than 1,500 sensors for direct data transmission, has cameras for eyes, an array of microphones in the ears, and an internal wireless PC. The ear mics give the robot 'sound localization,' so it can tell where a person is and interact with them directly in space, but the visual context from the cameras is even more impressive.
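The article doesn't say how the localization is done, but a classic two-microphone technique is to estimate the time difference of arrival between the ears and convert it into a bearing. A minimal sketch, assuming a 12 cm mic spacing and a 16 kHz sample rate (both made-up numbers, not Huggable specs):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s
MIC_SPACING = 0.12      # assumed distance between the ear mics, metres
SAMPLE_RATE = 16000     # assumed sample rate, Hz

def estimate_delay(left, right):
    """Find the sample lag that best aligns the two ear signals
    (brute-force cross-correlation over physically possible lags)."""
    max_lag = int(MIC_SPACING / SPEED_OF_SOUND * SAMPLE_RATE) + 1
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(sample * right[i + lag]
                    for i, sample in enumerate(left)
                    if 0 <= i + lag < len(right))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def bearing_degrees(lag):
    """Convert an inter-ear sample lag into a rough direction of arrival."""
    tdoa = lag / SAMPLE_RATE
    ratio = max(-1.0, min(1.0, tdoa * SPEED_OF_SOUND / MIC_SPACING))
    return math.degrees(math.asin(ratio))
```

With two mics this only resolves the angle in a half-plane (front/back is ambiguous), which is one reason a real robot would also turn its head toward the estimated direction.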
Because the cameras in the bear's eyes have a narrow field of vision, Media Lab personnel created a visual interface called a 'stale panorama' to extend its spatial awareness. Whenever the robot is placed in a new room, it automatically pans its head around and captures separate video frames. The robot's software then stitches the frames together into one large canvas, and combined with an automatic face-detection feature, this lets the bear bot naturally follow a person around the room, face to face.
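The sweep-and-stitch idea can be sketched in a few lines: keep a canvas of frames indexed by the head pan angle at capture time, then scan the canvas for a face and return the angle to turn to. This is a toy model, not MIT's implementation; `camera` and `face_detector` are hypothetical callables standing in for real hardware and a real detector:

```python
FRAME_WIDTH = 40   # assumed per-frame field of view, in degrees
panorama = {}      # pan angle (degrees) -> frame captured at that angle

def capture_sweep(camera, pan_angles):
    """Point the head at each angle, grab a frame, add it to the canvas."""
    for angle in pan_angles:
        panorama[angle] = camera(angle)

def find_face(face_detector):
    """Scan the stitched canvas and return the pan angle whose frame
    contains a face, so the head can turn straight toward the person."""
    for angle, frame in sorted(panorama.items()):
        if face_detector(frame):
            return angle
    return None
```

The payoff of the "stale" canvas is that the robot can look toward a person who is currently outside the camera's narrow field of view, at the cost that the stored frames may be out of date until the next sweep.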
It also has an inertial measurement unit (IMU) that lets the bear know how it is tilted and oriented. But it's the sensitivity of the sensors that is probably most interesting for the future of touch-based physical applications.
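A standard way an IMU recovers tilt when the body is at rest is from the accelerometer's reading of the gravity vector. A minimal sketch (the axis convention here is an assumption, not Huggable's):

```python
import math

def tilt_from_accel(ax, ay, az):
    """Recover pitch and roll (in degrees) from a static accelerometer
    reading of gravity: (0, 0, 1) means the bear is sitting upright."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

Gravity alone can't give heading (rotation about the vertical axis), which is why full IMUs add gyroscopes and often magnetometers.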
Made of silicon, the sensors record electric field, force, and temperature data every time the bear's body parts are touched. It recognizes nine classes of affective touch: tickling, patting, poking, scratching, rubbing, squeezing, slapping, petting, and light contact. For each of these, the bear is supposed to mimic a human-like response, eventually building an extensive database of appropriate replies and improving its function.
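One simple way to picture the touch-classification step is a nearest-centroid classifier over features extracted from the skin sensors. The prototypes below use made-up (force, duration, repetition-rate) values for a few of the nine classes purely as illustration; the real robot presumably learns its classes from recorded sensor data:

```python
# Hand-picked (force, duration in s, repetitions per s) prototypes.
# These numbers are illustrative assumptions, not measured data.
PROTOTYPES = {
    "poking":    (0.8, 0.1, 1.0),
    "patting":   (0.5, 0.2, 3.0),
    "petting":   (0.3, 1.5, 1.0),
    "squeezing": (0.9, 1.0, 0.5),
    "tickling":  (0.2, 0.8, 6.0),
}

def classify_touch(force, duration, rate):
    """Label a touch event with the class of its nearest prototype
    (squared Euclidean distance in feature space)."""
    def dist(proto):
        return sum((a - b) ** 2 for a, b in zip((force, duration, rate), proto))
    return min(PROTOTYPES, key=lambda name: dist(PROTOTYPES[name]))
```

Once a touch is labeled, picking a response is just a lookup from class to behavior, which is where the "database of appropriate replies" the article mentions would come in.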
Since touch therapy has proven effective for sick patients, the bear is currently being eyed as a potential robotic animal companion in hospitals.