Wednesday, May 09, 2007

Social affective touch for robots

My past research on psychohaptics and taptap at the Tangible Media Group, MIT Media Laboratory, involved studying the perception of touch and how people relate affectively to haptic sensations. Today, Dr. Cynthia Breazeal, who leads the Robotic Life research group at the MIT Media Laboratory, addressed the question of social affective touch. Her main theme is building cooperative machines that work and learn in partnership with people.
Today at the h2.0 event -new minds, new bodies, new identities- she presented her latest research, including a robotic companion. The Robotic Life group explores social affective touch and develops machine learning algorithms to categorize different kinds of touch, from pleasant to annoying. How robots can sense and interpret touch is a new area of research.
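The touch-categorization idea can be sketched as a toy classifier over touch-sensor features. Everything below is an assumption for illustration: the gesture labels, the three features (mean pressure, duration, contact area), and the nearest-centroid method are invented here and are not taken from the actual Robotic Life system.

```python
# Hypothetical sketch: categorize touch gestures from pressure-sensor
# features with a nearest-centroid classifier. All labels, features, and
# numbers are invented for illustration.
import math

# Each sample: (mean_pressure, duration_s, contact_area) -> gesture label.
TRAINING = [
    ((0.2, 1.5, 0.6), "pet"),   # light, slow, broad contact
    ((0.3, 2.0, 0.7), "pet"),
    ((0.8, 0.2, 0.1), "poke"),  # hard, brief, small contact
    ((0.9, 0.3, 0.1), "poke"),
    ((0.6, 3.0, 0.9), "hug"),   # firm, long, large contact
    ((0.7, 2.5, 0.8), "hug"),
]

def centroids(samples):
    """Average the feature vectors for each label (the 'model')."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {lbl: tuple(s / counts[lbl] for s in acc)
            for lbl, acc in sums.items()}

def classify(features, model):
    """Return the label whose centroid is closest in Euclidean distance."""
    return min(model, key=lambda lbl: math.dist(features, model[lbl]))

model = centroids(TRAINING)
print(classify((0.25, 1.8, 0.65), model))  # a gentle, sustained stroke -> pet
```

A real system would of course use richer sensor data and learned models, but the shape of the problem - map raw touch readings to an affective category - is the same.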
The robot takes the form of a bear, the bear being a universal symbol of comfort. It is covered with a silicone skin so that it does not feel like a machine when hugged. In the first experiments, the robot had local intelligence: local sensing, response, and coordination. An operator could also communicate through the bear by manipulating a 3D bear model on a screen.
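One way to picture the operator setup is a blend between remote commands and local autonomy: the operator poses the 3D model, and the physical bear mixes that pose with its own local motion. This is only a sketch of the idea; the function, joint names, and blending rule are assumptions, not the actual system's design.

```python
# Hypothetical sketch: blend an operator's commanded pose (from the 3D
# bear model on screen) with the robot's local autonomous pose, so the
# bear still shows some local responsiveness. Names and weights invented.

def blend_pose(operator_pose, local_pose, operator_weight=0.7):
    """Blend remote commands with local motion, joint by joint (degrees)."""
    return {joint: operator_weight * operator_pose[joint]
                   + (1 - operator_weight) * local_pose.get(joint, 0.0)
            for joint in operator_pose}

# Operator raises the bear's left arm on the 3D model, while locally the
# bear is partway through a small idle motion.
operator_pose = {"left_shoulder": 90.0, "head_tilt": 10.0}
local_pose = {"left_shoulder": 20.0, "head_tilt": 0.0}
print(blend_pose(operator_pose, local_pose))
# {'left_shoulder': 69.0, 'head_tilt': 7.0}
```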

Applications of this work could let family members play together from anywhere, bridge the usual generation gap between grandparents and grandchildren, and allow people to remain in contact while separated. This matters especially when a child is spending a day alone at the hospital and the parents can stay with the child remotely.
Another, broader vision for this work is a collaborative partnership with a remote tutor. Cynthia mentioned that the bear could eventually be highly trained and offer high-quality education to people who are not usually exposed to it.
