Thursday, April 21, 2011
Monday, April 18, 2011
Marcelo Coelho, Lyndl Hall and Joanna Berzowska have developed a series of techniques for building sensors, actuators and circuit boards that behave, look, and feel like paper.
By embedding electro-active inks, conductive threads and smart materials directly into paper during the papermaking process, they have created seamless composites that support new and unexpected applications in ubiquitous and pervasive computing at affordable cost.
Posted by Cati Boulanger at 7:21 am
Thursday, April 14, 2011
Posted by Cati Boulanger at 7:14 pm
Relief, created by Daniel Leithinger, Adam Kumpf and Hiroshi Ishii, is an actuated tabletop display that can render and animate three-dimensional shapes with a malleable surface. A direct extension of this work, Recompose, created by Matthew Blackshaw, Anthony DeVincenzi, Dávid Lakatos, Daniel Leithinger and Hiroshi Ishii, adds gesture control to the actuated surface.
By collectively utilizing the body as a tool for direct manipulation alongside gestural input for functional manipulation, we show how a user is afforded unprecedented control over an actuated surface. We describe a number of interaction techniques exploring the shared space of direct and gestural input, demonstrating how their combined use can greatly enhance creation and manipulation beyond unaided human capability.
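To make the pairing of direct and gestural input concrete, here is a minimal sketch of an actuated pin display where the hand pushes individual pins down while a gesture transforms a whole region. The class, method names, and grid size are illustrative assumptions, not the authors' actual implementation.

```python
class PinDisplay:
    """A grid of motorized pins whose heights render a 3D shape."""

    def __init__(self, rows, cols, max_height=100):
        self.max_height = max_height
        self.heights = [[0] * cols for _ in range(rows)]

    def press(self, r, c, depth):
        """Direct manipulation: the hand pushes one pin down."""
        self.heights[r][c] = max(0, self.heights[r][c] - depth)

    def raise_region(self, r0, r1, c0, c1, amount):
        """Gestural manipulation: a 'lift' gesture raises a whole region."""
        for r in range(r0, r1):
            for c in range(c0, c1):
                self.heights[r][c] = min(self.max_height,
                                         self.heights[r][c] + amount)

display = PinDisplay(8, 8)
display.raise_region(2, 6, 2, 6, 40)   # gesture: raise a 4x4 plateau
display.press(3, 3, 15)                # hand: dent one pin of that plateau
```

The point of the combination is scale: the hand shapes the surface at pin resolution, while gestures apply operations no single hand could perform across the whole grid at once.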
Posted by Cati Boulanger at 7:09 pm
Tuesday, April 12, 2011
The Junkyard Jumbotron lets you take a bunch of random displays and instantly stitch them together into a large, virtual display, simply by taking a photograph of them. It works with laptops, smartphones, tablets, anything that runs a web browser. It also highlights a new way of connecting a large number of heterogeneous devices to each other in the field, on an ad-hoc basis.
The Junkyard Jumbotron is designed by Rick Borovoy, Ph.D. and Brian Knep at MIT's Center for Future Civic Media.
Get the code to install it on your displays here.
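The core geometric step can be sketched simply: once the calibration photo has located each screen's bounding box, the system maps those photo coordinates onto the target image so each device shows only its slice. The function and coordinate conventions below are assumptions for illustration, not the project's actual code.

```python
def crops_from_photo(boxes, image_w, image_h):
    """boxes: {device_id: (x0, y0, x1, y1)} bounding boxes in photo pixels.
    Returns {device_id: (x0, y0, x1, y1)} crops in target-image pixels."""
    # The virtual display spans the union of all detected screens.
    min_x = min(b[0] for b in boxes.values())
    min_y = min(b[1] for b in boxes.values())
    max_x = max(b[2] for b in boxes.values())
    max_y = max(b[3] for b in boxes.values())
    span_w, span_h = max_x - min_x, max_y - min_y

    crops = {}
    for dev, (x0, y0, x1, y1) in boxes.items():
        # Scale each screen's photo position into image coordinates.
        crops[dev] = (
            round((x0 - min_x) / span_w * image_w),
            round((y0 - min_y) / span_h * image_h),
            round((x1 - min_x) / span_w * image_w),
            round((y1 - min_y) / span_h * image_h),
        )
    return crops

# A laptop and a phone lying side by side in the calibration photo:
boxes = {"laptop": (0, 0, 400, 300), "phone": (410, 120, 560, 300)}
print(crops_from_photo(boxes, 1120, 600))
```

Because the mapping is computed from a single photo, rearranging the screens only requires taking a new one.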
Posted by Cati Boulanger at 7:41 am
Sunday, April 10, 2011
Can we create a device that makes people aware of their early cataract condition? Using a light-field display, the researchers' method projects time-dependent patterns onto the fovea. Interactive software measures the visibility and point spread function across subapertures of the crystalline lens. By repeating this procedure for several light paths, the cataract's size, position, density, and scattering profile are estimated.
Created by the MIT Media Lab's Camera Culture research group (Vitor Pamplona, Erick Passos, Jan Zizka, Manuel M. Oliveira, Everett Lawson, Esteban Clua and Ramesh Raskar), CATRA uses a forward-scattering technique that allows the user to respond to what they visually experience.
Their device scans the lens section by section. The user views the projected patterns and presses a few buttons to map the light attenuation in each section of the eye. The device collects this information into an attenuation map of the entire lens, allowing individuals to monitor how the severity of their cataract progresses.
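A minimal sketch of this mapping step, assuming the device records one user response per lens subaperture; the grid size and response format are assumptions for illustration, not CATRA's actual data model.

```python
def build_attenuation_map(responses, grid=(5, 5)):
    """responses: {(row, col): visibility in [0, 1]} for each subaperture
    the user has tested. Returns a grid where 0 = clear, 1 = fully
    attenuated, and None = not yet scanned."""
    rows, cols = grid
    amap = [[None] * cols for _ in range(rows)]
    for (r, c), visibility in responses.items():
        # Attenuation is the complement of the reported visibility.
        amap[r][c] = round(1.0 - visibility, 3)
    return amap

# Three subapertures scanned so far; the central one is the cloudiest.
responses = {(0, 0): 0.95, (2, 2): 0.40, (4, 4): 0.80}
amap = build_attenuation_map(responses)
```

Storing one such map per session is what would let a patient track the cataract's progression between visits.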
The maps capture a full point spread function of the lens, allowing the researchers to simulate a cataract-affected subject's visual perception over time. Early cataract onset is difficult to diagnose; this device measures cataracts with a highly portable setup that collects quantifiable data, making it well suited to tackling this global health problem in the developing world.
Posted by Cati Boulanger at 7:12 pm
BodyNotes was created by the MIT Media Lab. It contributes to a series of projects that enhance the interaction between patients and healthcare practitioners. BodyNotes is a mobile tool that links anatomical landmarks to physical objects as a means for a patient to discuss body pain with her doctor.
The video introduces Anna, an amputee who uses BodyNotes to track the pain and comfort she feels throughout the day when wearing her prosthetic limb. Anna can view visualizations and summaries of her reported pain by location, intensity, time, duration and activity.
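The kind of log behind such summaries can be sketched simply: each entry records location, intensity, duration, and activity, and reports group the entries by any of those fields. The field names and schema below are assumptions for illustration, not BodyNotes' actual data model.

```python
from collections import defaultdict

def summarize_by(entries, field):
    """Average pain intensity grouped by one field of each log entry."""
    groups = defaultdict(list)
    for entry in entries:
        groups[entry[field]].append(entry["intensity"])
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

log = [
    {"location": "knee", "intensity": 6, "activity": "walking", "duration_min": 20},
    {"location": "knee", "intensity": 4, "activity": "sitting", "duration_min": 45},
    {"location": "socket", "intensity": 7, "activity": "walking", "duration_min": 20},
]
print(summarize_by(log, "location"))  # {'knee': 5.0, 'socket': 7.0}
```

Grouping the same entries by activity instead would show, for example, whether walking is consistently more painful than sitting.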
This data is also accessible to Anna’s prosthetist. BodyNotes also allows remote, real-time collaboration.
During their session, Anna uses a photo to indicate the exact points where she feels pain. Since the prosthetist sees the same screen and the interaction is synchronized, he can show Anna what adjustments she could make on her own.
BodyNotes enables advanced logging and telemedicine functionality on mainstream mobile phones. It has the potential to improve health care and communication, letting patients and doctors make better use of their time by reducing the need for office visits.
Posted by Cati Boulanger at 6:55 pm