Still unpersuaded that, as I said below, students are pointing the way to our future? Try this on for size and watch the presentation (thanks to the Experience Economist for the heads-up):
"Students at the MIT Media Lab have developed a wearable computing system that turns any surface into an interactive display screen. The wearer can summon virtual gadgets and internet data at will, then dispel them like smoke when they're done. Pattie Maes of the lab's Fluid Interfaces group said the research is aimed at creating a new digital "sixth sense" for humans.I know these are hyper-smart MIT students and, yes the interface is still a little clunky, but get ready to imagine the health care possibilities. They'll be here before you know it. And, really, watch the presentation.
"In the tactile world, we use our five senses to take in information about our environment and respond to it, Maes explained. But a lot of the information that helps us understand and respond to the world doesn't come from these senses. Instead, it comes from computers and the internet. Maes' goal is to harness computers to feed us information in an organic fashion, like our existing senses.
"The prototype was built from an ordinary webcam and a battery-powered 3M projector, with an attached mirror -- all connected to an internet-enabled mobile phone. The setup, which costs less than $350, allows the user to project information from the phone onto any surface -- walls, the body of another person or even your hand.
"Maes showed a video of her student Pranav Mistry who she describes as the brains behind the project. Mistry wore the device on a lanyard around his neck, and colored Magic Marker caps on four fingers (red, blue, green and yellow) helped the camera distinguish the four fingers and recognize his hand gestures with software that Mistry created. The gestures can be as simple as using his fingers and thumbs to create a picture frame that tells the camera to snap a photo, which is saved to his mobile phone. When he gets back to an office, he projects the images onto a wall and begins to size them.
"When he encounters someone at a party, the system projects a cloud of words on the person's body to provide more information about him -- his blog URL, the name of his company, his likes and interests. "This is a more controversial [feature]," Maes said over the audience's laughter.
"In another frame, Mistry picks up a boarding pass while he's sitting in a car. He projects the current status of his flight and gate number he's retrieved from the flight-status page of the airline onto the card.
"If you need to know what time it is, it's as simple as drawing a watch on your arm," Maes said, while Mistry used his right finger to draw a circle on his left wrist. The face of a watch popped up on his hand, which the audience liked.
"When Mistry folds his hands in "namaste" fashion, the system opens a menu to allow him to choose an application. If he wants to read e-mail on his phone, he draws an @ symbol in the air with his finger. He can project a phone pad onto his palm and dial a number without removing the phone from his pocket. As he reads the newspaper on the subway he can project a video onto the page that provides more information about the topic he's reading.
"Maes and Mistry told Wired they've been working on the project for four months, day and night, and have filed a patent for it.
"Maes' MIT group, which includes seven graduate students, were thinking about how a person could be more integrated into the world around them and access information without having to do something like take out a phone. They initially produced a wristband that would read an RFID tag to know, for example, which book a user is holding in a store."