Piezoelectric materials work quite simply, in theory — motion in, electricity out, or vice versa — and since that’s just how speakers and microphones transmit their sound, it’s not much of a stretch to imagine someone would figure out audio on a micron scale. That someone is MIT’s Yoel Fink, who has reportedly engineered a marvelous process for producing fibers that can detect and emit sound. Following up on their famous work on flexible cameras, Fink’s team discovered they could keep piezoelectric strands rigid enough to produce audible vibrations by inserting graphite, AKA pencil lead. Better yet, the lab process can apparently make the threads at a fairly large scale, “yielding tens of metres of piezoelectric fibre” in a single draw. The potential for fabric made from such fibers is fantastic, of course — especially combined with this particular scientist’s previous research into camera cloth.
In a magic trick that only geeks can pull off, researchers at MIT have found a method to let users click and scroll exactly the same way they would with a computer mouse, without the device actually being there.
Cup your palm and move it around on a table, and a cursor hovers across the screen. Tap on the table as you would click a real mouse, and the computer responds. It’s one step beyond cordless. It’s an invisible mouse.
The project, called “Mouseless,” uses an infrared laser beam and camera to track the movements of the palm and fingers and translate them into computer commands.
“Like many other projects in the past, including the Nintendo Power Glove and the Fingerworks iGesture Pad, this attempts to see how we can use new technology to control old technology,” says Daniel Wigdor, a user experience architect for Microsoft who hasn’t worked directly on the project. “It’s just an intermediate step to where we want to be.”
Though new user interfaces such as touchscreens and voice recognition systems have become popular, the two-button mouse still reigns among computer users. Many technology experts think the precision pointing that a cursor offers is extremely difficult to replicate through technologies such as touch and speech.
Last week, Intel CTO Justin Rattner said that though Intel’s research labs are working on new touchscreen ideas, the mouse-and-keyboard combination is unlikely to be replaced in everyday computing for a long time.
In the case of the Mouseless project, the infrared laser and camera are embedded in the computer. When a user cups their hand as if a physical mouse were present under their palm, the laser beam illuminates the parts of the hand in contact with the table. The infrared camera detects this lit region and interprets its movements.
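A minimal sketch of the kind of tracking this implies: threshold the camera frame for infrared-lit pixels, take their centroid, and turn frame-to-frame motion of that centroid into cursor movement. The brightness threshold and the frame format are illustrative assumptions, not details from the Mouseless implementation.

```python
def ir_centroid(frame, threshold=200):
    """Return the (x, y) centroid of pixels brighter than `threshold`,
    or None if nothing is lit. `frame` is a list of rows of intensities."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def cursor_delta(prev_frame, next_frame, threshold=200):
    """Translate motion of the lit region between two frames into a
    cursor movement (dx, dy); (0, 0) if the hand is not visible."""
    a = ir_centroid(prev_frame, threshold)
    b = ir_centroid(next_frame, threshold)
    if a is None or b is None:
        return (0.0, 0.0)
    return (b[0] - a[0], b[1] - a[1])
```

A tap would then register as the lit region briefly vanishing and reappearing while the centroid stays put, though detecting that robustly is exactly the hard part the real system has to solve.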
A working prototype of the Mouseless system costs approximately $20 to build, says Pranav Mistry, who is leading the project.
Mistry is one of the star researchers in the area of creating new user experiences. He previously developed the “Sixth Sense” project, a wearable gestural interface that lets users wave their hands in front of them and interact with maps and other virtual objects — much like Tom Cruise in Minority Report.
The Mouseless idea is not as big a breakthrough as Sixth Sense. Though it is fun, it is difficult to see a real-world case for getting rid of hardware while keeping interaction the same. User interfaces are going beyond the point-and-click interaction that the computer mouse demands. And mouse hardware itself is cheap, so there’s not much of a cost saving here.
Check out this fun, partly animated video to see what Mouseless can do and how it works:
How does Jamie Zigelbaum, a former student at MIT Media Lab, celebrate freedom from tyranny, drool-worthy accents and “standing in the queue?” By creating Slurp, of course. In what’s easily one of the most jaw-dropping demonstrations of the year, this here digital eyedropper is a fanciful new concept that could certainly grow some legs if implemented properly in the marketplace. Designed as a “tangible interface for manipulating abstract digital information as if it were water,” Slurp can “extract (slurp up) and inject (squirt out) pointers to digital objects,” enabling connected machines and devices to have information transferred from desktop to desktop (or desktop to speakers, etc.) without any wires to bother with. We can’t even begin to comprehend the complexity behind the magic, but all you need to become a believer is embedded after the break. It’s 41 seconds of pure genius, we assure you.
Remember Bokodes, MIT’s tiny replacement for barcodes and the like? Their holographic nature enabled them to represent different information from different angles, and it’s this property that allows the tech behind them to be used in a very different and even more useful way: figuring out just how busted your vision is. The Camera Culture team at MIT’s Media Lab evolved that tech into a $2 box that refracts the image displayed on a smartphone screen. When combined with an app that displays a set of dots and lines, the user can manipulate the image until things look to be perfectly aligned. Once complete, the app spits out a prescription and you’re just a quick trip to your local mall-based eyeglasses joint away from perfect vision. The goal is to make it easier for optometrists in developing countries to quickly and easily find glasses for people, but an app that could save a trip to the doctor’s office is a wonderful thing regardless of where you are.
When looking for a cheap, reliable way to track gestures, Robert Wang and Jovan Popovic of MIT’s Computer Science and Artificial Intelligence Laboratory came upon this notion: why not paint the operator’s hands (or better yet, his Lycra gloves) in a way that lets the computer distinguish different parts of the hand from one another and from the background? Starting with something Howie Mandel might have worn in the ’80s, the researchers are able to use a simple webcam to track the hands’ locations and gestures — with relatively little lag. The glove itself is split into twenty patches in ten different colors, and while there’s no telling when this technology will be available to consumers, something tells us that when it does arrive it’ll be very hard not to notice. Video after the break.
Update: Just received a nice letter from Rob Wang, who points out that his website is the place to see more videos, get more info, and — if you’re lucky — one day download the APIs so you can try it yourself. What are you waiting for?
The Week in Green is a new item from our friends at Inhabitat, recapping the week’s most interesting green developments and clean tech news for us.
This week Inhabitat reported live from the scene of New York Design Week, where we sifted through thousands of new home furnishings and interiors products to bring you the state of the art in green design. Fresh from the floor of the International Contemporary Furniture Fair is this stunning hexagonal crystal LED light, which is composed of glowing geometric blocks that snap together to form a myriad of shapes. We were also impressed by this beautifully finished wood calculator that multiplies its green factor with sustainably sourced materials.
The past week was also surging with developments from the field of renewable energy – first we were excited to see the unveiling of the Oyster 2, an offshore wave-harvesting energy plant that improves upon its predecessor with a simpler design, fewer moving parts, and a 250% increase in energy generation. Google, HP, and Microsoft are also getting into the green energy game with plans to tap an unexpected energy source to run their data centers – cow dung! Google also led the charge towards cleaner energy this week by funding a new type of jet engine-inspired geothermal drill that uses superheated streams of water to bore through previously impenetrable surfaces.
Speaking of jets, MIT has just unveiled several ultra-efficient airplane designs that are capable of cutting fuel use by a whopping 70%. The auto industry also received a jolt of energy as Toyota announced a partnership with Tesla that will boost California’s flagging economy and likely lead to more affordable electric vehicles.
The field of wearable technology saw several innovative advancements this week as well – safe cyclists rejoice, because a group of Indian students have designed a $22 Solar and Wind Powered Bike Helmet. Meanwhile, a group of Colorado State University seniors have designed a medical incubator backpack unit that they believe can reduce baby deaths in medical emergencies.
Finally, we shined light on several brilliant advancements from the field of solar technology, starting with China’s plans to build the “biggest solar energy production base” in the world. We also looked at the HYDRA, a solar-powered hydrogen fuel cell system that can reportedly generate 20,000 gallons of pure water a day, and green energy got literal with the unveiling of the first leaf-shaped crystalline silicon solar panels.
Interacting with your computer by waving your hands may require just a pair of multicolored gloves and a webcam, say two researchers at MIT who have made a breakthrough in gesture-based computing that’s inexpensive and easy to use.
A pair of lycra gloves — with 20 irregularly shaped patches in 10 different colors — held in front of a webcam can generate a unique pattern with every wave of the hand or flex of a finger. That pattern can be matched against a database of gestures and translated into commands for the computer. The gloves cost about a dollar to manufacture, say the researchers.
“This gets the 3-D configuration of your hand and your fingers,” says Robert Wang, a graduate student in the computer science and artificial intelligence lab at MIT. “We get how your fingers are flexing.” Wang developed the system with Jovan Popović, an associate professor of electrical engineering and computer science at MIT.
The technology could be used in videogames where gamers could pick up and move objects using hand gestures and by engineers and artists to manipulate 3-D models.
“The concept is very strong,” Francis MacDougall, chief technology officer and co-founder of gesture-recognition company GestureTek, told Wired.com. “If you look at the actual analysis technique they are using, it is the same as what Microsoft has done with Project Natal for detecting human body position.” MacDougall isn’t involved with MIT’s research project.
MIT has become a hotbed for researchers working in the area of gestural computing. Last year, an MIT researcher showed a wearable gesture interface called the “SixthSense” that recognizes basic hand movements. Another recent breakthrough showed how to turn an LCD screen into a low-cost, 3-D gestural computing system.
The latest idea is surprisingly simple in its premise. The system hinges on a glove pattern distinctive enough that each gesture can be looked up quickly in a database.
For the design of their multicolored gloves, Wang and Popović tried to restrict the number of colors used so the system could reliably distinguish one color from another in different lighting conditions and reduce errors. The arrangement and shapes of the patches were chosen such that the front and back of the hand would be distinct.
Once the webcam captures an image of the glove, a software program crops out the background, so the glove alone is superimposed on a white background.
The program then reduces the resolution of the cropped image to 40 by 40 pixels. It searches a database of 40 x 40-pixel images of a hand, clad in the distinctive glove, in different positions. Once a match is found, it simply looks up the corresponding hand position.
Since the system doesn’t have to calculate the relative positions of the fingers, palm and back of the hand on the fly, it can be extremely quick, claim the researchers.
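The crop-downsample-lookup pipeline above boils down to a nearest-neighbor search. Here is a toy sketch of that last step, where the database maps flattened low-resolution glove images to pose labels; the pose names and the squared-Euclidean distance metric are illustrative assumptions, not the researchers’ actual implementation.

```python
def nearest_pose(query, database):
    """Return the pose label of the database entry whose flattened
    image vector is closest (smallest squared Euclidean distance)
    to the query vector."""
    best_pose, best_dist = None, float("inf")
    for image, pose in database:
        dist = sum((q - p) ** 2 for q, p in zip(query, image))
        if dist < best_dist:
            best_pose, best_dist = pose, dist
    return best_pose

# Two made-up database entries: tiny image vectors paired with poses.
database = [([0, 0, 1, 1], "fist"), ([1, 1, 0, 0], "open")]
```

Because the heavy lifting is done ahead of time when the database is built, the per-frame work is just this lookup, which is why the researchers can claim such low latency.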
And if the video is to be believed, the precision with which the system can gauge gestures including the flexing of individual fingers is impressive.
A challenge, though, is having enough processing power and memory so gestures made by a user can be looked up in a database quickly, says MacDougall.
“It takes hundreds of megabytes of pre-recorded posed images for this to work,” he says, “though that’s not so heavy in the computing world anymore.”
Another problem could be getting people to wear the gloves. Let’s face it: No one wants to look like Kramer in a fur coat from an episode of Seinfeld or an extra in the musical Joseph and the Amazing Technicolor Dreamcoat.
MacDougall says the pattern on the gloves can be tweaked to make them less obvious.
“If you want to make it more attractive, you could hide the patterns in a glove using retro-reflective material,” he says. “That way you could [create] differentiable patterns that wouldn’t be visible to the naked eye but a camera’s eye could see it.”
Wang and Popović aren’t letting issues like fashion dictate their research. They say they are working on designs for similarly patterned shirts.
Photo: Jason Dorfman/CSAIL Video: Robert Y. Wang/Jovan Popović
Finally, we were dazzled by two high-tech garments that harness LEDs to light up the night. Katy Perry recently took to the red carpet wearing a shimmering gown studded with thousands of blinking rainbow lights, and we were impressed by this LED-laden coat that keeps bicyclists safe when they hit the streets at night.
MIT researchers have found a way to use augmented reality to bring TVs and cellphones together so viewers can watch more than just what’s playing right in front of them.
The technology, called ‘Surround Vision,’ uses footage taken from different angles so when someone points their phone beyond the edge of the TV screen, they can see the additional content on their mobile device.
For instance, Surround Vision could allow a guest at a Super Bowl party to check out different camera angles of a play, without affecting what other guests see on the screen, says MIT. Or viewers could use it to see alternate takes of a scene while watching a movie.
“This could be in your home next year if a network decided to do it,” says Media Lab research scientist Michael Bove who’s working on the project.
Augmented reality tries to enhance the physical world by overlaying computer-generated elements on it. Over the last year, a number of apps designed especially for phones have emerged where all users have to do is point their phones at a physical object to get more information about it. MIT’s breakthrough extends that idea.
The Surround Vision prototype, built by MIT, adds a magnetometer (compass) to an existing phone, since the accelerometers included in many phones are not sensitive enough to detect the subtle motion of pointing a phone to the left or right of a TV screen.
And as MIT’s video below shows, the software incorporates the data gathered from the compass and integrates it with the phone’s other sensors so viewers get an enhanced picture.
To test it, Santiago Alfaro, a graduate student in the lab who’s leading the project, shot video footage of a street from three angles simultaneously. A TV plays the footage from the center camera. When a viewer points a phone directly at the TV, the same footage appears on the device’s screen. But if the phone is aimed to the right or the left, it switches to another perspective.
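The view-switching logic described above can be sketched as a simple comparison between the phone’s compass heading and the known direction of the TV. The 15-degree window for what counts as “pointing at the TV” is an illustrative assumption, not a figure from the project.

```python
def select_feed(phone_heading, tv_heading, window=15.0):
    """Pick which camera feed to show based on how far (in degrees)
    the phone is pointed left or right of the TV."""
    # Signed angular difference, wrapped into [-180, 180)
    diff = (phone_heading - tv_heading + 180.0) % 360.0 - 180.0
    if diff < -window:
        return "left"
    if diff > window:
        return "right"
    return "center"
```

With headings in compass degrees, pointing 30 degrees counterclockwise of the TV selects the left-camera footage, and small wobbles inside the window keep the center feed stable.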
If the system were commercialized, the video playing on the handheld device would stream over the internet, says Alfaro.
Over the next few months, Alfaro says MIT Media Lab will test the system using sports broadcasts and children’s shows.
It was an interesting week in green tech, as Inhabitat explored the past and future of solar technology. We dug up the world’s first modern solar panel (still working after 60 years!) and wrapped our brains around MIT’s plan to create super-efficient photovoltaic panels by folding them up like origami. Not to be outdone, IBM unveiled plans to roll out a new solar desalination system that could transform entire expanses of desert into rivers.
Solar power also took to the skies this week as the Solar Impulse plane made its first successful flight. And speaking of futuristic transportation, Minority Report-style podcars may be just around the corner if this solar powered urban transit system takes off. We were wowed by Finland’s new all-electric supercar, which will be vying for the Progressive Auto X Prize this summer.
We also took a look at several innovative kid-friendly designs including an incredible Game Boy made from paper and a biometric baby monitoring alarm clock that lets parents monitor their babies’ temperature and heart-rate remotely, as well as cue up lullabies from anywhere.
The past week also produced several promising developments from the realm of energy storage as Hitachi announced that it’s developing lithium-ion batteries that last twice as long. And finally, meet BOB, a battery the size of a building that is capable of powering an entire town in Texas. The gigantic sodium sulfur backup battery can store up to 4 megawatts of power for up to 8 hours.
This site is run by Sascha Endlicher, M.A., during ungodly late-night hours. Wanna know more about him? Connect via social media by jumping to about.me/sascha.endlicher.