It’s obvious that the great Sherlock Holmes doesn’t need any help, but the Augmented Magnifier designed and built by Anirudh Sharma and Pattie Maes of MIT’s Fluid Interfaces Group could help him instantly recognize clues and solve cases even more quickly.
Braille helps visually impaired people read, but a great deal of printed material is never converted to that writing system. Visually impaired people also miss out on mobile devices, since on-screen text can’t be felt. MIT’s Fluid Interfaces Group attempted to address this issue with its FingerReader prototype.
FingerReader is a ring-worn device that reads printed text aloud: a small camera captures the page while companion software recognizes the text and speaks it. The ring also has vibration motors that guide the wearer, buzzing when the finger veers off the line being scanned or reaches the end of the line. FingerReader can also be used to translate text, making it doubly useful.
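The guidance behavior described above can be sketched as a simple feedback loop. This is an illustrative reconstruction, not the actual FingerReader firmware; the function name, cue strings, and drift threshold are all hypothetical:

```python
# Illustrative sketch of FingerReader-style line-tracking feedback.
# All names and thresholds here are hypothetical, not from the real device.

def guidance_cue(line_offset_px, at_line_end, max_drift_px=12):
    """Decide which vibration cue to give the wearer.

    line_offset_px: vertical distance (pixels) of the fingertip from the
                    baseline of the text line currently being scanned.
    at_line_end:    True when the camera no longer sees text ahead.
    """
    if at_line_end:
        # prompt the user to drop down to the next line
        return "vibrate: end of line"
    if abs(line_offset_px) > max_drift_px:
        # steer the finger back toward the line being scanned
        return "vibrate: drift up" if line_offset_px > 0 else "vibrate: drift down"
    # on track: no haptic cue, keep reading the recognized text aloud
    return "no cue"
```

In practice the real system would derive the offset from the camera’s view of the text baseline; the point of the sketch is just that the haptics encode two distinct events, drifting off the line and running out of line.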
Of course it’s far from perfect and is just a research prototype at this point, but its inventors are not ruling out the possibility of developing FingerReader as an actual product. Head to the Fluid Interfaces Group’s website or read their FingerReader paper (pdf) for more info.
[via BGR]
At MIT’s Media Lab, researchers Roy Shilkrot, Jochen Huber and others are working on the “FingerReader,” a ring-like device that straps around your finger and reads printed text out loud with a synthesized voice, thanks to a mounted camera and heavily modified open-source software. The FingerReader’s voice is clipped and metallic.
At a symposium held by the American Society of Mechanical Engineers this week, a team of MIT engineers will present an idea that seems to tempt fate: A floating nuclear reactor, anchored out at sea, that would be immune to tsunamis and earthquakes. Is it really that crazy of a plan?
Deciding that the lowly building block was due for an upgrade, researchers at MIT have created something remarkable. The simple-looking M-Blocks are made from an aluminum frame filled with electronics, an electric motor that can spin up to 20,000 rpm, and a flywheel. And they can perform some amazing feats without any human intervention.
We were promised robots. The future, science fiction told us, would be a world swarming with automatons that did all the jobs we didn’t want. But you know what? Robots are really expensive and hard to build. Two MIT scientists want to change all that with inkjet printers and techniques borrowed from origami.
Just five months ago, MIT’s Tangible Media Group was showing off a physical interface that mimics you in real time.
Ford is teaming up with the brainy folks at MIT and Stanford University to work on self-driving cars
Ford is teaming up with the brainy folks at MIT and Stanford University to work on self-driving cars. MIT will focus on technology that anticipates movement by pedestrians and other vehicles, while Stanford will work on sensors that let autonomous vehicles see around obstacles. [Ford via PhysOrg]
Ford, MIT and Stanford band together to further the cause of automated driving research
In order to reach the automobile utopia of the future, Ford needs to lay the foundations of that dream today. That is why the car maker has chosen to join forces with MIT and Stanford University.
Ford has launched a new automated-driving research project with the Massachusetts Institute of Technology (MIT) and Stanford University, picking up where its automated Ford Fusion Hybrid research vehicle (unveiled just last month) left off. The combined teams will work on solutions to a number of technical challenges that have so far proven too difficult to solve where automated driving is concerned.
“Ford’s New Automated Driving Research Projects In The Pipeline,” original content from Ubergizmo.