MIT’s $500 Kinect-like camera works in snow, rain, gloom of night

Remember that camera that takes 1 trillion exposures per second? Well, the illustrious folks at MIT have outdone themselves (again) by developing a camera that accomplishes all that and more, for just $500. Similar to the recently released Xbox One Kinect, this three-dimensional “nano-camera” is based on time-of-flight technology: an object’s position is calculated by measuring the time it takes light to reflect off its surface and return to the sensor. But, thanks to some fancy math and a new encoding method, the nano-cam can also capture translucent and moving objects in 3D. In the past, the results of the process (which has been dubbed “nanophotography”) could only be achieved with a $500,000 “femto-camera.” With such a dramatically lower price tag, it could be a solution to one of the many hurdles facing self-driving vehicles: the ability to tell the difference between a puddle and a cat in the pouring rain. And even though it functions like a Kinect, don’t expect it to be standard issue with an Xbox Two (or One II, or whatever Microsoft decides to call it).
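
For a sense of the core calculation, here’s a minimal time-of-flight sketch: the sensor times a light pulse’s round trip and converts it to distance. The helper name and example value are illustrative assumptions, not details of MIT’s actual system.

```python
# Minimal time-of-flight sketch: distance from the round-trip time of a light pulse.
# Illustrative only -- the helper name and example value are assumptions,
# not MIT's implementation.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Light travels to the object and back, so halve the total path length."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after 10 nanoseconds puts the object roughly 1.5 m away.
print(distance_from_round_trip(10e-9))  # ~1.499 m
```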


Source: MIT News

Whoa, This Handheld Router Only Cuts Where Needed To Reveal 3D Models

Researchers at MIT have developed a handheld milling machine that turns anyone into a skilled sculptor. As with a 3D printer, users start with a 3D model on a computer, but instead of a machine laying down layers of plastic, the handheld mill removes only what’s needed from a solid block of material to eventually reveal a fully formed 3D object. This could basically turn anyone into a Michelangelo, once we’re all able to buy one.

Alt-week 16.11.13: Need another Earth-like planet? Study says there could be plenty

Alt-week takes a look at the best science and alternative tech stories from the last seven days.


Suddenly things just got real. A new study claims one in five sun-like stars could have a planet capable of supporting life. Hugging your loved ones from thousands of miles away is closer to reality, and smog? Apparently we can vacuum that stuff up now. Yeah, this is Alt-week.



inFORM 3D Shapeshifting Surface

We humans rely on touch a great deal; it’s one of our most primal senses, offering a tangible way to interact with and gather information about our surroundings. Over the past few years, digital touchscreens have made their way into everyday devices, letting us use touch as a form of input in a range of ways, although that remains rather limited when it comes to actual interaction with our environment. To restore some semblance of the tactile, MIT Media Lab researchers have come up with a 3D, shapeshifting surface that they call inFORM.

inFORM lets users perform everyday physical interactions with digital matter, now how about that? There is plenty of potential in this unique surface, from holding the hand of a person thousands of miles away to bringing a 3D model to life. It resembles the bed of pins recently seen in The Wolverine, as well as the Kryptonian technology that Russell Crowe’s character made liberal use of in Man of Steel. inFORM relies on a connected pinscreen, a hacked Microsoft Kinect and a laptop to get the job done, letting you alter and adjust the model with your bare hands. [inFORM Project Page]


    inFORM Dynamic Shape Display: Display See, Display Do

    The touchscreen and app combo of today’s mobile devices makes one gadget act as many. Different apps display different interfaces, and the touchscreen lets you interact with those interfaces in a natural manner. But what if, aside from changing what you can see, your gadget’s display could also change its shape? That’s what MIT’s Tangible Media Group wants to realize.

    [Image: inFORM dynamic shape display by MIT Tangible Media Group]

    What you’re looking at is the Tangible Media Group’s inFORM. It’s made of a Kinect, a projector, a computer, pins, linkages and actuators. inFORM can mimic the shape and motion of 3D objects in real time. For example, a monitor can show a two-dimensional replica of your arm, but with inFORM you can have a tangible, 3D replica of it. And since it’s 3D, you can use that replica arm to carry or move objects just by moving your own arm.
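
    To make that pipeline a bit more concrete, here’s a rough sketch of the kind of processing such a system could perform: downsample a Kinect-style depth frame into a small grid of pin heights. The grid size, depth range and function names are assumptions made for illustration, not details of the actual inFORM software.

    ```python
    # Hypothetical sketch: turn a Kinect-style depth frame into per-pin heights
    # for a small pin display. Grid size, depth range and pin travel are
    # made-up values, not the real inFORM specifications.
    import numpy as np

    PIN_GRID = (30, 30)                    # assumed number of pins per side
    DEPTH_NEAR, DEPTH_FAR = 500.0, 1500.0  # assumed working range in millimeters
    MAX_PIN_HEIGHT_MM = 100.0              # assumed pin travel

    def depth_to_pin_heights(depth_mm: np.ndarray) -> np.ndarray:
        """Block-average the depth frame, then map nearer surfaces to taller pins."""
        gh, gw = PIN_GRID
        h, w = depth_mm.shape
        # Crop so the frame divides evenly into the pin grid, then average each block.
        cropped = depth_mm[: (h // gh) * gh, : (w // gw) * gw]
        blocks = cropped.reshape(gh, h // gh, gw, w // gw).mean(axis=(1, 3))
        # Surfaces at DEPTH_NEAR raise pins fully; anything past DEPTH_FAR stays flat.
        norm = np.clip((DEPTH_FAR - blocks) / (DEPTH_FAR - DEPTH_NEAR), 0.0, 1.0)
        return norm * MAX_PIN_HEIGHT_MM

    # Example with a fake 480x640 depth frame (values in millimeters).
    frame = np.full((480, 640), DEPTH_FAR)
    frame[200:280, 300:380] = 700.0        # a hand-sized object 0.7 m from the sensor
    heights = depth_to_pin_heights(frame)  # 30x30 array of pin heights in mm
    ```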


    Equally important is inFORM’s ability to act as a collection of 3D pixels, a way of giving physical manifestation to digital information. For instance, it can make actual 3D charts, give you a tangible version of a 3D model in an instant and present even more intuitive user interfaces.

    inFORM is a step towards the Tangible Media Group’s dream that it calls Radical Atoms, a “hypothetical generation of materials that can change form and appearance dynamically, becoming as reconfigurable as pixels on a screen… so that dynamic changes of physical form can be reflected in digital states in real time, and vice versa.”

    Imagine watching horror films on a Radical Atom TV.  Imagine “holding” your loved ones as you chat with them on Radical Atom walls and floors. Imagine controlling a giant mech made of Radical Atoms. Imagine visualizing mind-boggling equations and predictions on a Radical Atom spreadsheet. Imagine having a physical keyboard or game buttons on your Radical Atom mobile device. Aww yiss.

    [MIT Tangible Media Group via Colossal]

    MIT’s inFORM UI remotely manipulates physical reality via Kinect

    MIT Media Lab’s Tangible Media Group has invented a tangible interface that acts like a remote pinscreen. That is, it transfers gestures captured by a hacked Kinect to a platform of motorized pins, thus translating motion to physical reality. The device is called inFORM, and the researchers involved in the project want it to inform […]

    MIT’s “Kinect of the Future” Can Track You Through Walls

    The ability to passively track people within a given space is every retailer’s dream (and every conspiracy theorist’s nightmare). Those dreams recently took a step closer to reality with the debut of a new people-tracking system from MIT.


    MIT M-Blocks Are Self Assembling Robots

    Avengers, Assemble! That could very well be the rallying cry of MIT’s M-Blocks, a new class of robotic cubes that are aware of one another and fully capable of assembling themselves. It’s the premise explored by the T-1000 in Terminator 2, except that cyborg from the future was made of liquid metal, and by the Transformers, a race of humanoid robots that can change into machines, vehicles especially. Just how far along is humankind in achieving such wonders? MIT’s John Romanishin, Daniela Rus and Kyle Gilpin may have taken a wee step closer with the M-Blocks self-assembling robotic cubes.

    These simple yet independent modules can separate and recombine at will, giving you the freedom to design a robot with flexible functionality. The M-Blocks’ movement is entirely self-contained, meaning there are no external moving parts. This is made possible by a 20,000 RPM flywheel that imparts angular momentum to each cube, letting the blocks scoot across the floor, roll over one another and even leap around. The result is a system that can join together into one shape, then break apart and reassemble into a different shape altogether. Pretty neat, no?
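
    As a back-of-the-envelope illustration of that flywheel trick, the sketch below estimates whether suddenly braking a small flywheel transfers enough angular momentum for a cube to pivot over one of its edges. Apart from the 20,000 RPM figure, every number is an assumed placeholder rather than a published M-Blocks specification.

    ```python
    # Rough sketch of the momentum-transfer idea behind M-Blocks.
    # All masses and dimensions below are assumed placeholders; only the
    # 20,000 RPM flywheel speed comes from the article.
    import math

    RPM = 20_000.0
    omega_flywheel = RPM * 2 * math.pi / 60  # flywheel speed in rad/s

    # Assumed flywheel: a 50 g disc with a 2 cm radius.
    I_flywheel = 0.5 * 0.05 * 0.02**2        # kg*m^2
    L = I_flywheel * omega_flywheel          # angular momentum dumped when the brake grabs

    # Assumed cube: 140 g, 5 cm sides, pivoting about one edge.
    m, a, g = 0.14, 0.05, 9.81
    I_edge = (2.0 / 3.0) * m * a**2          # moment of inertia of a cube about an edge

    omega_cube = L / I_edge                  # cube's spin right after the transfer
    kinetic = 0.5 * I_edge * omega_cube**2   # rotational energy available
    lift = m * g * (a / math.sqrt(2) - a / 2)  # energy to raise the center of mass over the edge

    print(f"kinetic energy after braking: {kinetic:.3f} J, energy to tip over: {lift:.4f} J")
    # With these placeholder numbers the kinetic energy comfortably exceeds the
    # tipping energy, suggesting even a partial braking impulse could pivot a cube.
    ```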


    Cuttable, Foldable Sensors Can Add Multi-Touch To Any Device


    Researchers at the MIT Media Lab and the Max Planck Institutes have created a foldable, cuttable multi-touch sensor that works no matter how you cut it, allowing multi-touch input on nearly any surface.

    In traditional sensors, the connectors are laid out in a grid, and when one part of the grid is damaged you lose sensitivity across a wide swathe of other sensing points. This system lays the sensors out like a star, which means a cut only affects the parts of the sensor farther down that branch. For example, you can cut the corners off a square and the sensor still works, or even cut all the way down to the main, central connector array and, as long as there are still sensing points on the surface, it will pick up input.
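
    That connectivity argument can be sketched as a simple tree model: each sensing pad connects back to the central hub along one branch, so a cut only disconnects the pads downstream of it. The toy layout and names below are illustrative assumptions, not the researchers’ actual circuit topology.

    ```python
    # Toy model of the star/tree wiring idea: cutting a connection only kills the
    # pads downstream of the cut. Purely illustrative, not the real sensor layout.

    # parent -> children; "hub" stands in for the central connector array.
    wiring = {
        "hub": ["branch_a", "branch_b"],
        "branch_a": ["pad_a1", "pad_a2"],
        "branch_b": ["pad_b1", "pad_b2", "pad_b3"],
    }

    def working_pads(wiring, cut_links):
        """Return every pad still reachable from the hub after removing cut links."""
        reachable, stack = set(), ["hub"]
        while stack:
            node = stack.pop()
            reachable.add(node)
            for child in wiring.get(node, []):
                if (node, child) not in cut_links:
                    stack.append(child)
        return {n for n in reachable if n.startswith("pad")}

    # Snipping off branch_a (say, cutting away a corner) leaves branch_b's pads working.
    print(working_pads(wiring, {("hub", "branch_a")}))  # pad_b1, pad_b2, pad_b3
    ```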

    The team that created it, Simon Olberding, Nan-Wei Gong, John Tiab, Joseph A. Paradiso, and Jürgen Steimle, write:

    This very direct manipulation allows the end-user to easily make real-world objects and surfaces touch interactive, to augment physical prototypes and to enhance paper craft. We contribute a set of technical principles for the design of printable circuitry that makes the sensor more robust against cuts, damages and removed areas. This includes novel physical topologies and printed forward error correction.

    You can read the research paper here, but this looks to be very useful in the DIY hacker space, as well as for flexible, wearable projects that require some sort of multi-touch input. While I can’t imagine we need shirts made of this stuff, I could see a sleeve with lots of inputs or, say, a watch with a multi-touch band.

    Don’t expect this to hit the next iWatch any time soon; it’s still very much in the prototype stage, but it definitely looks quite cool.



    Today In Dystopian War Robots That Will Harvest Us For Our Organs


    Welcome to our continuing series featuring videos of robots that will, when they become autonomous, hunt us down and force us to work in the graphene factories of Mars. Below we see Wild Cat, a fully untethered, remote-controlled quadrupedal robot made by Boston Dynamics, creators of the famous Big Dog. This quadruped can run up to 16 miles an hour and features a scary-sounding internal gas engine that can power it across rough terrain. Wild Cat was funded by DARPA’s M3 program, which aims to introduce flexible, usable robots into natural environments, AKA to supply robotic pack animals for ground troops and build flocking, heavily armed robots that can wipe out a battlefield without putting humans in jeopardy.

    Next up we have ATLAS, another Boston Dynamics bot, this one able to walk upright over rocks. Sadly, ATLAS is tethered to a power source, but he has perfect balance and can survive side and front hits from heavy weights, a plus if you’re built to be the shock troops of a new droid army. ATLAS can even balance on one foot while being smacked with wrecking balls, something the average human can’t do without suffering internal damage. I can’t wait for him to be able to throw cinder blocks!

    Finally, we present these charming self-assembling robots from MIT’s Computer Science and Artificial Intelligence Laboratory, which we covered earlier today. The robots spin up an internal flywheel and then connect with each other using magnets, allowing a cube to fly into the air for a moment and land next to its brothers and sisters in exactly the right spot. This lets these otherwise featureless cubes form any shape they want and, like autonomous LEGO bricks, build complex structures out of a few simple shapes.

    “There’s a point in time when the cube is essentially flying through the air,” said researcher Kyle Gilpin. “And you are depending on the magnets to bring it into alignment when it lands. That’s something that’s totally unique to this system.”

    They may look innocuous, but imagine these things self-assembling into, say, a wall, a door, or even a plate of explosives. They could sneak through pipes into your home and form a robotic assassin to destroy you in your sleep, thereby freeing up your sleeping spot for other humans, who have been reduced to sleeping outdoors after the robots took over most habitable locations for the storage of fermenting human slurry. Stay frosty, humans!