Robot skin captures super detailed 3D surface images

Remember those awesome pin art toys where you could press your hand (or face) into the pins to leave a lasting impression? Researchers at MIT have taken the idea one (or two) steps further with “GelSight,” a hunk of synthetic rubber that creates a detailed computer-visualized image of whatever surface you press it against. It works like this: push the reflective side of the gummy against an object (they chose a chicken feather and a $20 bill) and the camera on the other end will capture a 3D image of the microscopic surface structure. Originally designed as robot “skin,” the tool, researchers realized, could be used in applications from criminal forensics (think bullets and fingerprints) to dermatology. The Coke can-sized machine is so sensitive, it can capture surface subtleties as small as one by two micrometers — finally solving the mystery of who stole the cookies from the cookie jar. (Hint: we know it was you, Velvet Sledgehammer.)

Robot skin captures super detailed 3D surface images originally appeared on Engadget on Wed, 10 Aug 2011 08:10:00 EDT. Please see our terms for use of feeds.

Via: Fast Company, MIT News | Source: GelSight

Stanford schooling unwashed masses with free online Intro to Artificial Intelligence (video)

If you fancy yourself a Stanford (wo)man, but lack the requisite dollars to actually attend, now’s your chance to collect those collegiate bragging rights. Starting October 10th, you can join Professor Sebastian Thrun and Google’s Director of Research, Peter Norvig, in a free, online version of the school’s Introduction to Artificial Intelligence course. The class covers “knowledge representation, inference, machine learning, planning and game playing, information retrieval, and computer vision and robotics,” and ambitiously aims to be the largest online AI course ever taught. If you’re feeling the ole red and white, you can register at the source link below, but if you’re looking for the official Stanford stamp of approval, we’re afraid you’re barking up the wrong tree — non-students will receive a certificate of completion from the instructors only. Still interested? Check out the video introduction after the break and hit the source for more details.

Stanford schooling unwashed masses with free online Intro to Artificial Intelligence (video) originally appeared on Engadget on Fri, 05 Aug 2011 21:47:00 EDT.

Via: Slashdot, IEEE Spectrum | Source: Stanford

Military lightning gun parts sold on eBay, probably built in someone’s garage

Lightning gun parts

We’re not sure where to start with this one. It’s, in a word, unbelievable. Technologist Cody Oliver was digging through eBay for parts to build a robot car that Elon Musk could drive around Burning Man when he came across surplus equipment from defense contractors Omnitech Robotics and Ionatron. The components were originally from the military’s Joint Improvised Explosive Device Neutralizers, or JINs — remote-controlled lightning guns designed to disable IEDs. But the story quickly goes from interesting to terrifying. Oliver soon discovered the weapons were cobbled together largely from off-the-shelf parts, including a Linksys router with the serial numbers scraped off, and lacked even basic security. The now-retired JINs were controlled over a standard 802.11 WiFi signal with the encryption turned off — leaving the multimillion-dollar devices vulnerable to insurgents. Ultimately, the parts were deemed unfit for even Musk’s RC art car. You can read all of the horrifying details at the source link.

[Thanks, Chris]

[Image credit: Cody Oliver]

Military lightning gun parts sold on eBay, probably built in someone’s garage originally appeared on Engadget on Fri, 05 Aug 2011 20:34:00 EDT.

Source: Wired

Qbo music player robot responds to hand gestures, challenges DJ Roomba to a dance-off (video)

What’s a good way to impress your friends? A robot boom box that responds to your every hand movement, that’s what. Meet Qbo, TheCorpora’s open-source Linux robot, who we’ve gotten to know over the years, even through his awkward phase. Nowadays, this full-grown cutie has stereoscopic “eyes” and a face-identifying system that’s capable of learning, recognizing faces, and responding. With his new hand-gesture recognition skills, Qbo will start playing music the moment you hold up a fist. Putting your hand out in a “halt” position stops the song, and pointing left or right jumps to different tracks in your playlist. Giving Qbo the peace sign increases the volume (yeah, seriously!), while pointing the peace sign down tells him to take it down a few notches. The ultimate party mate and wingman is even so kind as to announce each track’s title. The video after the break best explains what hanging with this fellow is like, but if you’re keen on textual explanations, just imagine yourself awkwardly doing the robot to control your stereo. Go on, we won’t look.
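The hand-gesture controls described above amount to a small dispatch table mapping recognized gestures to playback commands. A minimal Python sketch under that reading (the gesture labels and the `Player` class are hypothetical illustrations, not TheCorpora's actual API):

```python
# Hypothetical sketch of Qbo's gesture-to-playback mapping as described
# above. Gesture labels and the Player class are illustrative only.

class Player:
    def __init__(self, playlist):
        self.playlist = playlist
        self.index = 0          # current track
        self.playing = False
        self.volume = 5         # arbitrary 0-10 scale

    def play(self):
        self.playing = True

    def stop(self):
        self.playing = False

    def next_track(self):
        self.index = (self.index + 1) % len(self.playlist)

    def prev_track(self):
        self.index = (self.index - 1) % len(self.playlist)

    def volume_up(self):
        self.volume = min(10, self.volume + 1)

    def volume_down(self):
        self.volume = max(0, self.volume - 1)

# One entry per gesture the article mentions.
GESTURES = {
    "fist": Player.play,             # fist starts playback
    "halt": Player.stop,             # open "halt" palm stops the song
    "point_right": Player.next_track,
    "point_left": Player.prev_track,
    "peace_up": Player.volume_up,    # peace sign raises the volume
    "peace_down": Player.volume_down,
}

def handle_gesture(player, gesture):
    action = GESTURES.get(gesture)   # unrecognized gestures are ignored
    if action:
        action(player)
```

The point of the table is that adding a new gesture is one line, which fits an open-source robot meant to be extended by hobbyists.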

Qbo music player robot responds to hand gestures, challenges DJ Roomba to a dance-off (video) originally appeared on Engadget on Thu, 04 Aug 2011 06:29:00 EDT.

Via: PlasticPals | Source: TheCorpora Blog

Eliza is a doe-eyed, graceful dancing machine, lacks maniacal quality on the floor (video)

Eliza

Unlike the last batch of bots we’ve seen, Eliza is actually quite graceful. The cartoonish humanoid got its start as a guide, shuttling people around shopping malls and the Guangzhou Asian Games 2010 Experience Center. Now it’s finally getting a chance to show off what it’s got — namely some ill dance moves. These four doe-eyed machines spin, perform complicated arm choreography in perfect synchronization, and pause to pose during this epic number. Clearly, the next step is for someone to teach them how to Dougie. Check out the videos after the break.

[Thanks, Robotbling]

Eliza is a doe-eyed, graceful dancing machine, lacks maniacal quality on the floor (video) originally appeared on Engadget on Thu, 21 Jul 2011 13:07:00 EDT.

Source: PlasticPals

Researchers use graphene to draw energy from flowing water, self-powered micro-robots to follow?

What can’t graphene do? The wonder material has been at the heart of a stunning number of technological breakthroughs of late, and now it’s adding oil exploration to its long list of achievements. A team of researchers at Rensselaer Polytechnic Institute has discovered that the flow of good old H2O over a sheet of graphene can generate enough electricity to power “tiny sensors” used in tracking down oil deposits. The gang, led by professor Nikhil Koratkar, was able to suck 85 nanowatts of power out of a slab of graphene measuring 0.03 by 0.015 millimeters. The little sensors the researchers speak of are pumped into potential oil wells via a stream of water, and are then put to work sniffing out hydrocarbons indicative of hidden pockets of oil and natural gas. Of course, that doesn’t have a whole lot of practical application for your average gadget consumer, but Koratkar sees a future filled with tiny water-powered robots and micro-submarines — we can dig it.
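For scale, the quoted figures imply a respectable power density. A quick back-of-envelope check using only the numbers above (85 nanowatts from a 0.03 by 0.015 millimeter slab):

```python
# Back-of-envelope check on the figures quoted above:
# 85 nanowatts from a 0.03 mm x 0.015 mm sheet of graphene.

power_w = 85e-9                  # 85 nW expressed in watts
area_m2 = 0.03e-3 * 0.015e-3     # slab dimensions converted to meters

power_density = power_w / area_m2   # watts per square meter of graphene
print(round(power_density, 1))      # prints 188.9
```

Roughly 190 W per square meter of sheet, which is why scaling the slab down to sensor size still leaves enough juice for "tiny sensors."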

Researchers use graphene to draw energy from flowing water, self-powered micro-robots to follow? originally appeared on Engadget on Thu, 21 Jul 2011 10:53:00 EDT.

Via: Physorg | Source: Rensselaer Polytechnic Institute

Stanford’s Lightsaber-Wielding Robot Is Strong With the Force

What better way to combine your nerdy loves of computer programming and Star Wars than with a robot that can actually battle with a lightsaber?

This is “JediBot,” a Microsoft Kinect–controlled robot that can wield a foam sword (lightsaber, if you will) and duel a human combatant for command of the empire. Or something like that.

“We’ve all seen the Star Wars movies; they’re a lot of fun, and the sword fights are one of the most entertaining parts of it. So it seemed like it’d be cool to actually sword fight like that against a computerized opponent, like a Star Wars video game,” graduate student Ken Oslund says in the video above.

The world of dynamic robotics and AI has been immensely aided by the affordable, hackable Microsoft Kinect. The Kinect includes multiple cameras and infrared light sensors, which make recognizing, analyzing and interacting with a three-dimensional moving object — namely, a human — much simpler than in the past. Microsoft recently released the SDK for the Kinect, so we should be seeing increasingly useful and creative applications of the device. The KUKA robotic arm in the video above is traditionally used in assembly-line manufacturing, but you may remember it from a Microsoft HALO: Reach light sculpture video last year.

According to the course overview (.pdf) for the “Experimental Robotics” course, the purpose of the laboratory-based class is “to provide hands-on experience with robotic manipulation.” Although the other groups in the class used a PUMA 560 industrial manipulator, the JediBot design team, composed of four graduate students including Tim Jenkins and Ken Oslund, got to use a more recently developed KUKA robotic arm. This final project for the course, which they got to choose themselves, was completed in a mere three weeks.

“The class is really open-ended,” Jenkins said. “The professor likes to have dynamic projects that involve action.”

The group knew they wanted to do something with computer vision so a person could interact with their robot. Given the resources available, the group chose a Microsoft Kinect over a standard camera for the task. The Kinect was used to detect the position of the green sword-saber wielded by JediBot’s opponent.
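The article doesn't say how the green saber is actually located, but thresholding the Kinect's color image for green and reading depth at the matching pixels is a common approach. A minimal NumPy sketch under that assumption (the function, thresholds, and frame layout are illustrative, not the JediBot team's code):

```python
import numpy as np

# Hypothetical sketch: find a green foam saber in a Kinect-style frame by
# keeping "clearly green" pixels and averaging their image coordinates and
# depth. This is an assumed pipeline, not the actual JediBot code.

def find_saber(rgb, depth):
    """rgb: (H, W, 3) uint8 image; depth: (H, W) array in millimeters.
    Returns (row, col, depth_mm) of the green blob's centroid, or None."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    # A pixel counts as saber if green clearly dominates red and blue.
    mask = (g > 100) & (g > r + 40) & (g > b + 40)
    if mask.sum() < 20:          # too few pixels: no saber in view
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean(), depth[mask].mean()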

The robot strikes using a set of predefined attack motions. When it detects a hit (its foam lightsaber making contact with its opponent’s and putting torque on the robotic arm’s joints), it recoils and moves on to the next motion. It switches from move to move every one to two seconds.
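That attack cycle is essentially a small state machine: play a canned motion, watch joint torque for contact, recoil, and advance. A hypothetical sketch of the logic (the motion names and torque threshold are made up for illustration; nothing here is the team's code):

```python
# Hypothetical sketch of the attack cycle described above: cycle through
# canned attack motions, and when joint torque spikes (blade contact),
# recoil and advance. Motion names and thresholds are illustrative.

ATTACKS = ["overhead_chop", "side_slash", "thrust"]
TORQUE_LIMIT = 5.0   # torque spike that counts as a hit (made-up value)
MOVE_PERIOD = 1.5    # seconds per attack (the article says one to two)

class AttackCycle:
    def __init__(self):
        self.index = 0

    def current_attack(self):
        return ATTACKS[self.index]

    def update(self, joint_torque, elapsed):
        """Called each control cycle with the measured joint torque and
        the time spent on the current motion."""
        if joint_torque > TORQUE_LIMIT:   # blades collided: recoil first
            self.recoil()
            self.advance()
        elif elapsed > MOVE_PERIOD:       # no contact: try something else
            self.advance()

    def recoil(self):
        pass  # here a real controller would command the arm back

    def advance(self):
        self.index = (self.index + 1) % len(ATTACKS)
```

Reading the hit off joint torque means no extra sensor is needed on the sword itself; the arm's own feedback does the job.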

“The defense mechanics were the most challenging, but people ended up enjoying the attack mode most. It was actually kind of a gimmick and only took a few hours to code up,” Jenkins said.

The project utilized a secret weapon not apparent in the video: a special set of C/C++ libraries developed by Stanford visiting entrepreneur and researcher Torsten Kroeger. Normally, the robot would need to plot out the entire trajectory of its motions from start to finish — preplanned motion. Kroeger’s Reflexxes Motion Libraries let you make the robot react to events, like collisions and new data from the Kinect, by simply updating the target position and velocity; the libraries compute a new trajectory on the fly in less than a millisecond.
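The difference is the shape of the control loop: rather than scripting a whole trajectory up front, the controller updates the target every cycle and re-plans from the current state. A schematic Python sketch of that pattern (the real Reflexxes libraries are C/C++ with a much richer interface; `compute_step` here is a stand-in, not their API, and it only limits velocity, not acceleration or jerk):

```python
# Schematic sketch of the reactive pattern described above: every control
# cycle may bring a brand-new target, and motion is re-planned from the
# current state instead of replayed from a precomputed script.
# compute_step() is a stand-in for a real on-line trajectory generator.

def compute_step(position, target, max_vel=1.0, dt=0.001):
    """One millisecond control step toward target, velocity-limited.
    A real generator also bounds acceleration and jerk."""
    error = target - position
    max_step = max_vel * dt                    # farthest we may move in dt
    step = max(-max_step, min(max_step, error))
    return position + step

def track(position, targets, dt=0.001):
    """Feed in one (possibly changed) target per cycle, e.g. fresh
    Kinect data, and integrate the resulting motion."""
    for target in targets:
        position = compute_step(position, target, dt=dt)
    return position
```

Because each step starts from wherever the arm currently is, a mid-motion change of target (the opponent's saber moved) is handled the same way as the original command, which is what makes the robot feel reactive.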

This allows JediBot to respond to sensor events in real time, and that’s really the key to making robots more interactive.

Imagine a waiterbot with the reflexes to catch a falling drink before it hits the ground, or a karate robot you can spar against for practice before a big tournament.

I doubt anyone will be buying their own KUKA robotic arm and creating a sword-playing robot like JediBot at home, but innovations like this, from interactive controllers to the real-time physical responses the Reflexxes Motion Libraries make possible, could bring us robots that interact with us far better in daily life.

Video courtesy Stanford University/Steve Fyffe


Robot band covers Marilyn Manson, renders sullen teenagers obsolete (video)

Sure, we’ve seen robot bands before. But even when insecure and egotistical, they never quite capture the youthful disaffection we want from our mechanical pop stars. Until now. End of Life is a robot band consisting of a cello, an electric guitar, drums, and, for some reason, a flatbed scanner — maybe he’s the cute one? The group recently covered Marilyn Manson’s three-string anthem “The Beautiful People,” and it sounds almost exactly like you’d expect: we’ll call it “raw, visceral, and uncensored.” We can’t wait to see them sneer at Rock Band-playing robots too lazy to learn a real instrument. Catch them in the video after the break, and you can tell all your less-cool friends you knew them back before they sold out.

Robot band covers Marilyn Manson, renders sullen teenagers obsolete (video) originally appeared on Engadget on Mon, 18 Jul 2011 16:22:00 EDT.

Via: Technabob | Source: YouTube (bd594)

Android trash can robot begs the question: ‘Why are you hitting yourself?’ (video)

We’ve seen robots that look like they’ve had one too many, but we’re pretty sure this little guy needs to check into rehab. Despite its absolutely adorable appearance, this Android seems hell-bent on destruction, literally beating itself up, and eventually falling on its face. Built using the requisite Arduino, a trash can, some LEDs, and a slew of other components, this little guy was apparently created in three days on a budget just barely exceeding $100. You can see a video of the waste-bin bot hitting rock bottom at the source link below, but please refrain from laughing; Android alcoholism is a serious issue.

Android trash can robot begs the question: ‘Why are you hitting yourself?’ (video) originally appeared on Engadget on Sat, 16 Jul 2011 18:01:00 EDT.

Via: MIC Gadget | Source: Mobile01 (translated)

Plick hitches an elastic ride on the DIY robotics train (video)

Man, we hope Gumby’s collecting some royalty checks for this one. One part incredible-stretching toy and one part DIY robotics kit, the Plick project takes the traditional hobbyist approach to brick-building your own bot and slaps a little rubber all around it. The industrial design prototype from Brazilian engineer Gabriel Paciornik combines programmable robotic parts with an elastic wired connection suitable for strapping your mad-scientist creations to everyday objects. So, what can you make? The kit packs a variety of sensor-based circles that react to distance and sound, giving your mod-jobs the power of movement and light. It’s safe to say this not-for-market toy veers far from LEGO Mindstorms NXT territory — and that’s exactly the point. Catch the far-out video demo and its ’60s beach-music soundtrack after the break.

Plick hitches an elastic ride on the DIY robotics train (video) originally appeared on Engadget on Sat, 16 Jul 2011 05:14:00 EDT.

Via: Coolest Gadgets | Source: Plick Project