Pulito, the Lego Mindstorms swiffer-bot that seeks out electricity (video)

You could certainly buy a ready-made robot to sweep your hardwood floors, but doesn't building your own out of Lego bricks sound like loads more fun? That's what PlastiBots did with the Pulito pictured above, a Lego Mindstorms NXT sweeper with a host of sensors to navigate around furniture and a standard Swiffer pad to scrub. There are no fancy NorthStar or celestial navigation packages to keep the bot on track, so it meanders about much of the time, but there is an infrared beacon on the robot's charging dock to guide the creature home. When the Pulito runs low on juice after a long, tiring session of painstakingly traversing your floors, it's programmed to automatically seek out that invisible light and receive a loving 12-volt embrace from the station's brass charging bars. See it in action after the break, and hit our source link for more.
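For a rough sense of how that beacon-homing behavior might be structured, here's a minimal sketch in Python. Everything in it is hypothetical: the battery threshold and the robot object with its methods are invented stand-ins for illustration, not PlastiBots' actual NXT program. It assumes the robot can take two directional readings of the dock's IR intensity and steer toward the stronger one.

```python
import time

# A minimal sketch of the dock-seeking idea, NOT PlastiBots' actual NXT
# program. The robot object and its methods (battery_voltage, ir_intensity,
# drive, on_charger, wander_and_sweep, stop) are hypothetical stand-ins.
LOW_BATTERY_V = 6.5   # assumed cutoff; the dock itself supplies 12 V

def seek_dock(robot):
    """Steer toward whichever side sees the dock's IR beacon more strongly."""
    while not robot.on_charger():
        left, right = robot.ir_intensity()   # two directional IR readings
        if left > right:
            robot.drive(left_speed=0.2, right_speed=0.5)   # veer left
        elif right > left:
            robot.drive(left_speed=0.5, right_speed=0.2)   # veer right
        else:
            robot.drive(left_speed=0.5, right_speed=0.5)   # dead ahead
        time.sleep(0.05)
    robot.stop()

def main_loop(robot):
    while True:
        if robot.battery_voltage() < LOW_BATTERY_V:
            seek_dock(robot)          # go get that 12-volt embrace
        else:
            robot.wander_and_sweep()  # normal meandering sweep pass
```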

[Thanks, Dave]


Originally appeared on Engadget on Mon, 15 Nov 2010 09:16:00 EDT.

Source: PlastiBots

Robo-nurse gives gentle bed baths, keeps its laser eye on you (video)

When they’re not too busy building creepy little humanoids or lizard-like sand swimmers, researchers at the Georgia Institute of Technology like to concern themselves with helping make healthcare easier. To that end, they’ve constructed the Cody robot you see above, which has recently been demonstrated successfully wiping away “debris” from a human subject. The goal is simple enough to understand — aiding the elderly and infirm in keeping up their personal hygiene — but we’d still struggle to hand over responsibility for granny’s care to an autonomous machine equipped with a camera and laser in the place where a head might, or ought to, be. See Cody cleaning up its designer’s extremities after the break.


Originally appeared on Engadget on Thu, 11 Nov 2010 10:29:00 EDT.

Via: MIT Technology Review | Source: Georgia Tech

Adafruit Offers $1000 Bounty for Open-Source Kinect Drivers

Open-source hardware company Adafruit has declared open season on Microsoft’s Kinect, offering a $1000 bounty to anyone who can write and release open-source drivers for the camera.

Kinect, released today for Xbox 360, is expensive for a video game peripheral, but inexpensive considering its built-in hardware. It has an RGB camera, depth sensor, and multi-array microphone. But as we observed yesterday, it’s Kinect’s proprietary software that provides full-body 3D motion capture, facial recognition, and voice recognition capabilities.

“Imagine being able to use this off the shelf camera for Xbox for Mac, Linux, Win, embedded systems, robotics, etc.” Adafruit writes. “We know Microsoft isn’t developing this device for FIRST Robotics, but we could! Let’s reverse engineer this together, get the RGB and distance out of it and make cool stuff!”

The OK (Open Kinect) Project is Adafruit's first attempt at a contest of this kind. The first person or group to upload working Kinect code and examples to GitHub under an open-source license will be awarded $1000. The code can run on any operating system, but it must be open source. Adafruit even invites Microsoft to participate.

This isn't much like writing an open driver for a printer; it's more like jailbreaking the iPhone. The Kinect has its own processor, and the code powering it operates several different pieces of hardware and does a lot of preprocessing before sending the results to the console. The human-anatomy and facial-recognition software is especially tricky. But that doesn't mean it can't be done.
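Reverse engineering a USB device like this typically starts with mapping what it exposes on the bus. Here's a minimal sketch using pyusb that enumerates the camera's interfaces and endpoints, the first step toward figuring out where the RGB and depth streams come from. Microsoft's USB vendor ID is 0x045e; the product ID used below is an assumption you should verify against lsusb output on your own machine.

```python
import usb.core
import usb.util

# Vendor/product IDs: 0x045e is Microsoft; 0x02ae is assumed here to be
# the Kinect camera -- confirm with `lsusb` before relying on it.
dev = usb.core.find(idVendor=0x045E, idProduct=0x02AE)
if dev is None:
    raise SystemExit("Kinect camera not found on the USB bus")

# Walk the descriptor tree: mapping interfaces and endpoints is the
# groundwork for working out which endpoint carries which data stream.
for cfg in dev:
    for intf in cfg:
        print(f"interface {intf.bInterfaceNumber} (class {intf.bInterfaceClass:#x})")
        for ep in intf:
            d = usb.util.endpoint_direction(ep.bEndpointAddress)
            way = "IN" if d == usb.util.ENDPOINT_IN else "OUT"
            print(f"  endpoint {ep.bEndpointAddress:#04x} {way}, "
                  f"max packet {ep.wMaxPacketSize}")
```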

In an email, Adafruit's Phillip Torrone writes that the company "would like to see this camera used for education, robotics and fun outside the Xbox." That does sound like Microsoft's bag, and I'd bet many people at the company working in those fields have plans for the tech behind Kinect. Sadly, I doubt they'll be tripping over themselves to help hack the company's own camera.

Reporter Gives Robonaut Space Robot a Squeeze

Reporter and Robonaut 2

She called it a date, but as far as we can tell, the meeting between MSNBC reporter Stephanie Pappas and Robonaut 2, soon to be the first humanoid robot in space, was a bit of a one-sided affair.

A joint project between General Motors and NASA, Robonaut 2 is expected to help astronauts perform repairs and other maintenance on the International Space Station. The flight model, Robonaut 2B, will travel on the final flight of the Space Shuttle Discovery; originally scheduled for a November 1 launch, the blast-off has been delayed by fuel leaks until Tuesday of next week.

Pappas, who met the robot at NASA's Johnson Space Center, reports that the 330-pound automaton was a little intimidating and looked as if it might be "ready to throw a punch." It does look a tiny bit like a giant version of one of those punching puppets (our favorites were always the nun and ET). Though only a torso, Robonaut 2 can replicate human hand and arm movement and perform tasks such as drilling and painting. During Pappas' date, however, Robonaut didn't paint, throw a punch, speak, or even move. To be fair, Pappas' date was not the robot heading into space. The final model, Robonaut 2B, has new fireproof skin and a few space-ready parts. Plus, as Pappas notes, it doesn't have any smell. (Now you know the answer to the age-old question, "Do things still smell in space?")

As Pappas’ date neared its conclusion, the reporter did manage to make brief contact with the humanoid robot’s arm. She reports that it felt like a “cross between a memory-foam pillow and a well-muscled human arm.”

We’re taking bets on whether or not Robonaut will call Pappas, or at least text her.

Robot Bowler Still Can’t Best Bowling Pro

EARL the Bowling Robot

EARL is the perfect name for a robotic bowler, but even if we strip away the acronym artifice to reveal the technology's full name, Enhanced Automated Robotic Launcher, EARL's a pretty cool invention. According to a post on Coolest Gadgets, EARL is an expert bowling robot used by the Equipment Specifications and Certifications team of the United States Bowling Congress to test bowling gear.

I used to bowl a bit and always imagined that if I, like EARL, could throw the ball exactly the same way every time and consistently hit the sweet spot between the 1 and 2 (or 3) pins, I'd have a strike every time. I never bowled above a 175, but strangely, EARL, a seemingly perfect bowler (it's a computer, for heaven's sake), can't bowl perfectly either. Learn why (and see EARL compete) after the jump.
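Without spoiling the full explanation after the jump, the usual reason given is that the lane itself changes: oil migrates as balls roll through it, so an identical release meets slightly different friction every frame. Pros adjust; EARL repeats. The toy model below, with entirely invented numbers, shows how a fixed aim slowly drifts out of the strike pocket while a corrected aim keeps tracking it.

```python
import random

# Toy model with invented numbers: the lane's oil "carries down" a bit
# each frame, shifting where an identical throw ends up at the pins.
POCKET = 17.5   # target board for a strike (hypothetical units)

def entry_board(aim: float, drift: float) -> float:
    return aim + drift          # drifting oil displaces the ball's line

random.seed(1)
drift = 0.0
fixed_aim = POCKET              # EARL-style: never changes
adaptive_aim = POCKET           # pro-style: corrects after each frame

for frame in range(1, 11):
    drift += random.uniform(0.0, 0.4)   # oil migrates as the game goes on
    fixed_err = abs(entry_board(fixed_aim, drift) - POCKET)
    adapt_err = abs(entry_board(adaptive_aim, drift) - POCKET)
    adaptive_aim -= entry_board(adaptive_aim, drift) - POCKET   # re-aim
    print(f"frame {frame:2d}: fixed off by {fixed_err:.2f}, "
          f"adaptive off by {adapt_err:.2f}")
```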

Pleo Survives and Gets a Whole Lot Smarter

Pleo II

Pleo, the adorable robotic camosaur, has pulled off a feat that eluded even the mighty dinosaurs of prehistory: rising from the dead. Say hello to Pleo II: The Revenge! (Okay, I added the "The Revenge" part.)

After nearly going down with the bankrupt Ugobe Corp., Pleo was purchased by Innvo Labs late last year. During CES 2010, company COO Derek Dotson promised a new “plush” Pleo, but offered no timeline for delivery.

Now, he's opened up to Pleo fan site BobthePleo and spilled all the details about Pleo II. No, it's not plush. Instead, the rubber-skinned Pleo II will arrive in two gender-specific colors (pink for the girl and blue for the boy). That rather unimaginative innovation aside, Pleo's guts are getting a significant upgrade, too. Dotson promises four additional touch sensors and an RFID reader in its mouth to identify some of its new toys.
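A mouth-mounted RFID reader suggests a straightforward recognition scheme: each toy carries a tag, and the firmware maps tag IDs to behaviors. Here's a hypothetical sketch of that idea; the tag UIDs, toy names, and behavior names are all invented for illustration and are not details from Innvo Labs.

```python
# Hypothetical sketch of mouth-mounted RFID toy recognition. The tag
# UIDs, toy names, and behavior names below are invented.
TOY_TAGS = {
    "04:a2:7f:1c": "learning_stone",
    "04:9b:3e:88": "treat_leaf",
}

def on_tag_read(uid: str) -> str:
    """Return the behavior to trigger when a tag passes the mouth reader."""
    toy = TOY_TAGS.get(uid.lower())
    if toy is None:
        return "sniff_and_ignore"      # unrecognized object in mouth
    return f"react_to_{toy}"           # each toy gets its own animation

print(on_tag_read("04:A2:7F:1C"))      # -> react_to_learning_stone
```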

Flexible, implantable LEDs look set to start a new body modification craze

LED lights are cool, you're cool, why not combine the two, right? We doubt that's quite the reasoning that led to this international research project, but it's certainly an appealing way to look at it. Our old buddy John Rogers from the University of Illinois at Urbana-Champaign has headed up a research team with participants from the US, China, Korea, and Singapore, who have together produced and demonstrated a new flexible and implantable LED array. Bettering previous efforts at inserting lights under the human skin, this approach allows for stretching and twisting by as much as 75 percent, while the whole substrate is encased in thin silicone rubber, making it waterproof. Basically, it's a green light to subdermal illumination, which could aid such things as monitoring the healing of wounds, activating light-sensitive drug delivery, spectroscopy, and even robotics. By which we're guessing they mean our robot overlords will be able to color-code us more easily. Yeah, that must be it.

Originally appeared on Engadget on Mon, 18 Oct 2010 10:53:00 EDT.

Source: PhysOrg

Robots Learning How Not to Hurt Humans, By Punching Them

Epson Robot Punches Man's Arm

Face it: some day you, me, and everybody else will be working alongside robots. They're already in our factories and starting to arrive in our homes. Despite everyone's irrational fear of "our robot overlords," this is as it should be. The only problem is that robots today are not nearly as smart as we think they are, and a powerful manufacturing bot could, without meaning to, take your head off if you get in its way.

No, robots are not trying to harm us, but programming them to understand our emotions, needs, and reactions to, say, pain is pretty darn difficult. Over in Slovenia, scientists are seeking to overcome this android deficit by teaching robots how humans react to varying degrees of human-robot collision. To do so, researchers took a standard Epson manufacturing robot arm and programmed it to "punch" someone's arm as many as 18 times, with dull and then increasingly sharp instruments. Punchees were asked to rate the severity of their pain, from "painless" to "unbearable."

If our future is literally filled with robots, it's unlikely we'll be able to work alongside them without occasionally bumping into each other. As a report in New Scientist explains, the data will be used to program future robots to slow down when sensors indicate they're in the proximity of a human. No word on whether the scientists will also program robots that do bump into humans to say, "My bad."
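The control idea is simple to sketch: treat the pain data as a speed ceiling and ramp the arm down as a person gets close. The sketch below is illustrative only; the distances and speeds are invented placeholders, not values from the Slovenian study.

```python
# Illustrative only: thresholds invented, not taken from the study.
# The idea: cap arm speed as a human nears, so any accidental contact
# stays below the experimentally measured pain threshold.
MAX_SPEED = 1.0         # m/s, unrestricted operating speed
SAFE_SPEED = 0.05       # m/s, assumed to keep impacts in the "painless" band
SLOWDOWN_RADIUS = 1.5   # m, begin slowing inside this range

def speed_limit(human_distance_m: float) -> float:
    """Ramp the allowed arm speed down linearly as a human approaches."""
    if human_distance_m >= SLOWDOWN_RADIUS:
        return MAX_SPEED
    scaled = MAX_SPEED * (human_distance_m / SLOWDOWN_RADIUS)
    return max(SAFE_SPEED, scaled)

# e.g. speed_limit(2.0) -> 1.0; speed_limit(0.75) -> 0.5; speed_limit(0.0) -> 0.05
```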

Robots learning our pain threshold by punching humans and seeing if they cry

The first law of robotics is that a robot should not injure a human being or, through inaction, allow a human being to come to harm. But how does a robot know when its acts or omissions are causing nearby fleshies discomfort? The obvious way is to scan for the same signals of distress that we humans do: facial, physical, and aural. Another, more fun, way is to just hit people over and over again and ask them how much each blow hurt. That's what Professor Borut Povse over in Slovenia is doing, in a research project he describes as "impact emulation," where six test subjects are punched by a robotic arm until they can't take it anymore. It's funny, yes, but it's also a novel and somewhat ingenious way to collect data and produce more intelligent machines. Of course, whether we actually want more intelligent machines is another matter altogether.

[Thanks, Anthony]

Originally appeared on Engadget on Thu, 14 Oct 2010 06:43:00 EDT.

Via: News.com.au | Source: New Scientist

Ironman is Real and He's Almost Ready to Kick Some Butt

Ironman Exoskeleton

What is Marvel's Iron Man comic book hero but an average guy with a failing heart wearing a really awesome exoskeleton? It's not as implausible as, say, a "man who can fly" or a dude who turns into a giant green bodybuilder when he's angry. So why should we be surprised to hear that a real-world "Iron Man" exoskeleton may be coming to a shipping yard or military operation near you?

According to a report in the Salt Lake Tribune, Massachusetts-based defense contractor Raytheon-Sarcos has developed a 195-pound full-body suit, the XOS-2, that can make a 200-pound weight feel like 12 pounds and give the wearer the ability to punch through a roughly 6-inch-thick wood wall.
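Taken at face value, those figures pin down the suit's force amplification. Here's the back-of-the-envelope arithmetic, using only the numbers quoted above:

```python
# Back-of-the-envelope arithmetic from the quoted figures only.
load_lb = 200.0   # actual weight being lifted
felt_lb = 12.0    # what the wearer reportedly feels

amplification = load_lb / felt_lb       # about 16.7x force amplification
suit_share = 1.0 - felt_lb / load_lb    # suit supplies about 94% of the effort
print(f"amplification: {amplification:.1f}x; "
      f"suit carries {suit_share:.0%} of the load")
```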

Despite the millions already spent developing the suit, Raytheon's body suit looks nowhere near as impressive as the armor worn by Robert Downey Jr. in the Iron Man movies; nor can it travel great distances without an external hydraulic power pack. That will all change eventually: according to Raytheon executives, tethered industrial use could come as early as three years from now, while fully mobile, combat-ready use is at least five years away.

Even so, who doesn’t want to try one of these on today?