Athlete Robot runs just a few steps before falling down, does it with style

Since 2007, researchers at Tokyo University’s ISI Lab have been working on a prototype running robot, one of several we’ve seen in the past. Athlete Robot (as it’s apparently dubbed) is a little different, though. While it hasn’t outwardly been given the humanoid treatment in any significant way, technologically it works very much like a human. As you’ll see in the video below, early prototypes of the bot, which were less human-like in design, didn’t function as well as the newer version, which boasts McKibben artificial muscles and a biologically correct musculoskeletal system. The robot still can’t run very far without falling over, but its movements are impressive to watch nonetheless.


Athlete Robot runs just a few steps before falling down, does it with style originally appeared on Engadget on Wed, 15 Dec 2010 21:58:00 EDT. Please see our terms for use of feeds.

Via: PlasticPals | Source: YouTube

Athlete Robot Ready to Run As Humans Do


Robots are among our most polarizing technological innovations. Some of us love and openly embrace bots, while others live in near constant fear of an android coup. When I hear, “robot learns to run like humans,” I imagine robot races and bipedal bots bounding over hills to help save us. The fearful, however, see their worst fears realized: “Now robots can actually chase and catch us.”

Robot researchers like Ryuma Niiyama (currently working in MIT’s Robot Locomotion Group) couldn’t care less about your fears. According to a report in IEEE Spectrum, Niiyama is building a biped robot called “Athlete” that uses artificial muscles and prosthetic feet to run at speeds and in a style more akin to human locomotion. Previous humanoid robots like the Honda Asimo use a complex array of motors, sensors, and actuators to walk and even, in the case of Asimo, “run.” However, anyone who has seen Asimo dash around a stage knows that the bot’s motion doesn’t look entirely natural.

Niiyama’s robot mimics some aspects of human running to achieve a more natural gait. The robot’s artificial muscles reside entirely above the “knees”; below that, it’s all prosthetic elastic blades of the kind some double amputees use for running. As a result, the robot springs forward with each step, as humans do, and uses its muscles and sensors to maintain balance as it races forward; again, pretty much as people do when they’re running.
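A back-of-envelope way to see why the elastic blades matter: a spring stores the energy of each footfall and returns most of it on push-off, so the muscles above the knee only have to make up the losses. The sketch below is purely illustrative; the numbers and the 90% return efficiency are assumptions, not measurements from Niiyama’s robot.

```python
def energy_returned(mass_kg, drop_m, efficiency=0.9, g=9.81):
    """Joules a blade returns from one footfall, at a given return efficiency.

    The blade absorbs roughly the potential energy of the body dropping
    onto it (m * g * h) and gives back a fraction of it on push-off.
    """
    stored = mass_kg * g * drop_m   # energy absorbed at impact, joules
    return efficiency * stored
```

With a perfect spring (efficiency 1.0), a 10 kg hopper dropping 5 cm gets all 4.9 J back; the actuators only need to replace whatever the real blade dissipates.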

So far, Niiyama and his team have only been partially successful. Athlete runs a few unaided steps but then falls over. Watch the video below, which charts Athlete’s development from an early 2007 model to today’s elastic-blade-fitted Athlete.


Wheeme massage robot asks where it hurts to drive away the pain

Former Israeli electronics and defense engineers wouldn’t be the first group of people we’d peg to leap into the robot massager biz, but that’s exactly who’s behind the Wheeme from DreamBots inc. According to the firm’s about page, the Wheeme was developed to meet “the increasing demand for smart products that offer the natural feeling of caressing, relaxation, falling asleep and even just tickling.” True to those goals, the device works by moving slowly across a person’s body to provide a gentle massage using its soft silicone rubber “fingerettes” (a.k.a. wheels). Special tilt sensor technology ensures it won’t fall off or lose grip while motoring either — making the Wheeme a master at its trade — at least for customers lying down. Officially this rover will start shipping in the spring of 2011, but pre-ordering the device, which costs $49 plus shipping, will guarantee you don’t miss out on any of the drive-by goodness. To view the Wheeme going to work while narrowly avoiding crashes, check out the embedded video after the break.


Wheeme massage robot asks where it hurts to drive away the pain originally appeared on Engadget on Fri, 03 Dec 2010 02:35:00 EDT.

Via: Spectrum.ieee.org, Engadget German | Source: Dreambots.com

Neato XV-11 robot vacuum gets its very own open source LIDAR hack

There’s nothing like a little bounty to light a fire under a group of open source fanatics, is there? We saw this principle applied recently when Adafruit offered up cold, hard cash for an Open Source Kinect driver, and now one enterprising reader over at robotbox.net has gone and hacked the LIDAR unit on a Neato XV-11 robot vacuum — and won $401 for the effort. What’s this mean to you? Well, the gentleman (who goes by the nom de hack Hash79) can now read data sent from the optical ranging hardware on the vacuum to a PC. There has been a pretty enthusiastic group of hackers surrounding the device for a while now, and with a little hard work (and a $399 autonomous robot vacuum) you too can have a 360-degree scanning LIDAR with one-degree accuracy and a 10Hz refresh rate. Pretty sweet, right? Video after the break.
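To give a flavor of what “reading data from the optical ranging hardware” involves, here is a hypothetical decoder sketch. The 22-byte frame layout (0xFA sync byte, frame index, rotation speed, four distance readings, checksum) follows what community hackers later documented for the XV-11’s serial stream; treat every offset and bit flag here as an assumption rather than an official spec.

```python
def parse_packet(pkt: bytes):
    """Decode one assumed 22-byte LIDAR frame into (angle_deg, distance_mm) pairs.

    Frames carry 4 consecutive one-degree readings; 90 frames cover a
    full 360-degree revolution. Checksum verification is omitted here.
    """
    assert len(pkt) == 22 and pkt[0] == 0xFA, "not a valid frame"
    index = pkt[1] - 0xA0              # frame index 0..89
    base_angle = index * 4             # 4 readings per frame
    readings = []
    for i in range(4):
        o = 4 + i * 4                  # each reading occupies 4 bytes
        dist = pkt[o] | ((pkt[o + 1] & 0x3F) << 8)   # 14-bit distance in mm
        invalid = bool(pkt[o + 1] & 0x80)            # error flag from sensor
        if not invalid:
            readings.append((base_angle + i, dist))
    return readings
```

Loop that over the serial port at 10 revolutions per second and you have the raw polar scan Hash79 demonstrated.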


Neato XV-11 robot vacuum gets its very own open source LIDAR hack originally appeared on Engadget on Mon, 29 Nov 2010 18:34:00 EDT.

Source: robotbox.net

60 year-old remote-controlled robot made from scrap parts makes a dramatic, beautiful comeback

This is George. He’s a six-foot-tall robot handmade in 1950 from the aluminum scraps of a crashed bomber. George is remote controlled and was built by Tony Sale, the same man who recently resurrected the nearly forgotten robotic darling from the storage shed where he’d spent the last 45 years or so. Some oil and batteries were all it took to get George up and walking again, and he’ll now have a permanent home at the National Museum of Computing at Bletchley Park, Buckinghamshire, England. And that’s the next museum we’ll be visiting, because we cannot get enough of this giant. Tear-inducing video is after the break.

[Image Credit: Geoff Robinson, Daily Mail]


60 year-old remote-controlled robot made from scrap parts makes a dramatic, beautiful comeback originally appeared on Engadget on Tue, 23 Nov 2010 21:55:00 EDT.

Via: Switched | Source: Robots

Control a 3-D–Mapping Robot With Gestures? Just Add Kinect

Philipp Robbel, a student at MIT’s Personal Robotics Group, has used a hacked Xbox Kinect camera and an iRobot Create kit to make a Roomba-esque KinectBot that can recognize human beings and respond to their gestural commands.

In an interview with SingularityHub, Robbel discussed how KinectBot grew out of his research in robots that could locate trapped or missing people in a disaster. The Kinect’s ability to map terrain in 3-D and to recognize and respond to human gestures could eventually be teamed up with aerial drones and rapid-response teams to launch rescue operations.

This video shows how KinectBot was assembled and what it can do.

Bear in mind, this is just what Robbel calls a “weekend hacking project.” Imagine what Microsoft’s Robotics team — who’ve had a lot longer to play with the tech behind Kinect than the rest of us — might be cooking up in their labs.

Still, a $150 off-the-shelf sensor like Kinect opens up the option box for everybody. Add the right mix of boops and beeps, a computer-hacking interface, jet packs and the ability to serve drinks and fix starships, and we’re just a few iterations away from a full-fledged R2-D2 unit. We’re living in the future.


Kinect sensor bolted to an iRobot Create, starts looking for trouble

While there have already been a lot of great proofs of concept for the Kinect, what we’re really excited for are the actual applications that will come from it. On the top of our list? Robots. The Personal Robots Group at MIT has put a battery-powered Kinect sensor on top of the iRobot Create platform, and is beaming the camera and depth sensor data to a remote computer for processing into a 3D map — which in turn can be used for navigation by the bot. They’re also using the data for human recognition, which allows for controlling the bot using natural gestures. Looking to do something similar with your own robot? Well, the ROS folks have a Kinect driver in the works that will presumably allow you to feed all that great Kinect data into ROS’s already impressive libraries for machine vision. Tie in the Kinect’s multi-array microphones, accelerometer, and tilt motor and you’ve got a highly aware, semi-anthropomorphic “three-eyed” robot just waiting to happen. We hope it will be friends with us. Video of the ROS experimentation is after the break.
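The mapping step described above is easy to sketch in miniature: take one horizontal row of depth readings, project each reading into robot-centric x/y coordinates, and mark the corresponding cells of a 2D occupancy grid. The 57-degree horizontal field of view matches the Kinect’s published spec; the grid size and cell resolution here are arbitrary choices for illustration, not anything from MIT’s setup.

```python
import math

FOV_DEG = 57.0   # Kinect horizontal field of view
CELL_M = 0.05    # 5 cm grid cells (illustrative)
GRID = 100       # 100x100 cells, robot at the grid center

def depth_row_to_grid(depths_m):
    """Mark grid cells hit by each reading in one horizontal depth scan."""
    grid = [[0] * GRID for _ in range(GRID)]
    n = len(depths_m)
    for i, d in enumerate(depths_m):
        if d <= 0:                  # zero means no return from the sensor
            continue
        # spread the readings evenly across the field of view
        angle = math.radians(-FOV_DEG / 2 + FOV_DEG * i / (n - 1))
        x = d * math.sin(angle)     # lateral offset, meters
        y = d * math.cos(angle)     # forward distance, meters
        col = int(GRID / 2 + x / CELL_M)
        row = int(GRID / 2 + y / CELL_M)
        if 0 <= row < GRID and 0 <= col < GRID:
            grid[row][col] = 1      # mark the cell as occupied
    return grid
```

A real system would fuse many such scans over time and track the robot’s pose, but even this single-scan version shows why a cheap depth camera is such a shortcut for navigation.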


Kinect sensor bolted to an iRobot Create, starts looking for trouble originally appeared on Engadget on Wed, 17 Nov 2010 21:38:00 EDT.

Source: squadbot (YouTube), ROS.org

Pulito, the Lego Mindstorms swiffer-bot that seeks out electricity (video)

You could certainly buy a ready-made robot to sweep your hardwood floors, but doesn’t building your own out of Lego bricks sound like loads more fun? That’s what PlastiBots did with the Pulito pictured above, a Lego Mindstorms NXT sweeper with a host of sensors to navigate around furniture and a standard Swiffer pad to scrub. There’s no fancy NorthStar or Celestial navigation package to keep the bot on track, so it meanders about much of the time, but there is a fancy infrared beacon on the robot’s charging dock to guide the creature home. When the Pulito’s running out of juice from a long, tiring session of painstakingly traversing your floors, it’s programmed to automatically seek out that invisible light and receive a loving 12 volt embrace from the station’s brass charging bars. See it in action after the break, and hit our source link for more.

[Thanks, Dave]
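The beacon-homing behavior is a classic differential-drive trick, and it can be sketched in a few lines. This is not PlastiBots’ actual NXT code; the sensor scale, gain, and docking threshold below are all made-up values chosen to illustrate the idea: steer toward whichever side sees the infrared beacon more strongly, and stop once both sides read a strong signal.

```python
def homing_step(left_ir, right_ir, base_speed=50, gain=0.5, dock_level=90):
    """Return (left_motor, right_motor) power from two IR readings (0-100).

    A positive left/right difference steers the bot toward the beacon;
    two saturated readings mean we're on the charging contacts, so stop.
    """
    if min(left_ir, right_ir) >= dock_level:
        return (0, 0)                      # docked: cut the motors
    error = right_ir - left_ir             # positive: beacon is to the right
    turn = gain * error
    # speed up the outer wheel and slow the inner one to arc toward the light
    return (base_speed + turn, base_speed - turn)
```

Calling this in a loop with fresh sensor readings produces exactly the wander-then-converge docking run you can see in the video.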


Pulito, the Lego Mindstorms swiffer-bot that seeks out electricity (video) originally appeared on Engadget on Mon, 15 Nov 2010 09:16:00 EDT.

Source: PlastiBots

Robo-nurse gives gentle bed baths, keeps its laser eye on you (video)

When they’re not too busy building creepy little humanoids or lizard-like sand swimmers, researchers at the Georgia Institute of Technology like to concern themselves with helping make healthcare easier. To that end, they’ve constructed the Cody robot you see above, which has recently been demonstrated successfully wiping away “debris” from a human subject. The goal is simple enough to understand — aiding the elderly and infirm in keeping up their personal hygiene — but we’d still struggle to hand over responsibility for granny’s care to an autonomous machine equipped with a camera and laser in the place where a head might, or ought to, be. See Cody cleaning up its designer’s extremities after the break.


Robo-nurse gives gentle bed baths, keeps its laser eye on you (video) originally appeared on Engadget on Thu, 11 Nov 2010 10:29:00 EDT.

Via: MIT Technology Review | Source: Georgia Tech

Choreographing a humanoid robot’s dance routine is as easy as click and pull

You may not be able to build an HRP-4C fembot in your average garage, but the programming would practically take care of itself — not only does the AIST humanoid sing using off-the-shelf Yamaha Vocaloid software, its dance moves are click-and-drag, too. Roboticist Dr. Kazuhito Yokoi gave IEEE Spectrum an inside look at the HRP-4C’s motion trajectory software, which works much like 3D animation tools: you position the limbs where you want them to start and where you want them to end up using keyframes, and the software takes care of the rest. The system’s intelligent enough to generate a 6.7-second sequence from just eight keyframes, and it compensates for hazardous instructions, too — if your haphazard choreography would tip her over or send limbs flying, it’ll automatically adjust her moves. See how it works in a video after the break and hit up our source link for the full interview.
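The “software takes care of the rest” part is keyframe interpolation, the same idea 3D animation tools use. Here is a minimal sketch for a single joint: the animator supplies a few (time, angle) keyframes and the software fills in every intermediate pose. AIST’s actual planner also enforces balance constraints and smooth velocities; this hypothetical version does plain linear interpolation only.

```python
def interpolate(keyframes, t):
    """Linearly interpolate a joint angle at time t from (time, angle) pairs."""
    keyframes = sorted(keyframes)          # order keyframes by time
    if t <= keyframes[0][0]:
        return keyframes[0][1]             # before the first keyframe: hold
    for (t0, a0), (t1, a1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            frac = (t - t0) / (t1 - t0)    # fraction of the way between keys
            return a0 + frac * (a1 - a0)
    return keyframes[-1][1]                # after the last keyframe: hold
```

Sampling this at the robot’s control rate (say, every few milliseconds over the full 6.7 seconds) turns eight hand-placed poses into a continuous trajectory, which is exactly what makes the choreography feel like animation rather than programming.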


Choreographing a humanoid robot’s dance routine is as easy as click and pull originally appeared on Engadget on Wed, 03 Nov 2010 09:48:00 EDT.

Source: IEEE Spectrum