Robots learning our pain threshold by punching humans and seeing if they cry
Posted in: research, robot, RobotApocalypse, robotics, Today's Chili, university
The first rule of robotics is that a robot should not injure a human being or, through inaction, allow a human being to come to harm. But how does a robot know when its acts or omissions are causing nearby fleshies discomfort? The obvious way is to scan for the same signals of distress that we humans do — facial, physical, and aural — but another, more fun, way is to just hit people over and over again and ask them how much each blow hurt. That's what professor Borut Povse over in Slovenia is doing, in a research project he describes as "impact emulation," where six test subjects are punched by a robotic arm until they can't take it anymore. It's funny, yes, but it's also a novel and somewhat ingenious way to collect data and produce more intelligent machines. Of course, whether we actually want more intelligent machines is another matter altogether.
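The data-collection idea is simple enough to sketch: record each subject's pain rating at increasing impact levels, then interpolate to find the level at which the rating crosses a "too painful" line. The function, data, and 0-10 rating scale below are all invented for illustration; the actual study's methodology isn't detailed here.

```python
def pain_threshold(impacts, ratings, limit=7.0):
    """Estimate the impact level at which reported pain first crosses `limit`.

    impacts: increasing impact energies (arbitrary units)
    ratings: pain rating (0-10 here) reported for each impact
    Returns the linearly interpolated impact level, or None if never crossed.
    """
    pairs = list(zip(impacts, ratings))
    for (x0, y0), (x1, y1) in zip(pairs, pairs[1:]):
        if y0 < limit <= y1:
            # Linear interpolation between the two bracketing measurements.
            return x0 + (limit - y0) * (x1 - x0) / (y1 - y0)
    return None

# Hypothetical subject whose ratings rise with impact energy.
print(pain_threshold([1, 2, 3, 4], [2.0, 4.0, 6.0, 8.0]))  # → 3.5
```

With per-subject thresholds like this in hand, a robot arm's controller could cap its motion so predicted impact energy stays below the most sensitive subject's threshold.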
[Thanks, Anthony]
Robots learning our pain threshold by punching humans and seeing if they cry originally appeared on Engadget on Thu, 14 Oct 2010 06:43:00 EDT. Please see our terms for use of feeds.
Permalink: News.com.au | New Scientist
Chumby, the cute device that lets users play internet-based apps and music, has become a hacker’s delight because of its extensibility and Linux-based operating system.
Its latest avatar is as the face and brains of a bipedal robot created by EMG Robotics. The robot uses accelerometers from Freescale to balance and walk. And while it is rather slow and clumsy, it's a pretty neat hack.
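Balancing on accelerometers alone is tricky because they read tilt correctly only when the robot isn't accelerating; hobby bipeds like this one commonly blend the accelerometer with a gyro in a complementary filter. EMG Robotics hasn't published its control code, so the sketch below is a generic version of that technique with made-up sensor values.

```python
import math

def accel_tilt(ax, az):
    """Tilt angle (radians) from two accelerometer axes.
    Only trustworthy when the body isn't otherwise accelerating."""
    return math.atan2(ax, az)

def complementary_filter(angle, gyro_rate, ax, az, dt, alpha=0.98):
    """Blend the integrated gyro rate (smooth but drifts over time)
    with the accelerometer tilt (noisy but drift-free)."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_tilt(ax, az)

# Robot held still and level: a wrong initial estimate decays toward zero tilt.
angle = 0.5
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.0, ax=0.0, az=1.0, dt=0.01)
print(round(angle, 3))
```

The `alpha` weight trades gyro smoothness against accelerometer correction speed; 0.98 is a common hobbyist starting point, not a value from this robot.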
“One small step toward our future robotic overlords,” wrote Andrew ‘Bunnie’ Huang, founder at Chumby on his blog. “But hey, at least they’ll be open source. That might even be an improvement over what we have today.”
In itself, the $120 Chumby One is a pretty interesting device. It has a 454MHz ARM processor, a 3.5-inch LCD touchscreen, Wi-Fi connectivity and a USB port. It is designed as open source hardware, so schematics and layouts for the device are available to everyone. In the past, users have taken the Insignia Infocast, a photo and app viewer running on the Chumby platform, and turned it into a $170 Linux tablet.
The latest Chumby hack may not be as functional but it is definitely fun. Check out the video to see the Chumby One walking around. (The demo begins at the 1:40 mark and the earlier portion of the video has no sound.)
See Also:
- Chumby Guts: Robot Viscera For Hackers
- For Hardware Entrepreneurs, Getting From Idea to Reality Isn’t Easy
- Hack Turns $170 Media Viewer Into Tablet
- Twitter-Enhanced Cuckoo Clock Chirps — and Tweets
Panasonic’s ‘Parallel link’ manufacturing robot can learn new tricks
Posted in: robot, Today's Chili, video
Panasonic’s ‘Parallel link’ manufacturing robot can learn new tricks originally appeared on Engadget on Tue, 12 Oct 2010 16:04:00 EDT. Please see our terms for use of feeds.
Lego’s MINDroid Android app remotely controls Mindstorms NXT robots
Posted in: Android, android market, AndroidMarket, app, application, bluetooth, Google, GoogleAndroid, lego, remote control, RemoteControl, robot, Software, Today's Chili
Hardcore hobbyists have been controlling their Mindstorms NXT creations with all sorts of paraphernalia for years, but now Lego itself is stepping in to lend a hand. The new MINDroid app just splashed down in the Android Market, and it enables Android 2.1 (or greater) handsets to direct Mindstorms NXT robots over Bluetooth. According to Lego, tilting or turning the phone makes the robot move forward and turn to the sides, while pressing an action button on the phone's screen activates the 'Action' motor. Given that the download will cost you absolutely nothing, what are you waiting for? Your robot army awaits your commands.
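Under the hood, apps like this talk to the NXT brick using LEGO's documented Bluetooth direct-command protocol: a 2-byte little-endian length header followed by a command telegram. The packet builder below is a sketch of the SETOUTPUTSTATE command based on my reading of LEGO's Bluetooth Developer Kit; the opcode and flag values are assumed from that spec, and actually sending the bytes would need a Bluetooth serial link to the brick.

```python
import struct

# NXT direct-command opcodes/flags (assumed from the LEGO Mindstorms NXT
# Bluetooth Developer Kit documentation).
DIRECT_CMD_NO_REPLY = 0x80
SET_OUTPUT_STATE = 0x04
MODE_MOTOR_ON = 0x01
RUNSTATE_RUNNING = 0x20

def set_motor_packet(port, power):
    """Build a Bluetooth telegram telling motor `port` (0-2) to run at
    `power` (-100..100). The brick expects a 2-byte little-endian length
    header followed by the command bytes."""
    cmd = struct.pack(
        "<BBBbBBbBI",
        DIRECT_CMD_NO_REPLY, SET_OUTPUT_STATE,
        port, power,
        MODE_MOTOR_ON,     # mode: motor on, no brake/regulation
        0,                 # regulation mode: idle
        0,                 # turn ratio
        RUNSTATE_RUNNING,  # run state
        0,                 # tacho limit: 0 = run forever
    )
    return struct.pack("<H", len(cmd)) + cmd

pkt = set_motor_packet(port=0, power=75)
print(pkt.hex())
```

A tilt-to-drive app would map the phone's accelerometer reading to the `power` argument and write the resulting packets to the brick's Bluetooth serial channel.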
Lego’s MINDroid Android app remotely controls Mindstorms NXT robots originally appeared on Engadget on Tue, 12 Oct 2010 12:02:00 EDT. Please see our terms for use of feeds.
Permalink: GigaOM | Lego
Sofie surgical robot gives haptic feedback for a more humane touch
Posted in: medical, Medicine, robot, Robots, Today's Chili
Sofie surgical robot gives haptic feedback for a more humane touch originally appeared on Engadget on Mon, 11 Oct 2010 23:13:00 EDT. Please see our terms for use of feeds.
Permalink: Gizmag
Lego Mindstorm NXT enlisted for shirt-folding robot
Posted in: household, lego, robot, Robots, Today's Chili
All we have to say about this shirt-folding robot is that it does a better job than we could ever hope to. And for that, we love it. Video is below.
Lego Mindstorm NXT enlisted for shirt-folding robot originally appeared on Engadget on Sun, 10 Oct 2010 11:08:00 EDT. Please see our terms for use of feeds.
Permalink: Make | YouTube
Berkeley Bionics reveals eLEGS exoskeleton, aims to help paraplegics walk in 2011 (update: eyes-on and video)
Posted in: hands-on, medical, robot, Robots, Today's Chili, video
Wondering where you’ve heard of Berkeley Bionics before? These are the same whiz-kids who produced the HULC exoskeleton in mid-2008, and now they’re back with a far more ambitious effort. Announced just moments ago in San Francisco, the eLEGS exoskeleton is a bionic device engineered to help paraplegics stand up and walk on their own. It’s hailed as a “wearable, artificially intelligent, bionic device,” and it’s expected to help out within the hospital, at home and elsewhere in this wild, wild place we call Earth. Initially, the device will be offered to rehabilitation centers for use under medical supervision, and can be adjusted to fit most people between 5’2″ and 6’4″ (and weighing 220 pounds or less) in a matter of minutes. We’re told that the device provides “unprecedented knee flexion,” and it’s also fairly quiet in operation; under ideal circumstances, speeds of up to 2MPH can be attained, and it employs a gesture-based human-machine interface that relies on legions of sensors to determine a user’s intentions and act accordingly. Clinical trials are about to begin, and there’s a limited release planned for the second half of 2011. We’re still waiting to hear back on a price, so keep it locked for more as we get it live from the event.
Update: We just got to see the eLEGS walk across stage, and you’ll find a gallery full of close-up pics immediately below. We also spoke to Berkeley Bionics CEO Eythor Bender, who detailed the system a bit more — it’s presently made of steel and carbon fiber with lithium-ion battery packs, weighs 45 pounds, and has enough juice to run for six hours of continuous walking. While he wouldn’t give us an exact price, he said they’re shooting for $100,000, and will be “very competitive” with other devices on the market. Following clinical trials, the exoskeleton will be available to select medical centers in July or August, though Bender also said the company’s also working on a streamlined commercial version for all-day use, tentatively slated for 2013.
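Berkeley Bionics hasn't said how the gesture-based interface actually classifies a user's intentions, but "sensors plus gestures" systems of this kind are often prototyped as simple threshold rules over body posture and crutch loading. The sketch below is purely illustrative: the sensor names, thresholds, and step logic are invented, not taken from eLEGS.

```python
def detect_intent(torso_lean_deg, crutch_load_left, crutch_load_right):
    """Toy step-intent rule: a forward torso lean plus weight shifted onto
    one crutch is read as intent to swing the opposite leg.
    All values and thresholds are illustrative, not from the real device."""
    if torso_lean_deg < 5.0:
        return "stand"          # not leaning forward enough to want a step
    if crutch_load_left > crutch_load_right * 1.5:
        return "step_right"     # weight on left crutch frees the right leg
    if crutch_load_right > crutch_load_left * 1.5:
        return "step_left"
    return "stand"              # ambiguous loading: stay put for safety

print(detect_intent(torso_lean_deg=12.0, crutch_load_left=30.0,
                    crutch_load_right=10.0))  # → step_right
```

A real system would fuse many more sensors and add hysteresis so a brief wobble can't trigger a step, which is presumably where the "legions of sensors" come in.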
Berkeley Bionics reveals eLEGS exoskeleton, aims to help paraplegics walk in 2011 (update: eyes-on and video) originally appeared on Engadget on Thu, 07 Oct 2010 14:44:00 EDT. Please see our terms for use of feeds.
Teddy bears are not just cuddly creatures for kids at bed time. Fujitsu Labs has developed a prototype teddy bear for adults that’s packed with some sophisticated hardware and can interact with and respond to humans. The stuffed bear is being called a “social robot with a personality,” and can make simple gestures, eye contact and small talk.
The hope is to use them for “robot therapy” in geriatric medicine for patients that suffer from dementia, says Fujitsu.
Fujitsu’s teddy bear robot is reminiscent of Pleo, the green robotic dinosaur capable of displaying basic emotions through animatronics and reacting to its surroundings. Despite Pleo’s innovative approach and tech capability, the robot never really became a mainstream sensation, largely because it was positioned as a toy.
Fujitsu’s teddy bear robot comes with loftier ambitions. The robotic teddy bear can be plugged into a PC using a USB port. Sensors stuffed into it help it make gestures such as lifting one of its furry hands in response to external stimuli.
The bears have a miniature camera built into their nose, so they can automatically wake up from a sleep state when they sense a person nearby and turn in that person’s direction. A voice synthesizer inside the device lets it channel the voice of a young boy. The sound is projected from a built-in speaker and synchronized to the robot’s behavior.
The robotic bears are capable of up to 300 movement patterns, including raising their arms, looking downwards and kicking their feet. The movements are combined with displays of “emotions” to signal happiness, sadness and anger, says Fujitsu. And since the robot can be connected to a PC, new movements can be recorded and displayed.
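The wake-on-presence behavior described above can be modeled as a tiny state machine: sleep until the nose camera reports a person, turn toward them, and doze off again after a stretch of inactivity. This is a sketch of that behavior only; Fujitsu's actual firmware and interfaces are not public, and every name and number below is invented.

```python
class BearState:
    """Minimal sleep/awake state machine mirroring the described behavior:
    the bear sleeps until its nose camera senses a person, wakes and turns
    toward them, then dozes off after a while with nobody around."""

    def __init__(self, doze_after=5):
        self.state = "asleep"
        self.idle_ticks = 0
        self.doze_after = doze_after  # ticks of no presence before sleeping

    def tick(self, person_bearing_deg=None):
        """Advance one sensing cycle. `person_bearing_deg` is the detected
        person's direction, or None if nobody is in view."""
        if person_bearing_deg is not None:
            self.state = "awake"
            self.idle_ticks = 0
            return f"turn to {person_bearing_deg} deg"
        self.idle_ticks += 1
        if self.idle_ticks >= self.doze_after:
            self.state = "asleep"
        return "idle"

bear = BearState()
print(bear.tick(person_bearing_deg=30))  # someone walks by: bear wakes
for _ in range(5):
    bear.tick()                          # nobody around for a while
print(bear.state)                        # back asleep
```

Layering the 300 canned movement patterns on top would just mean picking an animation to play on each state transition.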
What makes these robots interesting, says Fujitsu, is that they are interactive and real, in a world that is increasingly filled with virtual interactions. The bears can be played with and are likely to integrate easily into people’s lives, says the company.
Fujitsu hopes its teddy bear can help develop “robot therapy,” a way to use robots to help people overcome challenges or problems–much like how animals are used to cheer up patients in some hospitals today.
If you want to see how the robotic teddy bears work, check out this video:
See Also:
- Android Droid is a Robot That Runs Android OS
- Adorable Walking Robot Sets Distance Record
- Butler Robot Can Fetch Drinks, Snacks
- Anybots Robot Will Go to the Office for You
- Gallery: Robot Bartenders Sling Cocktails for Carbon-Based
- Holographic Displays, Robot Eyes Hint at Your Interactive Future
Photo: CEATEC JAPAN Organizing Committee
[via Dvice]
I’ve always wanted to write all of those words in a single headline; I was just never sure that such a pipe dream would ever come to fruition. Thankfully, Ladies&Gents has just afforded me that opportunity.
What, precisely, is or are Ladies&Gents, you ask? Simple. It’s a British robot arm combo that draws a picture of you as you go to the bathroom (as in walk toward the bathroom, not actually, you know, go to the bathroom).
Stupid, you say? Admit it, you’re just jealous that you didn’t think it up first.
Ladies&Gents consists of two arms and two cameras. The cameras film you as you walk toward the restroom. Then one of the hands (gents on one side, ladies on the other, naturally) draws an image of you in either blue or pink marker. The image is then uploaded to a blog, because, frankly, who wouldn’t want a drawing of themselves walking to the bathroom up on the Internet for ever and ever?
Don’t expect large-scale adoption of the technology any time soon. At the moment, Ladies&Gents is an installation at the Unleashed Devices exhibit in West London.