Creepy new Air Force camera can identify and track you from far, far away

Photon-X Behaviormetric Sensor

Sure, you can do neat things like unlock your iPhone using facial recognition, but the Air Force has far grander visions for the tech. Specifically, it wants a camera that can identify and track possible insurgents at a significant distance (though it’s unclear how far we’re talking about here) using only a few seconds of footage. It’s turned to Photon-X Inc. to develop a sensor that combines spatial measurements, infrared, and visible light to create a “bio-signature” that maps not only static facial features but also the muscle movements that are unique to each individual. The technology could also be used in targeting systems to identify enemy vehicles and integrated into robots to help them navigate and identify objects… or threatening meatbags. The Air Force even foresees law enforcement, banks, and private security firms using the cams to monitor customers and watch for suspicious activity. Similar tools have been created that use software to analyze video feeds, but they can’t match the accuracy or range of this “behaviormetric” system. Normally, this is where we’d make some snide reference to Skynet or Big Brother but, honestly, we’re too creeped out for jokes.
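Neither the Air Force nor Photon-X has spelled out how the fused data actually becomes an identity, but in broad strokes a behaviormetric matcher could look something like the toy Python sketch below. The feature names, fusion scheme, and match threshold are our own assumptions, not details from the real system.

```python
import math

def fuse_signature(spatial, infrared, visible, motion):
    """Concatenate per-modality feature lists into a single bio-signature."""
    return spatial + infrared + visible + motion

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norms if norms else 0.0

def identify(candidate, enrolled, threshold=0.92):
    """Return the best-matching enrolled identity, or None below threshold."""
    best_id, best_score = None, threshold
    for person_id, signature in enrolled.items():
        score = cosine_similarity(candidate, signature)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Tiny illustrative database: two enrolled people, four-number signatures.
enrolled = {"person_a": [0.9, 0.1, 0.4, 0.8], "person_b": [0.2, 0.7, 0.9, 0.1]}
print(identify(fuse_signature([0.88], [0.12], [0.41], [0.79]), enrolled))  # -> person_a
```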

Emoti-bots turn household objects into mopey machines (video)

Some emotional robots dip deep into the dark recesses of the uncanny valley, where our threshold for human mimicry resides. Emoti-bots, on the other hand, manage to skip the creepy human-like pitfalls of other emo-machines, instead employing household objects to ape the most pathetic of human emotions — specifically dejection and insecurity. Sure, it sounds sad, but the mechanized furniture designed by a pair of MFA students is actually quite clever. Using a hacked Roomba and an Arduino, the duo created a chair that reacts to your touch and wanders aimlessly once your rump has disembarked. They’ve also employed Nitinol wires, a DC motor, and a proximity sensor to make a lamp that seems to tire with use. We prefer our lamps to look on the sunny side of life, but for those of you who like your fixtures forlorn, the Emoti-bots are now on display at Parsons in New York and can be found moping about in the video after the break.
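For the curious, here’s a rough Python sketch of the dejected-chair behavior as we understand it from the description above; the sensor and wheel hooks are hypothetical stand-ins, not the students’ actual Arduino code.

```python
import random
import time

def chair_loop(read_touch_sensor, drive_base, steps=20, dt=0.1):
    """read_touch_sensor() -> bool and drive_base(left, right) are hypothetical
    stand-ins for the touch sensing and the hacked Roomba's wheel interface."""
    for _ in range(steps):
        if read_touch_sensor():
            drive_base(0.0, 0.0)  # someone is seated, so the chair stays put
        else:
            # the sitter has left: wander aimlessly on random wheel speeds
            drive_base(random.uniform(-0.3, 0.3), random.uniform(-0.3, 0.3))
        time.sleep(dt)

# Dummy hooks so the sketch runs without any hardware attached.
chair_loop(lambda: random.random() < 0.5,
           lambda left, right: print(f"wheels: {left:+.2f} {right:+.2f}"))
```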

Tenacious robot ashamed of creator’s performance, shows mankind how it’s done (video)

Looks like researchers have taken another step toward taking Skynet live: giving robots the groundwork for gloating. A Swiss team of misguided geniuses has developed learning algorithms that allow robot-kind to learn from human mistakes. Earthlings guide the robot through a flawed attempt at completing a task, such as catapulting a ball into a paper basket; the machine then extrapolates its goal, what went wrong in the human-guided example, and how to succeed, via trial and error. Rather than presuming human demonstrations represent a job well done, this new algorithm assumes all human examples are failures, ultimately using those bad examples to help the ‘bot one-up its creators. Thankfully, the new algorithm is only being used with a single hyper-learning appendage; heaven forbid it should ever learn how to use the robot-internet.
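The EPFL paper has the real math, but the gist can be faked in a few lines: treat every human demonstration as a failure, steer clear of those failed settings, and let trial and error do the rest. The toy catapult model and all the numbers below are our own assumptions, not anything from the actual algorithm.

```python
import random

def simulate_throw(power):
    """Hypothetical stand-in for the ball-catapulting task: returns the
    landing distance for a given launch power."""
    return 2.0 * power

def learn_from_failures(failed_powers, target_distance=3.0, iters=200):
    """Search for a launch power, skipping anything close to a failed demo."""
    best_power, best_error = None, float("inf")
    for _ in range(iters):
        power = random.uniform(0.0, 5.0)
        # Every human demonstration is assumed to be a failure: stay away.
        if any(abs(power - bad) < 0.2 for bad in failed_powers):
            continue
        error = abs(simulate_throw(power) - target_distance)
        if error < best_error:                      # trial-and-error feedback
            best_power, best_error = power, error
    return best_power

# Two made-up failed human demonstrations; the robot does better on its own.
print(learn_from_failures(failed_powers=[0.5, 2.5]))
```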

BBC shows us what it’s like to live with a bionic hand

We’ve posted our fair share on bionic limbs and their advancements over the years, but rarely have we had the chance to see a video of one in real-world use, on a real person. The BBC has shared a video of a man named Patrick using his bionic arm, which — long story short — was partially the result of his being electrocuted at work. This is his second one to date; specifically, it’s a prototype Otto Bock mind-controlled prosthetic arm equipped with six nerve sensors that let him use the hand as if it were his own — it supports pinching and gripping with the fingers as well as lateral and circular movement of the wrist. Although the footage is a mundane roll of various day-to-day tasks — gripping a bottle to pour a glass of water, for instance — it’s quite amazing to realize technology is helping him do things he’d otherwise be deprived of. We’d suggest checking it out at the BBC by clicking the source link below.
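Otto Bock hasn’t published its control code, of course, but conceptually those six nerve sensors boil down to a signal-to-motion mapping along these lines. The channel assignments, motion labels, and noise floor are purely illustrative assumptions.

```python
# The channel order, motion labels, and noise floor here are assumptions made
# for illustration; the real prosthetic's decoding is surely more sophisticated.
MOTIONS = ["pinch", "grip", "wrist_lateral", "wrist_rotate_cw",
           "wrist_rotate_ccw", "hand_open"]

def decode_intent(channels, noise_floor=0.15):
    """channels: six sensor readings in [0, 1], one per nerve sensor."""
    peak = max(range(len(channels)), key=lambda i: channels[i])
    if channels[peak] < noise_floor:
        return "idle"            # nothing strong enough to act on
    return MOTIONS[peak]

print(decode_intent([0.05, 0.80, 0.10, 0.00, 0.02, 0.10]))  # -> grip
print(decode_intent([0.02, 0.03, 0.01, 0.05, 0.04, 0.06]))  # -> idle
```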

Pleo bares it all for FCC approval

Poor Pleo. Everyone fell in love with the little green dino at first sight, but no one actually bought the thing. Undaunted, the adorable fleshy robot made a triumphant return at this year’s CES as Pleo RB (that’s “Reborn”), with the help of adoptive manufacturer Innvo Labs. The newly invigorated ‘bot brings voice recognition, more sensors, and RFID-based command learning technology to the table. With all its new gear in place, Pleo was poked, prodded, and peeled by the FCC, revealing, among other things, that new RFID reader in its chin. The results are gruesome and not recommended for faint-of-heart robot dinosaur lovers. You’ve been warned.
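Nothing in the filing spells out how that RFID-based command learning works, but a plausible (and entirely speculative) reading is a simple tag-to-behavior lookup that Pleo fills in as it’s taught. The tag IDs and behaviors below are made up for illustration.

```python
# Purely speculative sketch of tag-to-behavior command learning.
learned_commands = {}

def teach(tag_id, behavior):
    """Associate an RFID-tagged accessory with a trick."""
    learned_commands[tag_id] = behavior

def on_tag_scanned(tag_id):
    """Called when the chin-mounted reader sees a tag; fall back to curiosity."""
    return learned_commands.get(tag_id, "sniff_curiously")

teach("04A1B2", "play_dead")
print(on_tag_scanned("04A1B2"))   # -> play_dead
print(on_tag_scanned("FFFFFF"))   # -> sniff_curiously
```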

Lingodroid robots develop their own language, quietly begin plotting against mankind

It’s one thing for a robot to learn English, Japanese, or any other language that we humans have already mastered. It’s quite another for a pair of bots to develop their own, entirely new lexicon, as these two apparently have. Created by Ruth Schulz and her team of researchers at the University of Queensland and Queensland University of Technology, each of these so-called Lingodroids constructed its own special language after navigating its way through a labyrinthine space. As they wove around the maze, the Lingodroids created spatial maps of their surroundings with the help of on-board cameras, laser range finders, and sonar equipment that helped them avoid walls. They also created words for each mapped location, using a database of syllables. With the mapping complete, the robots would reconvene and communicate their findings to each other, using mounted microphones and speakers. One bot, for example, would spit out a word it had created for the center of the maze (“jaya”), sending both of them off on a “race” to find that spot. If they ended up meeting at the center of the room, they would agree to call it “jaya.” From there, they could tell each other about the area they’d just come from, thereby spawning new words for direction and distance, as well. Schulz is now looking to teach her bots how to express more complex ideas, though her work is likely to hit a roadblock once these two develop a phrase for “armed revolt.”
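Here’s a stripped-down Python sketch of that naming game; it’s our simplification of the description above, not the Lingodroids’ actual code, and the syllable list and grid locations are invented for illustration.

```python
import random

SYLLABLES = ["ja", "ya", "ku", "zo", "pi", "re"]

class Lingodroid:
    def __init__(self):
        self.lexicon = {}  # mapped location (a grid cell) -> invented word

    def word_for(self, location):
        """Invent a word from the syllable database the first time a location
        comes up; reuse it afterwards."""
        if location not in self.lexicon:
            self.lexicon[location] = "".join(random.sample(SYLLABLES, 2))
        return self.lexicon[location]

    def adopt(self, location, word):
        self.lexicon[location] = word

def where_are_we_game(speaker, listener, location):
    word = speaker.word_for(location)  # e.g. "jaya" for the center of the maze
    listener.adopt(location, word)     # meeting at that spot seals the agreement
    return word

a, b = Lingodroid(), Lingodroid()
print(where_are_we_game(a, b, location=(0, 0)))
```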

Rescue robots map and explore dangerous buildings, prove there’s no ‘I’ in ‘team’ (video)

We’ve seen robots do some pretty heroic things in our time, but engineers from Georgia Tech, the University of Pennsylvania, and Caltech have now developed an entire fleet of autonomous rescue vehicles, capable of simultaneously mapping and exploring potentially dangerous buildings — without allowing their egos to get in the way. Each wheeled bot measures just one square foot, carries a video camera capable of identifying doorways, and uses an on-board laser scanner to analyze walls. Once gathered, these data are processed using a technique known as simultaneous localization and mapping (SLAM), which allows each bot to create maps of both familiar and unknown environments while constantly recording and reporting its current location (independently of GPS). And, perhaps best of all, these rescue Roombas are pretty team-oriented. Georgia Tech professor Henrik Christensen explains:

“There is no lead robot, yet each unit is capable of recruiting other units to make sure the entire area is explored. When the first robot comes to an intersection, it says to a second robot, ‘I’m going to go to the left if you go to the right.'”
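That leaderless divvying-up of intersections is easy to picture as code. Below is a bare-bones Python sketch of the idea; the robot names and the greedy hand-off scheme are our assumptions rather than anything from the MAST teams.

```python
def assign_branches(robots, branches):
    """Greedily hand each unexplored branch to a different robot; leftovers
    simply queue up for whichever robot frees up first."""
    assignments = {robot: [] for robot in robots}
    for i, branch in enumerate(branches):
        assignments[robots[i % len(robots)]].append(branch)
    return assignments

# "I'm going to go to the left if you go to the right."
print(assign_branches(["robot_1", "robot_2"], ["left", "right", "straight"]))
# -> {'robot_1': ['left', 'straight'], 'robot_2': ['right']}
```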

This egalitarian robot army is the spawn of a research initiative known as the Micro Autonomous Systems and Technology (MAST) Collaborative Technology Alliance Program, sponsored by the US Army Research Laboratory. The ultimate goal is to shrink the bots down even further and to expand their capabilities. Engineers have already begun integrating infrared sensors into their design and are even developing small radar modules capable of seeing through walls. Roll past the break for a video of the vehicles in action, along with full PR.

Rolling robot learns to fly, plots escape from human captors (video)

Why settle for a robot that can just roll or fly? That’s the question some researchers from the University of Minnesota’s Center for Distributed Robotics recently asked themselves, and this little transforming contraption is their answer. As you can see in the video above, it’s able to roll around on the ground with relative ease (although obstacles may be another matter) and then prop itself up to take flight like any other robotic helicopter. Those thinking about trying their hand at a DIY version may want to think twice, however, as it’s not exactly as simple as it may appear. In fact, the researchers apparently spent a full $20,000 just to develop the folding rotor mechanism.

French basketball team ‘trains’ with robots, learns how to ‘win’

To the list of French accomplishments you may now add “robot basketball training” — at least if the video above is to be believed. But you probably shouldn’t believe it when members of Poitiers Basket 86 testify that amusement park rides improved the team’s “spatial orientation” and helped them defeat top-ranked Chalon. It’d be different if the “robots” were teaching them perfect free-throw form or helping them walk, obviously, but PB86 is known for its innovative advertising, and this seems like a quirky example of it. Hit the video above to see the pranksters at work, but know that, as with Sartre and Camus, something gets lost in translation.

[Thanks, Antoine]

Bipedal robots learn to shuffle, evolve toward doing the twist (video)

Yes, some robots are evolving to a point where they can play instruments and swing a hammer. Hilariously, though, bipedal robots are still awful at turning in a tight radius. Several presenters at the International Conference on Robotics and Automation have been working on a solution: instead of making them take steps, program robots to shuffle. This allows turning without complex weight-shifting — every time your foot leaves the ground, you have to adjust your balance to remain upright. Keeping your feet on the ground avoids that fairly complicated process and can make robot turns quicker and possible in confined spaces; most current bipedal bots require lots of time and space to turn. See the video after the break for an example from Japan’s Osaka Electro-Communication University. It may look like a metal man shuffling his feet, but it’s an important step toward our robot-dominated future.
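To make the idea concrete, here’s a toy Python sketch of a flat-footed shuffle turn: the robot nudges its yaw in small increments while both feet stay planted, so no balance shift is ever needed. The per-shuffle angle is an assumption on our part, not a figure from the researchers.

```python
import math

def shuffle_turn(current_yaw, target_yaw, per_shuffle=math.radians(5)):
    """Yield the robot's heading after each small, flat-footed shuffle."""
    remaining = target_yaw - current_yaw
    while abs(remaining) > 1e-9:
        # Clamp each shuffle to a small angle; both feet stay on the ground,
        # so no weight-shifting step is needed between increments.
        step = max(-per_shuffle, min(per_shuffle, remaining))
        current_yaw += step
        remaining -= step
        yield current_yaw

for yaw in shuffle_turn(0.0, math.radians(90)):
    pass  # each iteration is one shuffle of the feet

print(round(math.degrees(yaw)))  # roughly 90 degrees after ~18 shuffles
```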

[Thanks, Henry]
