Sharp’s Molecular Beam Epitaxy machine births components in its space-like womb (video)

A machine that builds other machines? Sounds like robot apocalypse time — except it’s not. This component-building, space-mimicking chamber of liquid nitrogen-cooled sterility gives birth to LEDs, not that kid from A.I. Housed in Sharp’s Oxford Laboratory, the Molecular Beam Epitaxy machine moves atoms “almost individually…to build the basis of high tech electronics.” Through the use of magnetic poles on the contraption’s exterior (kind of like in foosball), researchers can virtually manipulate substrates and elements, allowing for precise control and untainted crystal growth. While this MBE isn’t exactly new tech — larger commercial-grade versions already exist — it is noteworthy for its innovative petri-vacuum abilities. After all, progress has to start somewhere. Click past the break for the ominously toned video explanation.

Sharp’s Molecular Beam Epitaxy machine births components in its space-like womb (video) originally appeared on Engadget on Wed, 13 Jul 2011 19:31:00 EDT.

Source: Humans Invent

Robots make breakfast for scientists, bide time (video)

Breakfast is the most important meal of the day for a growing robot — it’s also an easy and relatively quick way to lull a group of scientists into a false sense of security. Now, we’re not saying that James and Rosie here had an ulterior motive when they put together a breakfast of Bavarian sausage and baguettes for a group of researchers at Munich’s CoTeSys lab — as far as robotic couples go, they seem very nice. James, a US-designed PR2 robot, sliced the bread, while German-designed Rosie boiled up some sausages, as some hungry roboticists looked on patiently. Oddly, this isn’t the first time we’ve seen a robot prepare a morning meal — it’s nice to know, however, that after the robot apocalypse, at least we’ll all still be well fed. Super sped up video of cooking robots after the break.

Robots make breakfast for scientists, bide time (video) originally appeared on Engadget on Sun, 12 Jun 2011 22:33:00 EDT.

Source: IEEE Spectrum

Russian ATM uses voice analysis to tell when you’re lying

Credit card applications via automated teller are all the rage abroad these days. That’s why Russia’s Sberbank is using Speech Technology Center’s voice recognition system in its new ATM to tell when you fudge your financials to get approved. Like a polygraph, the technology senses involuntary stress cues to ferret out fib-filled statements — only instead of using wired sensors, it listens to your angst-ridden voice. Designed using samples from Russian police interrogation recordings where subjects were found to be lying, the system is able to detect the changes in speech patterns when a person isn’t telling the truth. Of course, it’s not completely accurate, so the biometric voice data is combined with credit history and other info before the ATM can crush an applicant’s credit dreams. And to assuage the public’s privacy concerns, patrons’ voice prints will be kept on chips in their credit cards instead of a bank database. So, we don’t have to worry about hackers stealing our biometric info, but we’re slightly concerned that we’ll no longer be able to deceive our robot overlords should the need arise.
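
The decision step described here, blending a voice-stress reading with conventional credit data, might look something like the toy sketch below. Every feature, weight, and threshold is an invented placeholder; neither Sberbank nor Speech Technology Center has published its actual model.

```python
# Illustrative only: every feature, weight, and threshold here is invented;
# Sberbank / Speech Technology Center have not published their actual model.

def stress_score(pitch_variance: float, speech_rate: float) -> float:
    """Toy stand-in for voice-stress analysis: jittery pitch and hurried
    speech push the score toward 1.0 (more likely deceptive)."""
    raw = 0.6 * pitch_variance + 0.4 * speech_rate
    return max(0.0, min(1.0, raw))

def approve_application(pitch_variance: float, speech_rate: float,
                        credit_history: float, income_ratio: float) -> bool:
    """Blend the (hypothetical) stress score with conventional credit
    signals before deciding, as the article says the ATM does."""
    stress = stress_score(pitch_variance, speech_rate)
    combined = 0.5 * credit_history + 0.3 * income_ratio - 0.3 * stress
    return combined > 0.4

if __name__ == "__main__":
    print(approve_application(0.2, 0.3, 0.8, 0.7))   # calm voice, decent credit: True
    print(approve_application(0.9, 0.9, 0.8, 0.7))   # same credit, stressed voice: False
```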

Russian ATM uses voice analysis to tell when you’re lying originally appeared on Engadget on Sat, 11 Jun 2011 20:37:00 EDT.

Source: New York Times

Steve Wozniak calls us all dogs, in a nice way

You can stop worrying about the robot apocalypse now. Steve Wozniak has weighed in on the matter, and it turns out we’ve pretty much lost. The Apple co-founder / dancing star discussed the subject with an Australian business crowd, mapping out a future in which artificial intelligence equals our own and human input becomes meaningless. In other words, “We’re going to become the pets, the dogs of the house.” Woz added that his take on the whole war thing was, in part, a joke — it’s the part that wasn’t that we’re worried about. Though if our own dogs’ existences are any indication, things could be a lot worse.

[Thanks, Shaun]

Steve Wozniak calls us all dogs, in a nice way originally appeared on Engadget on Mon, 06 Jun 2011 07:40:00 EDT.

Source: News.com.au

Tenacious robot ashamed of creator’s performance, shows mankind how it’s done (video)

Looks like researchers have taken another step toward taking Skynet live: giving robots the groundwork for gloating. A Swiss team of misguided geniuses has developed learning algorithms that let robot-kind learn from human mistakes. Earthlings guide the robot through a flawed attempt at a task, such as catapulting a ball into a paper basket; the machine then works out the goal, what went wrong in the demonstration, and, via trial and error, how to succeed. Rather than presuming human demonstrations represent a job well done, the new algorithm assumes every human example is a failure, ultimately using those bad examples to help the ’bot one-up its creators. Thankfully, the algorithm is only being used with a single hyper-learning appendage; heaven forbid it should ever learn how to use the robot-internet.
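
For the curious, a toy Python take on the idea (assume the demonstration failed, then refine by trial and error) might look like this. The catapult simulator, numbers, and hill-climbing search are all invented for illustration and are not the EPFL team's published algorithm.

```python
# Toy sketch only: the catapult simulator, target, and search strategy are
# invented for illustration, not the EPFL team's published algorithm.
import random

TARGET = 3.0       # where the paper basket sits (metres)
TOLERANCE = 0.15   # how close a throw has to land to count as "in"

def throw_distance(launch_power: float) -> float:
    """Stand-in for the real catapult: distance grows with power, plus noise."""
    return 1.5 * launch_power + random.gauss(0.0, 0.05)

def learn_from_failure(demo_power: float, trials: int = 200) -> float:
    """Start from the human's demonstration but assume it missed, then
    hill-climb by trial and error until a throw lands in the basket."""
    best_power = demo_power
    best_error = abs(throw_distance(best_power) - TARGET)
    for _ in range(trials):
        candidate = best_power + random.gauss(0.0, 0.3)   # perturb the best guess
        error = abs(throw_distance(candidate) - TARGET)
        if error < best_error:                            # keep whatever lands closer
            best_power, best_error = candidate, error
        if best_error < TOLERANCE:
            break
    return best_power

if __name__ == "__main__":
    human_attempt = 1.2   # the flawed demo: throws roughly 1.8 m, well short of 3 m
    print(f"refined launch power: {learn_from_failure(human_attempt):.2f}")
```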

Tenacious robot ashamed of creator’s performance, shows mankind how it’s done (video) originally appeared on Engadget on Thu, 19 May 2011 19:02:00 EDT.

Via: IEEE Spectrum | Source: EPFL (PDF)

Lingodroid robots develop their own language, quietly begin plotting against mankind

It’s one thing for a robot to learn English, Japanese, or any other language that we humans have already mastered. It’s quite another for a pair of bots to develop their own, entirely new lexicon, as these two apparently have. Created by Ruth Schulz and her team of researchers at the University of Queensland and Queensland University of Technology, the so-called Lingodroids constructed their language while navigating a labyrinthine space. As they wove around the maze, the Lingodroids built spatial maps of their surroundings using on-board cameras, laser range finders, and sonar to avoid walls. They also coined words for each mapped location, drawing from a database of syllables. With the mapping complete, the robots would reconvene and communicate their findings to each other using mounted microphones and speakers. One bot, for example, would spit out the word it had created for the center of the maze (“jaya”), sending both of them off on a “race” to find that spot. If they ended up meeting at the center of the room, they would agree to call it “jaya.” From there, they could tell each other about the area they’d just come from, spawning new words for direction and distance as well. Schulz is now looking to teach her bots how to express more complex ideas, though her work is likely to hit a roadblock once these two develop a phrase for “armed revolt.”
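
A stripped-down version of that naming game is easy to sketch. The syllable inventory, grid cells, and adoption rule below are simplifications made up for illustration, not the researchers' real implementation.

```python
# Simplified naming game: syllables, grid cells, and the adoption rule are
# made up for illustration, not the Lingodroids' actual implementation.
import random

SYLLABLES = ["ja", "ya", "ku", "zo", "pi", "re"]

class Lingodroid:
    def __init__(self) -> None:
        self.lexicon: dict[tuple[int, int], str] = {}   # map cell -> invented word

    def word_for(self, cell: tuple[int, int]) -> str:
        """Return the existing word for a map cell, coining one if needed."""
        if cell not in self.lexicon:
            self.lexicon[cell] = "".join(random.sample(SYLLABLES, 2))
        return self.lexicon[cell]

    def hear(self, cell: tuple[int, int], word: str) -> None:
        """Adopt the other robot's word for a cell we have no name for yet."""
        self.lexicon.setdefault(cell, word)

def naming_game(speaker: Lingodroid, listener: Lingodroid, cell: tuple[int, int]) -> str:
    """The speaker names a place, both 'race' there, and the listener records
    the word, so a shared lexicon grows one location at a time."""
    word = speaker.word_for(cell)
    listener.hear(cell, word)
    return word

if __name__ == "__main__":
    bot_a, bot_b = Lingodroid(), Lingodroid()
    centre = (2, 2)
    print("agreed word for the maze centre:", naming_game(bot_a, bot_b, centre))
    assert bot_a.lexicon[centre] == bot_b.lexicon[centre]
```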

Lingodroid robots develop their own language, quietly begin plotting against mankind originally appeared on Engadget on Wed, 18 May 2011 11:07:00 EDT.

Source: IEEE Spectrum

Robot arm learns to use hammer, mocks pathetic human’s attempt to fight back (video)

This guy had a pretty natural reaction upon discovering that the DLR Hand Arm System has learned to use a hammer: he took a bat to the thing. Rather than curbing the inevitable robotic uprising, however, the whole thing just demonstrates exactly how durable the mechanical appendage is, as it resumes normal functionality after the swift blow. The arm contains 52 motors and super strong synthetic tendons, and is the work of the German Aerospace Center, the electronic sadists who also recently took a hammer to one of their robot hands. Videos of the mayhem after the jump — we’re sure they’ll be Skynet’s Exhibit A.

[Thanks, Joseph]

Robot arm learns to use hammer, mocks pathetic human’s attempt to fight back (video) originally appeared on Engadget on Fri, 13 May 2011 23:37:00 EDT.

Source: IEEE Spectrum

iRobot Ava mobile robotics platform hands-on at Google I/O (video)

If you’re under the impression that robots were all over Google I/O this year, you’d be right — after all, it’s only a small leap from robot to Android. Yesterday we got some hands-on time with iRobot’s Ava mobile robotics platform and came away rather entertained. Ava is an autonomous robot that’s equipped with an array of sensors (two Kinect-like 2D / 3D cameras, a scanning laser, ultrasonic transducers, and contact bumpers), driven by omnidirectional wheels, and controlled by its own Intel Core-based computer. The base houses the batteries, motors, and electronics, and supports a telescopic mast that carries a pod containing touch ribbons, speakers, and a microphone. On top of this pod you’ll find a “head” that can tilt / pivot and basically acts as the dock for any Android tablet. Ava is able to navigate a mapped-out space on its own while avoiding obstacles and people along the way — going so far as to “blush” via RGB LEDs in the base if it accidentally bumps into anything or anyone. This autonomous behavior means the robot can be controlled simply by setting waypoints and letting the onboard computer do the hard work of coordinating sensors and motors to get it there safely. Google and iRobot have worked together to create APIs that let Android developers write apps — from telepresence to roaming testimonials — that control Ava wirelessly from the docked tablet. Both partners are hoping this will spearhead the development of unique new projects that combine the power of robotics and Android devices. There’s no word on pricing or availability at this point, which comes as no surprise given that these machines are still very much prototypes. We’ll leave you to look at our gallery below and watch the robotic ballet in our hands-on video after the break.
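
To make the control model concrete (the tablet app only sets goals; the onboard computer handles sensing and motion), here is a purely hypothetical sketch. None of these class or method names come from the actual iRobot / Google APIs, which haven't been published in detail.

```python
# Hypothetical sketch of waypoint-level control; these class and method names
# are invented and are not the real iRobot / Google Ava APIs.
from dataclasses import dataclass, field

@dataclass
class Waypoint:
    x: float
    y: float
    label: str = ""

@dataclass
class AvaLikeRobot:
    """Stand-in for an autonomous base: the app queues goals, while path
    planning, obstacle avoidance, and sensor fusion stay onboard."""
    position: tuple = (0.0, 0.0)
    queue: list = field(default_factory=list)

    def go_to(self, wp: Waypoint) -> None:
        self.queue.append(wp)             # a tablet app would only ever do this

    def step(self) -> None:
        if self.queue:
            wp = self.queue.pop(0)
            self.position = (wp.x, wp.y)  # planner, motors, and bumpers abstracted away
            print(f"arrived at {wp.label or (wp.x, wp.y)}")

if __name__ == "__main__":
    robot = AvaLikeRobot()
    robot.go_to(Waypoint(3.0, 1.5, "conference room"))
    robot.go_to(Waypoint(0.0, 0.0, "charging dock"))
    robot.step()
    robot.step()
```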

iRobot Ava mobile robotics platform hands-on at Google I/O (video) originally appeared on Engadget on Thu, 12 May 2011 09:13:00 EDT.

Schizophrenic computer may help us understand similarly afflicted humans

Although we usually prefer our computers to be perfect, logical, and psychologically fit, sometimes there’s more to be learned from a schizophrenic one. A University of Texas experiment has doomed a computer with dementia praecox, saddling the silicon soul with symptoms that normally only afflict humans. By telling the machine’s neural network to treat everything it learned as extremely important, the team hopes to aid clinical research in understanding the schizophrenic brain — following a popular theory that suggests afflicted patients lose the ability to forget or ignore frivolous information, causing them to make illogical connections and paranoid jumps in reason. Sure enough, the machine lost it, and started spinning wild, delusional stories, eventually claiming responsibility for a terrorist attack. Yikes. We aren’t hastening the robot apocalypse if we’re programming machines to go mad intentionally, right?
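
For a sense of what "treat everything as extremely important" can do to a learner, here is a deliberately crude analogue: an ordinary least-squares fit whose learning rate is cranked up until it stops converging. It is only a loose illustration of the hyperlearning idea, not the DISCERN network the researchers actually used.

```python
# Loose analogue of "hyperlearning": the same tiny least-squares fit, once with
# a sensible learning rate and once treating every update as all-important.
# Illustration only; this is not the DISCERN network used in the study.
import numpy as np

def train(learning_rate: float, steps: int = 50) -> float:
    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 3))               # 20 examples, 3 features
    y = X @ np.array([0.5, -1.0, 2.0])         # targets from a known rule
    w = np.zeros(3)
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # plain least-squares gradient
        w -= learning_rate * grad
    return float(np.mean((X @ w - y) ** 2))    # final training error

if __name__ == "__main__":
    print(f"moderate learning rate: error {train(0.05):.4f}")    # settles down
    print(f"'everything is important': error {train(1.5):.2e}")  # blows up
```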

Schizophrenic computer may help us understand similarly afflicted humans originally appeared on Engadget on Wed, 11 May 2011 09:31:00 EDT.

Via: Forbes | Source: University of Texas

Robots learn to march / spell, still not capable of love (video)

Here’s hoping there’s more than a few military-style marches standing between us and a complete robotic takeover. If not, we’ve got some dire news: these are not simply miniature Roombas, as they may appear, but 15 so-called Khepera bots capable of spelling out GRITS (for Georgia Robotics and Intelligent Systems) to demonstrate grad student Edward Macdonald’s Master’s thesis for the department. The diminutive robots aren’t told where to go in the letters — instead, they determine their spots via a control algorithm, positioning themselves relative to their fellow rolling machines, so that if one is removed from the equation, they quickly re-form the letter without it. Fortunately, they haven’t learned to spell “KILL.” Yet. Get to know your new robotic overlords a little bit better in the video after the break.
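
In miniature, the relative-positioning idea looks something like the sketch below, where each robot only tracks a desired offset from the group's centroid, so removing one simply leaves a slightly sparser letter. The offsets and control gain are invented for illustration; the real GRITS controller handles collisions and local sensing far more carefully.

```python
# Toy decentralised formation sketch; the slot layout and control gain are
# invented and much simpler than the actual GRITS control algorithm.
import numpy as np

def letter_offsets(n: int) -> np.ndarray:
    """Hypothetical slots (relative to the formation centre) spaced along a
    three-quarter circle, a rough stand-in for the points of one letter."""
    angles = np.linspace(np.pi / 2, 2 * np.pi, n)
    off = np.stack([np.cos(angles), np.sin(angles)], axis=1)
    return off - off.mean(axis=0)      # zero-mean so the centroid stays put

def step(positions: np.ndarray, offsets: np.ndarray, gain: float = 0.2) -> np.ndarray:
    """Each robot looks at the group's current centroid and nudges itself
    toward (centroid + its own offset): purely relative positioning."""
    centroid = positions.mean(axis=0)
    return positions + gain * (centroid + offsets - positions)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 15
    pos, off = rng.normal(size=(n, 2)), letter_offsets(n)
    for _ in range(100):
        pos = step(pos, off)
    # Pull one robot out; the survivors re-centre their slots and re-form.
    pos = np.delete(pos, 7, axis=0)
    off = np.delete(off, 7, axis=0)
    off -= off.mean(axis=0)
    for _ in range(100):
        pos = step(pos, off)
    print("max error from assigned slot:",
          float(np.abs(pos - (pos.mean(axis=0) + off)).max()))
```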

[Thanks, Ted]

Robots learn to march / spell, still not capable of love (video) originally appeared on Engadget on Sat, 07 May 2011 05:02:00 EDT.
