Butler Robot Can Fetch Drinks, Snacks

Meet HERB, a robot from Intel’s research labs that can fetch drinks, grab a pack of chips and sort dishes. HERB, short for Home Exploring Robotic Butler, is a project from Intel’s personal robotics group.

The robot sits on a tricked-out Segway base and has cable-driven arms that make it extremely dexterous. A spinning laser on top of the robot helps generate 3-D data so the robot can identify objects. There’s also a camera to help it “see.”
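
For a sense of what that 3-D data is used for, here’s a rough sketch of one common approach: throw away the laser points that belong to the table, then group what’s left by proximity into candidate objects. Intel hasn’t published HERB’s pipeline, so the thresholds and the clustering scheme below are illustrative assumptions.

```python
# A rough sketch, not HERB's actual pipeline: cluster laser points that sit
# above a known table height into candidate objects. All thresholds are
# illustrative assumptions.
import numpy as np

def find_object_candidates(points: np.ndarray, table_height_m: float,
                           radius_m: float = 0.05) -> list:
    """Group 3-D laser points (Nx3, meters, z up) into candidate objects."""
    above = points[points[:, 2] > table_height_m + 0.01]  # drop the tabletop itself
    clusters, unvisited = [], list(range(len(above)))
    while unvisited:
        cluster = [unvisited.pop()]
        grew = True
        while grew:  # greedily absorb points within radius_m of the cluster
            grew = False
            for i in unvisited[:]:
                if np.min(np.linalg.norm(above[cluster] - above[i], axis=1)) < radius_m:
                    cluster.append(i)
                    unvisited.remove(i)
                    grew = True
        clusters.append(above[cluster])
    return clusters  # each cluster is one candidate object to identify
```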

“It (the robot) looks big but it will fit through most doorways,” says Siddhartha Srinivasa, an Intel researcher who is working on the project. “It’s about a foot longer than the human wingspan.”

Users can tell HERB what they need using an iPhone interface that the team built. There’s also a voice-recognition program in the works, so you’ll be able to just tell the robot out loud what you want it to do.

The HERB project has been in the works for nearly four years. Intel showed the robot’s latest features at its annual research day fest on Wednesday.

HERB is just one of many robotics projects trying to teach machines how to do everyday tasks. Willow Garage, a Menlo Park, California, startup, has a robot called PR2 that is being trained to sort laundry and fold towels.

The idea is to teach robots to handle more than carefully structured, repetitive tasks so they can move beyond the factory floor.

Check out the video of HERB at work. HERB doesn’t move fast, but if you could just sit on the couch and have it bring you a bottle of beer every time, a few seconds’ delay shouldn’t bother you that much.


Photo: HERB/ Priya Ganapati


Intel Researchers Turn Countertops Into Touchscreens

A research project from Intel can turn any surface into a touchscreen. Instead of propping up a tablet or putting a touchscreen computer in your kitchen, picture yourself tapping on the countertop to pull up menus, look up recipes and add items to a shopping list.

“There’s nothing absolutely special about the surface, and it doesn’t matter if your hands are dirty,” says Beverly Harrison, a senior research scientist at Intel. “Our algorithm and a camera set-up can create virtual islands everywhere.”

Intel demoed the project during the company’s annual research-day fest Wednesday to show touchscreens can go beyond computing and become a part of everyday life.

The project uses real-time 3-D object recognition to build a model of almost anything that’s placed on the counter and offer a virtual, touchscreen-based menu. For instance, when you put a slab of meat or a green pepper on the counter, it is identified, and a virtual menu that includes recipes for it is shown.

“The computer in real time builds a model of the color, shape, texture of the objects and runs it against a database to identify it,” says Harrison. “And it requires nothing special to be attached on the steak or the pepper.”
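
To give a flavor of how that lookup might work, here’s a toy sketch: summarize an object’s appearance as a feature vector (a plain color histogram here) and return the closest match in a database. Intel’s system also uses shape and texture cues, and the feature, metric and database layout below are assumptions for illustration.

```python
# A toy sketch of appearance-based lookup, not Intel's actual algorithm.
# The real system also uses 3-D shape and texture; we use only color here.
import numpy as np

def color_histogram(pixels: np.ndarray, bins: int = 8) -> np.ndarray:
    """Normalized per-channel histogram of an Nx3 array of RGB pixels."""
    hist = np.concatenate([
        np.histogram(pixels[:, c], bins=bins, range=(0, 255))[0]
        for c in range(3)
    ]).astype(np.float64)
    return hist / hist.sum()

def identify(pixels: np.ndarray, database: dict) -> str:
    """Return the label whose stored feature is closest to this object's."""
    feature = color_histogram(pixels)
    return min(database, key=lambda label: np.linalg.norm(database[label] - feature))

# Hypothetical usage: the database maps labels to stored features.
# db = {"steak": color_histogram(steak_pixels), "pepper": color_histogram(pepper_pixels)}
# print(identify(unknown_pixels, db))  # -> "steak" or "pepper"
```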

Smartphones have turned touch into a popular user interface. Many consumers are happy to give the BlackBerry thumb a pass and instead swipe and flick their fingers to scroll. New tablets are also likely to make users want to move beyond a physical keyboard and mouse.

But so far, touchscreens have been limited to carefully calibrated pieces of glass encased in the shell of a phone or computer.

Intel researchers say that won’t be the case in the future. An ordinary coffee table in the living room could morph into a touchscreen when you put a finger on it, showing a menu of music and videos to choose from. Or a vanity table in the bathroom could recognize a bottle of pills placed on it and let you manage your medications from there.

Some companies are already trying to expand the use of touchscreens. For instance, Displax, based in Portugal, can turn any surface — flat or curved — into a touch-sensitive display by sticking a thinner-than-paper polymer film onto it.

Intel’s researchers do away with the extra layer. Instead, they have created a rig with two cameras: one to capture the image of the objects, the other to capture depth. The depth camera helps recognize the objects and tell the difference between a hand touching the table and one hovering over it. A pico-projector beams the virtual menus. The cameras and the pico-projector can be combined into a device just a little bigger than your cellphone, says Harrison. Sprinkle a few of these devices in different rooms, point them at tables, and the system is ready to go.
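
The touch-versus-hover decision that the depth camera enables can be sketched in a few lines: if a fingertip’s measured depth is within a small tolerance of the surface’s depth, count it as a touch. The 10mm threshold below is a guess, not a figure from Intel.

```python
# A minimal sketch of the touch-vs-hover test, assuming a downward-facing
# depth camera. The 10 mm threshold is an illustrative guess.
TOUCH_THRESHOLD_MM = 10.0

def classify_fingertip(finger_depth_mm: float, surface_depth_mm: float) -> str:
    """A fingertip closer than the threshold to the surface counts as a touch."""
    gap_mm = surface_depth_mm - finger_depth_mm  # height of the finger above the table
    return "touch" if gap_mm <= TOUCH_THRESHOLD_MM else "hover"

# e.g. classify_fingertip(995.0, 1000.0) -> "touch" (finger 5 mm above the counter)
```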

At that point, the software program that Harrison and her team have written kicks in. The program, which can run on any computer anywhere in the house, helps identify objects accurately and create the virtual menus. Just make a wide sweeping gesture to push the menu off the counter and it disappears. There’s even a virtual drawer that users can pull up to store images and notes.

Harrison says all this will work on almost any surface, including glass, granite and wood.

“The key here is the idea requires no special instrumentation,” she says.

Still, it may be too early to make plans to remodel the kitchen around this new system. The idea is still in the research phase, says Harrison, and it may be years before it makes it to the real world.

Photo: A countertop acts as a touchscreen display.
Priya Ganapati/Wired.com



Single-Serve Takeaway Wine Glasses Intoxicate Britain

Over in Britain, a nation of binge-drinking alcoholics, there’s now yet another way to get a booze-fix. Marks and Spencer, the kindly uncle of national department stores, is selling a single-serve glass of wine.

The glasses, actually recyclable plastic, come pre-filled with 187 ml (6.3 ounces) of Shiraz, Chardonnay or rosé and have a peel-off foil lid. They cost £2.25 each ($3.37), which makes them more expensive than buying the same wine by the bottle (four glasses add up to £9, whereas the bottle is £4.50).

The product was invented by an Englishman named James Nash, and ironically his idea, before being picked up by M&S, was laughed off the UK reality business show Dragons’ Den by its foolish, short-sighted panel.

The idea of single-serve wine could really take off. In-flight beverage service is the obvious market, doing away with the wastefully separate bottle and cup, but picnics for one could also work well. Sitting in the park with a sandwich, a bottle of wine and a glass will draw some stares, even if you aren’t dressed like a wino. But with a cold (plastic) glass of Chardonnay to accompany your smoked salmon bagel, you’ll be the most sophisticated bum in Union Square. Chin-chin!

Wine Innovations product page [Wine Innovations]

Wine-in-a-glass entrepreneur ridiculed in Dragons’ Den toasts M&S success [Daily Mail via Crave]


Impossible-Looking Pedals Push Your Bike Up Hills

An English inventor has come up with a cheap, lightweight power-assist system for bicycles. It is built into a pair of modified pedals and requires no extra hardware. It also seems to be impossible.

I need your help here, Gadget Lab readers. First, I’ll tell you what I know. The kit is called “Fast Forward” and, from the pictures, looks to be a pair of regular pedals with rechargeable batteries and motors inside. Fast Forward was designed by inventor Stephen Britt, who is currently a finalist in the Barclays “Take One Small Step” contest. If he wins, Stephen will receive business funding.

To use them, you just swap them in for the pedals you already have. Here’s Stephen’s pitch:

These replace your standard pedals and provide you with assistance to get you up hills, or carry heavy loads. Each pedal incorporates a motor, gearbox, Li-po batteries and a control board. As you pedal the sensors detect your effort and provide assistance.

To pedal without assistance, simply flip the pedals over. They unclip and slot into a charger for charging, much like with a power tool. When fully developed they will provide a range of 10 miles and peak power of 200W. They will retail for around £200.
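
On paper, the control logic is simple enough; here’s a back-of-the-envelope sketch of a torque-proportional assist loop capped at the claimed 200W peak. The gain and the sensor and motor interfaces are my assumptions, not Fast Forward’s actual firmware.

```python
# A back-of-the-envelope sketch of a torque-proportional assist loop,
# capped at the claimed 200 W peak. The gain is an assumption.
PEAK_POWER_W = 200.0  # claimed peak assist
ASSIST_GAIN = 1.5     # watts of assist per watt of rider effort (assumption)

def assist_power(rider_effort_w: float) -> float:
    """Motor power commanded for a given measured rider effort."""
    return min(ASSIST_GAIN * max(rider_effort_w, 0.0), PEAK_POWER_W)

# e.g. a rider putting out 100 W on a climb gets min(150, 200) = 150 W of help
```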

There’s no doubt that Stephen could build these pedals, but my question is: would they work? Surely the pedals, without toe-straps, would just spin under your feet. Even if you were to firmly cinch your feet in place, would a spinning pedal provide any assistance? It seems to me that the pedal would just try to twist your toes upwards and annoy you, generally acting like the tail wagging the dog.

But although I did just spend ten minutes with my foot in a spare pedal waving my leg around, I’m no mechanic, let alone a physicist. So help me, readers. Could this possibly work? Answers, as always, in the comments.

Fast Forward Cycle Pedals [Barclays via Bicycle Design]



ALT/1977: If Today’s Gadgets Had Been Made in the 70s

What if you could travel back in time to the 1970s? What would you do? That is the question asked by Alex Varanese in his wonderful ALT/1977 project. His answer?

[G]rab all the modern technology I could find, take it to the late 70’s, superficially redesign it all to blend in, start a consumer electronics company to unleash it upon the world, then sit back as I rake in billions, trillions, or even millions of dollars

This fantasy is realized in the form of four period-accurate promotional posters. Above you see an MP3 player clad in wood-effect plastic with a twiddly metal knob, an LED spectrum-analyser and some mysterious “mode” and “set” buttons, all paired up with some giant retro headphones and that trademark of 1970s audio, the quarter-inch jack.

Alex’s gallery also contains a notebook computer (the LapTron 64), a clamshell cellphone (the MobileVoxx) and a handheld gaming console (the Microcade 3000). If that’s not enough for you, there’s an assortment of semi-abstract artworks based on the products. Anyone who grew up in that era will recognize these parodies as absolutely dead-on.

I love these pictures, not just as clever re-imaginings but as actual products. With the exception of the Sega Game Gear-alike console, which I’m not so hot on, I’d buy any one of these products to use today. Especially that amazing Pocket Hi-Fi (tagline: “Like a party in your pocket. But not in a weird way.”)

ALT/1977: WE ARE NOT TIME TRAVELERS [Behance via ]


Prototype Bike Helmet Stinks When Damaged

With a rather ingenious piece of engineering, researchers at the Fraunhofer Institute for Mechanics of Materials have come up with a way to force you to replace a damaged crash helmet: Make it stink.

A bike helmet is designed to absorb any impact meant for your head. Like your head, it will break when given a good enough whack, and also like your head, it won’t really work properly afterwards. The new Fraunhofer design mixes malodorous chemical capsules into the helmet’s shell. When the plastic is damaged, the oils are released and your head starts to smell like a hobo’s crotch.

The use of smelly chemicals to alert us to danger isn’t new: the gas we cook with is odorless, and a smelly additive is mixed in so leaks can be detected. The Fraunhofer researchers haven’t specified the actual aroma they might use, but I favor something rank. If your lid starts to smell like roses, it is a warning easily ignored. If, however, it makes your noggin emit a hum that makes a dog’s breath seem like a fresh spring breeze, then you will be shamed into buying a replacement.

Crash helmet with a useful smell [Physorg via DVICE]

Photo: Fraunhofer IWM


Willow Garage Holds a ‘Graduation Party’ for Its Robots

In an event that made many robot enthusiasts and tech nerds tear up, 11 robots carried flags and waved their arms as they rolled down an aisle as part of their “graduation.”

The 11 PR2 robots come from Menlo Park, California, robotics company Willow Garage. Over the last few months, the robots have been trained for their new lives in research labs worldwide, where they will be used to create applications and solve problems.

The robots, each of which cost $400,000, will be working with 11 research teams whose proposals were chosen in a contest that Willow Garage organized in January.

“Robots can do great things for our economy,” Scott Hassan, founder of Willow Garage, told attendees at the event. “They can change our lives in a big way and these robots are capable of doing it in my lifetime.”

Among the tasks that the robots will be put to are folding towels and doing laundry, learning how drawers and refrigerators open, picking up items scattered on a floor, and developing 3-D perception to perform tasks such as setting a table and emptying a dishwasher.

“Robotics will have a big impact on our products in the future,” says Jan Becker, principal engineer at Bosch Research and Technology Center in Palo Alto. Bosch, which makes automotive parts and home appliances, is one of the places where a newly graduated PR2 robot will go to work. Additional sensors will be added to the PR2, testing its ability to sense the environment it is in.

“Many of our products are going to have autonomy, and PR2 will help us test some of our ideas,” says Becker.

Willow Garage was founded in 2006 with the idea of creating an open-source robotics software platform. The hardware isn’t open, but the company has created open source software to drive the machine. Willow Garage’s Robot Operating System (ROS) originated at Stanford’s Artificial Intelligence Laboratory. ROS is based on Linux and can also work with Windows and Mac PCs.
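
For a flavor of what ROS code looks like, here’s a minimal node written with rospy, ROS’s Python client library. The node and topic names are invented for illustration, and it assumes a standard ROS 1 install with a running master.

```python
# A minimal ROS 1 node sketch using rospy. Node and topic names are
# arbitrary examples; assumes roscore is running.
import rospy
from std_msgs.msg import String

def main():
    rospy.init_node("hello_pr2")  # register this node with the ROS master
    pub = rospy.Publisher("greeting", String, queue_size=10)
    rate = rospy.Rate(1)          # publish once per second
    while not rospy.is_shutdown():
        pub.publish(String(data="hello from a PR2 app"))
        rate.sleep()

if __name__ == "__main__":
    main()
```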

Each PR2 robot has two stereo camera pairs in its head, along with a 5-megapixel camera and a tilting laser range finder. Each of the robot’s forearms has an Ethernet-based wide-angle camera, and the grippers have three-axis accelerometers and pressure-sensor arrays on the fingertips. At the base of the robot is another laser range finder.

The PR2 is powered by two onboard eight-core i7 Xeon servers, 48 GB of memory and a battery system equivalent to 16 laptop batteries. Yet that translates into only about two hours of battery life.

“The robot is dumb as a rock by human standards,” says Keenan Wyrobek, co-director of the personal robotics program at Willow Garage. “But it is very advanced and capable for the tasks it can perform.”

Researchers will get to keep their PR2 robot for two years in order to develop its capabilities. For example, for the last few months, researchers from the University of California Berkeley have been working with a PR2 robot, teaching it to pick up a towel from a pile of laundry, fold it and stack it. The idea is to demonstrate the machine’s ability to perceive and manipulate “deformable objects.”

Other robotics researchers from institutions such as the University of Southern California hope to expand the PR2’s motor skills so it can learn how to pour different kinds of liquid into a cup.

Another plan for one of the robots includes teaching it to work in a collaborative environment with people and other robots. (Let’s hope the robots don’t get into fights.)

It looks like much of the PR2’s training will feel less like research and more like parenting. Now that they’ve graduated from the factory, maybe it’s time to send these robots to daycare?

Check out more photos of the PR2 below.

PR2 says hello to the world. Eric Berger, co-director of the personal robotics program at Willow Garage, introduced the robot at a media event.

One of the 11 PR2 robots moves down the aisle as part of its graduation ceremony.

Each arm of the PR2 has seven degrees of freedom, giving it almost human-like flexibility. The arms can carry up to 3.9 pounds (1.8 kg). Flexible wrists let the PR2 wave, grip objects and rotate its arms at the elbow.

Photos: Priya Ganapati / Wired.com



Video: Flexible Sony Screen Can Be Wrapped Around a Pencil

Forget the iPad, the HP Slate or pretty much any tablet. For true portable big-screen computing we want the roll-up screen that sci-fi has promised us since forever. That dream edges ever closer, and Sony is now helping it along with a flexible display that can be wrapped around a pencil.

The 4.1-inch OLED screen is thin. So thin that it is measured in micrometers. 80μm to be precise: A human hair is a comparatively hefty 100μm.

Sony’s trick was to make the circuitry itself flexible. By marrying the OLED screen with OTFTs (organic thin-film transistors), and using soft organic insulators therein, Sony made a display that shows movies whilst being rolled and stretched. It is the first display ever to manage this. But enough of the science talk. You want to see it in action, right? Prepare to be amazed:

Pretty cool, huh? And actually not as far into the future as you might think: The other trick is the manufacturing process. The organic components can be dissolved in common solvents, from which the screens are printed instead of being assembled. This should bring costs down far enough to be used in e-paper or even screen-equipped RFID tags. Those uses are rather dull, though. How much cooler would it be to read Wired’s brand-new electronic magazine on one single sheet of electronic paper? The future is almost here.

Sony Develops a Rollable OTFT-driven OLED Display that can wrap around a Pencil [Sony via Akihabara News]



Gesture-Based Computing Uses $1 Lycra Gloves


Interacting with your computer by waving your hands may require just a pair of multicolored gloves and a webcam, say two researchers at MIT who have made a breakthrough in gesture-based computing that’s inexpensive and easy to use.

A pair of lycra gloves — with 20 irregularly shaped patches in 10 different colors — held in front of a webcam can generate a unique pattern with every wave of the hand or flex of the finger. That can be matched against a database of gestures and translated into commands for the computer. The gloves can cost just about a dollar to manufacture, say the researchers.

“This gets the 3-D configuration of your hand and your fingers,” says Robert Wang, a graduate student in the computer science and artificial intelligence lab at MIT. “We get how your fingers are flexing.” Wang developed the system with Jovan Popović, an associate professor of electrical engineering and computer science at MIT.

The technology could be used in videogames, where gamers could pick up and move objects using hand gestures, and by engineers and artists to manipulate 3-D models.

“The concept is very strong,” Francis MacDougall, chief technology officer and co-founder of gesture-recognition company GestureTek, told Wired.com. “If you look at the actual analysis technique they are using, it is the same as what Microsoft has done with Project Natal for detecting human body position.” MacDougall isn’t involved with MIT’s research project.

MIT has become a hotbed for researchers working in the area of gestural computing. Last year, an MIT researcher showed a wearable gesture interface called the “SixthSense” that recognizes basic hand movements. Another recent breakthrough showed how to turn an LCD screen into a low-cost, 3-D gestural computing system.

The latest idea is surprisingly simple in its premise. The system hinges on a glove pattern distinctive enough that each gesture can be looked up quickly in a database.

For the design of their multicolored gloves, Wang and Popović tried to restrict the number of colors used so the system could reliably distinguish one color from another in different lighting conditions and reduce errors. The arrangement and shapes of the patches were chosen such that the front and back of the hand would be distinct.

Once the webcam captures an image of the glove, a software program crops out the background, so the glove alone is superimposed on a white background.

The program then reduces the resolution of the cropped image to 40 pixels by 40 pixels. It searches through a database of 40-by-40 digital models of a hand, clad in the distinctive glove, in different positions. Once a match is found, it simply looks up the corresponding hand position.
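
In other words, the lookup stage is a nearest-neighbor search. Here’s a minimal sketch of that step as described; the subsampling method, distance metric and database format are illustrative assumptions rather than the researchers’ exact implementation.

```python
# A minimal sketch of the described lookup: shrink the cropped glove image
# to 40x40 and find its nearest neighbor among pre-rendered posed images.
# Subsampling, metric and database layout are assumptions.
import numpy as np

def downsample(image: np.ndarray, size: int = 40) -> np.ndarray:
    """Reduce an HxWx3 image to size x size by nearest-neighbor subsampling."""
    h, w, _ = image.shape
    rows = (np.arange(size) * h) // size
    cols = (np.arange(size) * w) // size
    return image[rows][:, cols].astype(np.float32)

def nearest_pose(query: np.ndarray, db_images: np.ndarray, db_poses: list):
    """Return the stored hand pose whose 40x40 image best matches the query."""
    q = downsample(query).ravel()
    flat = db_images.reshape(len(db_images), -1).astype(np.float32)
    return db_poses[int(np.argmin(np.linalg.norm(flat - q, axis=1)))]
```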

Since the system doesn’t have to calculate the relative positions of the fingers, palm and back of the hand on the fly, it can be extremely quick, claim the researchers.

And if the video is to be believed, the precision with which the system can gauge gestures, including the flexing of individual fingers, is impressive.

A challenge, though, is having enough processing power and memory so gestures made by a user can be looked up in a database quickly, says MacDougall.

“It takes hundreds of megabytes of pre-recorded posed images for this to work,” he says, “though that’s not so heavy in the computing world anymore.”

Another problem could be getting people to wear the gloves. Let’s face it: No one wants to look like Kramer in a fur coat from an episode of Seinfeld or an extra in the musical Joseph and the Amazing Technicolor Dreamcoat.

MacDougall says the pattern on the gloves can be tweaked to make them less obvious.

“If you want to make it more attractive, you could hide the patterns in a glove using retro-reflective material,” he says. “That way you could [create] differentiable patterns that wouldn’t be visible to the naked eye but a camera’s eye could see it.”

Wang and Popović aren’t letting issues like fashion dictate their research. They say they are working on designs for similarly patterned shirts.

Photo: Jason Dorfman/CSAIL
Video: Robert Y. Wang/Jovan Popović



Anybots Robot Will Go to the Office for You


Robots have replaced humans on assembly lines, battlefields, space missions and rescue operations. Now how about doing something useful, like sitting through endless meetings for you?

Meet the Anybots QB, a telepresence robot that can represent you in the office by sitting in conference rooms, going to meetings and rolling about through the cubicle farm. The whole time it does so, it displays a live webcam video of your face, while transmitting to you a live video and audio stream of whatever it’s looking at.

“The QB is an extension of you,” Bob Christopher, chief operating officer of Anybots, told Wired.com. “It removes the barriers between people and work so people can teleport themselves to the office space.”

Christopher was formerly the chief executive officer of Ugobe, which made the ill-fated Pleo robotic dinosaur toys. Ugobe closed its doors last year, having failed to make a commercial success of its eerily lifelike toys.

QB won’t replace video conferencing, says Christopher, but it’s a way to look over the shoulder of your colleagues and employees without actually getting into the office. The robot can be manipulated by a user at home or any other location using just a web browser, and can transmit its master’s voice and video.

Think of it as a self-propelled Skype-cam on a stick.

A device with Segway-like balancing properties, the QB has two bug-like eyes that give it an aesthetic similar to Pixar’s Wall-E. The cameras (and screen) are mounted atop an adjustable pole, putting them at approximately eye level with your coworkers. QB has eight hours of battery life, supports 802.11g Wi-Fi, comes with a 5-megapixel video camera and has a top speed of 3.5 miles per hour. A 320 x 240 LCD screen on QB displays videos and photos, and acts as a control panel.

The $15,000 robot will be available in the fall, says Anybots.

Finding ways to make telecommuting easier for office workers, or helping teams spread across different locations work together, has been a major area of research and product development in robotics. Research firm Gartner estimates the video-conferencing market could grow 17.8 percent annually between 2008 and 2013, rising from $3.8 billion to $8.6 billion.

Anybots isn’t the only company to try mixing telepresence and robots. Companies like iRobot and WowWee have tried to capture some part of that business. iRobot announced the ConnectR, a Roomba with a video camera, while WowWee’s Rovio is a little three-wheeled webcam bot. The ConnectR was quietly killed during the Consumer Electronics Show last year, while Rovio lives on. Willow Garage, the Menlo Park robotics company, has also created a telepresence robot called Texai, though that’s not on sale yet.

QB offers a similar experience but makes it more polished — and not so close to the ground. The robot weighs about 35 pounds and its neck can go from 3 feet to 5 feet, 9 inches. And it’s easy to use, says Christopher.

Open up a web browser, log in, and with just the Up and Down controls on your computer keyboard, you can move the QB around.
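
In spirit, that loop is just “key press becomes velocity command.” Here’s a hypothetical sketch of the mapping; the endpoint URL, command schema and speed values are invented, since Anybots hasn’t published its control API.

```python
# A hypothetical teleop sketch: map keys to velocity commands and post them
# to the robot. URL, schema and speeds are invented for illustration.
import requests

ROBOT_URL = "https://qb.example.com/api/drive"  # hypothetical endpoint

KEY_TO_COMMAND = {
    "up":    {"linear": 0.5,  "angular": 0.0},  # roll forward
    "down":  {"linear": -0.5, "angular": 0.0},  # roll backward
    "left":  {"linear": 0.0,  "angular": 0.5},  # turn left in place
    "right": {"linear": 0.0,  "angular": -0.5}, # turn right in place
}

def send_drive_command(key: str) -> None:
    """Translate a key press into a velocity command and send it to the QB."""
    command = KEY_TO_COMMAND.get(key, {"linear": 0.0, "angular": 0.0})  # default: stop
    requests.post(ROBOT_URL, json=command, timeout=1.0)
```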

The QB has an Intel Core 2 Duo processor and will soon support 3G networks. And because the robot is not tied to one user, it can be used by different employees logging in from an external location, says Christopher.

“Put a QB in the office and anyone who’s not there can take the robot and move it over to someone else’s desk,” he says. “After the first few minutes, people forget they are talking to a robot.”

That may be possible, but it is difficult to imagine that most companies will want to purchase many of these robots, no matter what the advantages are. At $15,000 apiece, they don’t come cheap.

Check out the video below to see Anybots’ QB at work:
