Intel Researchers Turn Counter Tops Into Touchscreens

A research project from Intel can turn any surface into a touchscreen. Instead of propping up a tablet or putting a touchscreen computer in your kitchen, picture yourself tapping on the counter top to pull up menus, look up recipes and add items to a shopping list.

“There’s nothing absolutely special about the surface, and it doesn’t matter if your hands are dirty,” says Beverly Harrison, a senior research scientist at Intel. “Our algorithm and a camera set-up can create virtual islands everywhere.”

Intel demoed the project during the company’s annual research-day fest Wednesday to show touchscreens can go beyond computing and become a part of everyday life.

The project uses real-time 3-D object recognition to build a model of almost anything that’s placed on the counter and offer a virtual, touchscreen-based menu. For instance, when you put a slab of meat or a green pepper on the counter, they are identified, and a virtual menu that includes recipes for both is shown.

“The computer in real time builds a model of the color, shape, texture of the objects and runs it against a database to identify it,” says Harrison. “And it requires nothing special to be attached on the steak or the pepper.”
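
Harrison’s description amounts to a nearest-neighbor lookup: compute a compact feature “fingerprint” for whatever the camera segments out on the counter, then find the closest entry in a database of known items. The Python sketch below is only a minimal, hypothetical illustration of that idea, using a color histogram as the fingerprint; the function names and the toy two-item database are invented for this example, and Intel’s system also models shape and texture.

import numpy as np

# Hypothetical sketch: identify a segmented object by comparing a simple
# color-histogram "fingerprint" against a small database of known items.
# A real system, as Harrison describes, would also use shape and texture.

def color_histogram(rgb_pixels, bins=8):
    """Normalized 3-D color histogram from an (N, 3) array of RGB pixel values."""
    hist, _ = np.histogramdd(rgb_pixels, bins=(bins, bins, bins),
                             range=((0, 256), (0, 256), (0, 256)))
    return hist.ravel() / hist.sum()

def identify(object_pixels, database):
    """Return the label of the database entry whose histogram is closest to the query."""
    query = color_histogram(object_pixels)
    best_label, best_dist = None, float("inf")
    for label, reference in database.items():
        dist = np.linalg.norm(query - reference)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Toy database: random pixel patches stand in for real training images.
database = {
    "steak": color_histogram(np.random.randint(100, 180, (500, 3))),
    "green pepper": color_histogram(np.random.randint(20, 140, (500, 3))),
}
print(identify(np.random.randint(100, 180, (500, 3)), database))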

Smartphones have turned touch into a popular user interface. Many consumers are happy to give the BlackBerry thumb a pass and instead swipe and flick their finger to scroll. New tablets are also likely to make users want to move beyond a physical keyboard and mouse.

But so far, touchscreens have been limited to carefully calibrated pieces of glass encased in the shell of a phone or computer.

Intel researchers say that won’t be the case in the future. An ordinary coffee table in the living room could morph into a touchscreen when you put a finger on it, and show a menu of music and videos to choose from. Or a vanity table in the bathroom could recognize a bottle of pills placed on it and let you manage your medications from there.

Some companies are trying to expand the use of touchscreens. For instance, Displax, based in Portugal, can turn any surface — flat or curved — into a touch-sensitive display by sticking a thinner-than-paper polymer film on that surface to make it interactive.

Intel’s research labs are trying to do away with that extra layer. Instead, researchers there have created a rig with two cameras, one to capture the image of the objects and the other to capture depth. The depth camera helps the system recognize objects and tell the difference between a hand touching the table and one hovering over it. A pico-projector beams the virtual menus onto the surface. The cameras and the pico-projector can be combined into devices just a little bigger than your cellphone, says Harrison. Sprinkle a few of these in different rooms, point them at tables, and the system is ready to go.
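
The touch-versus-hover distinction the depth camera enables can be approximated by comparing a fingertip’s depth reading against a depth map of the empty surface captured beforehand. The sketch below shows that comparison under those assumptions; the 10 mm threshold and the frame dimensions are illustrative values, not figures from Intel’s rig.

import numpy as np

# Hypothetical sketch: decide whether a fingertip is touching the surface or
# hovering above it by comparing its depth reading against a background depth
# map captured while the surface was empty. The threshold is illustrative.

TOUCH_THRESHOLD_MM = 10  # a fingertip within 10 mm of the surface counts as a touch

def calibrate(depth_frames):
    """Average several depth frames of the empty surface into a background map."""
    return np.mean(np.stack(depth_frames), axis=0)

def classify_fingertip(depth_frame, background, x, y):
    """Return 'touch' or 'hover' for the fingertip detected at pixel (x, y)."""
    height_above_surface = background[y, x] - depth_frame[y, x]
    return "touch" if height_above_surface < TOUCH_THRESHOLD_MM else "hover"

# Toy example: a flat surface 800 mm from the camera, a finger 5 mm above it.
background = calibrate([np.full((480, 640), 800.0) for _ in range(10)])
frame = background.copy()
frame[240, 320] = 795.0  # depth reading at the fingertip pixel
print(classify_fingertip(frame, background, 320, 240))  # -> "touch"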

At that point, the software program that Harrison and her team have written kicks in. The program, which can run on any computer anywhere in the house, helps identify objects accurately and create the virtual menus. Just make a wide sweeping gesture to push the menu off the counter and it disappears. There’s even a virtual drawer that users can pull up to store images and notes.

Harrison says all this will work on almost any surface, including glass, granite and wood.

“The key here is the idea requires no special instrumentation,” she says.

Still, it may be too early to make plans to remodel the kitchen to include this new system. The idea is still in the research phase, says Harrison, and it may be years before it makes it to the real world.

Photo: A counter top acts as a touchscreen display.
Priya Ganapati/Wired.com

See Also:


Toshiba AirSwing UI puts you on the screen with your data

We’ve seen a Minority Report-esque interface or two hundred by this point, but Toshiba’s AirSwing really caught our attention. Using little more than a webcam and some software, this bad boy places a semi-transparent image of the operator on the display — all the easier to maneuver through the menus. And according to Toshiba, that software only utilizes about three percent of a 400MHz ARM 11 CPU — meaning that you have plenty of processor left for running your pre-crime diagnostics. There is no telling when something like this might become commercially available, but the company plans to bundle it in commercial displays for malls and the like. Video after the break.
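
The overlay AirSwing describes comes down to alpha-blending the webcam frame of the operator over the rendered interface so users can see where they are relative to the on-screen controls. The sketch below illustrates that blend with an arbitrary opacity value; it is a generic illustration of the technique, not Toshiba’s software.

import numpy as np

# Hypothetical sketch: blend a semi-transparent webcam frame of the operator
# over the rendered menu image. The opacity value is illustrative.

OPERATOR_OPACITY = 0.35

def overlay_operator(menu_frame, webcam_frame, alpha=OPERATOR_OPACITY):
    """Alpha-blend the operator's image over the UI frame (both uint8 RGB arrays)."""
    blended = ((1.0 - alpha) * menu_frame.astype(np.float32)
               + alpha * webcam_frame.astype(np.float32))
    return blended.astype(np.uint8)

# Toy frames: a dark menu background and a bright webcam image.
menu = np.full((480, 640, 3), 30, dtype=np.uint8)
webcam = np.full((480, 640, 3), 200, dtype=np.uint8)
print(overlay_operator(menu, webcam)[0, 0])  # a pixel value between the two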


Gesture-Based Computing Uses $1 Lycra Gloves


Interacting with your computer by waving your hands may require just a pair of multicolored gloves and a webcam, say two researchers at MIT who have made a breakthrough in gesture-based computing that’s inexpensive and easy to use.

A pair of lycra gloves — with 20 irregularly shaped patches in 10 different colors — held in front of a webcam can generate a unique pattern with every wave of the hand or flex of the finger. That can be matched against a database of gestures and translated into commands for the computer. The gloves can cost just about a dollar to manufacture, say the researchers.

“This gets the 3-D configuration of your hand and your fingers,” says Robert Wang, a graduate student in the computer science and artificial intelligence lab at MIT. “We get how your fingers are flexing.” Wang developed the system with Jovan Popović, an associate professor of electrical engineering and computer science at MIT.

The technology could be used in videogames, where gamers could pick up and move objects using hand gestures, and by engineers and artists to manipulate 3-D models.

“The concept is very strong,” Francis MacDougall, chief technology officer and co-founder of gesture-recognition company GestureTek, told Wired.com. “If you look at the actual analysis technique they are using, it is the same as what Microsoft has done with Project Natal for detecting human body position.” MacDougall isn’t involved with MIT’s research project.

MIT has become a hotbed for researchers working in the area of gestural computing. Last year, an MIT researcher showed a wearable gesture interface called the “SixthSense” that recognizes basic hand movements. Another recent breakthrough showed how to turn an LCD screen into a low-cost, 3-D gestural computing system.

The latest idea is surprisingly simple in its premise. The system hinges on using a pattern distinctive enough that each gesture can be looked up quickly in a database.

For the design of their multicolored gloves, Wang and Popović tried to restrict the number of colors used so the system could reliably distinguish one color from another in different lighting conditions and reduce errors. The arrangement and shapes of the patches were chosen such that the front and back of the hand would be distinct.

Once the webcam captures an image of the glove, a software program crops out the background, so the glove alone is superimposed on a white background.

The program then reduces the resolution of the cropped image to 40 pixels by 40 pixels. It searches through a database of 40-by-40-pixel digital models of a hand, clad in the distinctive glove, in different positions. Once a match is found, the system simply looks up the corresponding hand position.

Since the system doesn’t have to calculate the relative positions of the fingers, palm and back of the hand on the fly, it can be extremely quick, claim the researchers.
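
The pipeline described above (crop the glove, shrink the image to 40 by 40 pixels, then retrieve the nearest stored pose) can be sketched as a brute-force nearest-neighbor search. The code below is a simplified, hypothetical illustration of that lookup; the database here is random data standing in for the researchers’ glove models.

import numpy as np

# Hypothetical sketch of the lookup described above: a cropped glove image is
# shrunk to 40 x 40 pixels and matched against a database of stored thumbnails,
# each tagged with the hand pose that produced it. The data here is random.

SIZE = 40

def downsample(image, size=SIZE):
    """Block-average a square grayscale image down to size x size pixels."""
    block = image.shape[0] // size
    return image[:block * size, :block * size].reshape(
        size, block, size, block).mean(axis=(1, 3))

def lookup_pose(cropped_glove_image, thumbnails, poses):
    """Find the stored thumbnail closest to the query and return its hand pose."""
    query = downsample(cropped_glove_image).ravel()
    distances = np.linalg.norm(thumbnails - query, axis=1)
    return poses[int(np.argmin(distances))]

# Toy database: 1,000 random thumbnails standing in for stored glove poses.
thumbnails = np.random.rand(1000, SIZE * SIZE)
poses = [f"pose_{i}" for i in range(1000)]
print(lookup_pose(np.random.rand(400, 400), thumbnails, poses))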

And if the video is to be believed, the precision with which the system can gauge gestures, including the flexing of individual fingers, is impressive.

A challenge, though, is having enough processing power and memory so gestures made by a user can be looked up in a database quickly, says MacDougall.

“It takes hundreds of megabytes of pre-recorded posed images for this to work,” he says, “though that’s not so heavy in the computing world anymore.”

Another problem could be getting people to wear the gloves. Let’s face it: No one wants to look like Kramer in a fur coat from an episode of Seinfeld or an extra in the musical Joseph and the Amazing Technicolor Dreamcoat.

MacDougall says the pattern on the gloves can be tweaked to make them less obvious.

“If you want to make it more attractive, you could hide the patterns in a glove using retro-reflective material,” he says. “That way you could [create] differentiable patterns that wouldn’t be visible to the naked eye but a camera’s eye could see it.”

Wang and Popović aren’t letting issues like fashion dictate their research. They say they are working on a design for similarly patterned shirts.

Photo: Jason Dorfman/CSAIL
Video: Robert Y. Wang/Jovan Popović

See Also:


Apple applies for ‘disappearing button’ patent

You know that little sleep indicator light on the front of your new MacBook Pro — the one that simply disappears when your notebook is wide awake? Apple wants to do that for buttons, too. Cupertino’s latest patent application is for pressure-sensitive, capacitive touchscreen materials it could build right into the surface of its aluminum-clad devices, and identify with laser-cut, micro-perforated holes that let light shine from within. According to the filing, the technology could potentially be used to eliminate existing buttons in favor of a smooth, solid slab, and / or integrate new ones into surfaces that weren’t previously considered for use. Engineers imagine light-up controls on a laptop’s lid that could be used while closed for things like USB charging and media playback, and local heat and sound sensors that selectively light up interface opportunities when users are in close proximity. Not bad, Apple. As long as you let us keep our nice, springy keyboards, we’re all for revolutionizing the rest of modern input.


Japan plans mind-reading robots and brain interface devices ‘by 2020’

Our grandparents did warn us that laziness would get us in trouble. The Japanese government and private sector are, according to the Nikkei, all set to begin work on a collaborative new project to develop thought-controlled gadgets, devices … and robots. The aim is to produce brain-to-computer interfaces that would allow the ability to change channels or pump out texts just with your almighty brain power, while also facilitating artificial intelligence that would be capable of detecting when you’re hungry, cold, or in need of assistance. Manufacturing giants Toyota, Honda and Hitachi get name-dropped as potential participants in this 10-year plan, though we wonder if any of them will have the sense to ask what happens when an ultra-precise and emotionless bot is given both intelligence and mind-reading powers. Would it really stick to dunking biscuits in our tea, or would it prefer something a little more exciting?


Synaptics extends multitouch Gesture Suite to Linux, Chrome OS included

Well, it had to happen at some point. After eons of watching Mac OS and Windows users swiping away nonchalantly on their touchpads, Linux laptop buyers can now also join the multitouch fray. Synaptics has announced official Gesture Suite support for a wide range of Linux-based OS flavors — Chrome OS, Fedora, Ubuntu, RedFlag, SuSE, and Xandros get name-dropped in the press release — which will all benefit from its set of multi-fingered touch and swipe responses. The infamous pinch-to-zoom is quite naturally included in the Suite, which will come bundled with new installations of those operating systems. We’re not seeing any mention of a downloadable update as yet, but we imagine that’ll be corrected in due course, whether by the company itself or the resourceful Linux community. Full PR after the break.


Microsoft seeking patent for Windows Phone 7 Series panoramic GUI

The US Patent and Trademark Office has today made public a Microsoft patent application (serial no. 240,729) related to the graphical user interface found on the hotly anticipated Windows Phone 7 Series mobile OS. Filed in September 2008, this application describes a “contiguous background” that extends beyond the dimensions of the screen (either vertically or horizontally, but not both) with anchored “mixed-media” elements being littered atop it — all of which is to be served on a “media-playing device.” That should sound pretty familiar, given that it’s the central navigational concept of both Windows Phone 7 and the Zune HD, and as such it makes a lot of sense for Microsoft to seek to legally protect its uniqueness. Before you start wondering about potential conflicts with other UIs, take note that this requires a continuous graphical background rather than a tiled or repeating image, plus space-orientating graphical elements, which should make it sufficiently nuanced to avoid any more patently unnecessary squabbles should Microsoft’s claims be validated by the USPTO.


Omnimo: desktop Windows given fashion makeover with Phone 7 Series flair

Can’t wait for Windows Phone 7 Series, but can’t hack the emulator, either? Don’t lose hope, Windows junkies — you can still bring some semblance of WP7S order into your life with this Metro UI-inspired desktop HUD. Based on the open-source desktop customization platform Rainmeter, the “Omnimo UI” will overlay your desktop with a minimalist, tiled interface not unlike the one you’ve been drooling over for weeks, with live hooks into many useful services (including Gmail, iTunes, Steam, Twitter and SpeedFan) as well as the usual widgets and a host of program shortcuts. The best news of all? It’s available now for all versions of Windows since XP, completely free of charge; simply follow the source links or flit over to Lifehacker, where good folks will teach you how it’s done.


A Closer Look at Sony’s New Skin for Android Phones


Sony Ericsson’s new Android-based phone interface, like those from other cellphone manufacturers, integrates Facebook, Flickr, Twitter and other social networking services into one unified portal on your portable. The difference is that Sony Ericsson’s interface — UXP, formerly known as Rachael — actually looks useful.

The company plans to launch a slew of new Android-based phones this year. Top of the list is the Xperia X10 — which confusingly carries the same codename that UXP used to have: Rachael. It’s a device with a 4-inch touchscreen, a 1-GHz Snapdragon processor and an 8.1-megapixel camera that will be available this quarter. The company will also introduce the Mini, a compact phone with a 2.6-inch display that will be available in a touchscreen-only version as well as one with a slide-out keyboard.

But it’s UXP that forms the heart of these phones’ experience. Sony Ericsson has been working on the UXP interface for more than two years, the company says.

“We have done extensive skinning of the Android platform because we really wanted to make it a bespoke experience,” says George Arriola, head of user experience for Sony Ericsson.

Sony’s UXP interface attempts to do the same thing as rivals like Motorola’s MotoBLUR: namely, aggregate social networking feeds such as Facebook and Twitter into one stream, integrate that data with your phone address book and contacts, and personalize the multimedia experience.

“We took a very sophisticated PlayStation middleware and shrunk it to fit the Android OS,” says Arriola.

Palm was the first of the smartphone makers to kick off the trend of integrating social media updates and contacts with the launch of the Palm Pre, though the Pre was based on Palm’s own operating system webOS, not Android. But the Android phones launched since then have tried to follow the path blazed by Palm.

Motorola has the MotoBlur interface that’s now a part of most of its phones, including the Cliq, Backflip and Devour. HTC has introduced Sense, its custom UI that’s available on phones such as the HTC Hero and upcoming models including the Legend and Desire.

But Sony’s UXP interface is the most visually attractive implementation that I have seen so far.


At the heart of Sony’s experience is a widget called Timescape. Timescape collects social networking feeds and presents them in a card-like view. A bar at the bottom of the screen has little icons that let users filter the information stream by network, such as Facebook, Twitter or Flickr.
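
Conceptually, the Timescape stream is a merge of per-network feeds into a single time-ordered list, with each post tagged by its source so the filter bar can narrow the view. The sketch below models that behavior with invented data structures and sample posts; it is not Sony Ericsson’s code.

from dataclasses import dataclass
from datetime import datetime

# Hypothetical sketch: merge posts from several networks into one newest-first
# stream, tagged by source so the stream can be filtered per network.

@dataclass
class Post:
    network: str      # "facebook", "twitter", "flickr", ...
    author: str
    text: str
    timestamp: datetime

def aggregate(*feeds):
    """Merge any number of per-network feeds into one newest-first stream."""
    merged = [post for feed in feeds for post in feed]
    return sorted(merged, key=lambda p: p.timestamp, reverse=True)

def filter_by_network(stream, network):
    """Keep only posts from one network, as the filter bar does."""
    return [post for post in stream if post.network == network]

twitter = [Post("twitter", "alice", "Trying out the X10", datetime(2010, 3, 1, 9, 0))]
facebook = [Post("facebook", "bob", "New phone day", datetime(2010, 3, 1, 8, 30))]
stream = aggregate(twitter, facebook)
print([p.text for p in filter_by_network(stream, "twitter")])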

The phone also updates the address book with a contact’s latest social networking update. That means if you click on a name in your address book, you can see their last social-feed post and use it as a reference point while making the call.

What makes this experience slick is the way the cards rain down on the screen, offering an almost 3-D–like effect as they scroll past. Clicking on one of the cards pulls up the contact and their status update.

Rather than contributing to information clutter, Sony’s attempt to jazz up the stream with better visual effects actually does make it easier to handle.

The UXP interface also introduces a concept called “infinite pivot” — an infinity-shaped icon that helps you drill deeper and pull up related views.


Sony is also trying to offer a better experience for music, video and photos. The widget that controls this is called Mediascape. Click on the Mediascape icon and you get three options: My Music, My Videos and My Photos.

Music and videos are divided into recently played, recently added and favorites. There’s also access to PlayNow, Sony Ericsson’s music-downloads service.

A recommendation engine can suggest other artists or songs based on the music preferences of a user. Clicking on the infinite-pivot icon next to an artist’s name in music and videos offers suggestions and even searches the web.

And in a bid to keep the custom look throughout the phone, Sony redesigned the interfaces for services such as the phone dialer, calendar and alarm, says Arriola.

Overall, Sony Ericsson’s UXP skin for Android is less confusing than the MotoBlur interface and more polished than the HTC Sense UI; it is a snappy, sophisticated treat. It works, though, only if you buy into the premise that instead of checking Facebook and Twitter when you want to (as on the iPhone), you would rather have these services streamed and updated to your phone constantly.

Now if only they could get U.S. wireless carriers to offer Sony Ericsson phones on contract — and at prices slim enough to match the hardware.

Check out the candid photos of the Sony UXP interface on the Xperia X10 phone below.


Google Adds Gesture Search to Android Phones

Google has added a sweet little extra that’s likely to make many Android users happy. The company is offering a new app called Gesture Search that lets users search their phone just by drawing letters on the touchscreen.

Open up the app, scrawl a letter such as ‘n’, and it will search through phone contacts, bookmarks, applications and music to find everything that begins with that letter.

To refine your search, you can just draw another letter, and the search results will shift accordingly.

To wipe off a letter or just start over, you can draw a horizontal line at the bottom of the screen. Drawing from right to left deletes the last letter of the query, and drawing from left to right wipes out the entire query, says Google.
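
The refinement behavior described above is essentially an incremental prefix filter over the phone’s contacts, bookmarks, applications and music, plus two stroke gestures that edit the query. The sketch below models that interaction with a made-up item list; it is an illustration of the behavior, not the Gesture Search implementation.

# Hypothetical sketch of the query refinement described above: each recognized
# letter narrows the results, a right-to-left stroke removes the last letter,
# and a left-to-right stroke clears the query. The item list is made up.

ITEMS = ["Nadia (contact)", "Navigation (app)", "Neil (contact)",
         "Maps (app)", "Music (app)"]

class GestureQuery:
    def __init__(self, items):
        self.items = items
        self.query = ""

    def draw_letter(self, letter):
        self.query += letter.lower()
        return self.results()

    def stroke_right_to_left(self):
        """Delete the last letter of the query."""
        self.query = self.query[:-1]
        return self.results()

    def stroke_left_to_right(self):
        """Wipe out the entire query."""
        self.query = ""
        return self.results()

    def results(self):
        return [item for item in self.items
                if item.lower().startswith(self.query)]

q = GestureQuery(ITEMS)
print(q.draw_letter("n"))        # Nadia, Navigation, Neil
print(q.draw_letter("a"))        # Nadia, Navigation
print(q.stroke_left_to_right())  # everything again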

Gesture Search recognizes both lower case and upper case letters and is available for free.

Google says the search is “fast and fun to use.” But be warned: The app is available only for phones that run version 2.0 or higher of the Android operating system. That means many of Motorola’s latest phones, including the newly released Backflip and Devour, as well as phones released late last year, such as the HTC Droid Eris, won’t support Gesture Search.

Gesture Search isn’t a game-changing idea but it’s a neat service from a company that still does the best search.

But because it is a Google Labs project, and thus still in the beta stages, Gesture Search is not available outside the U.S.

Photo: Google