Elliptic Labs is back on the scene with another demo of its touchless UI. This time ’round the company’s teamed up with Opera and presents us with a much more polished affair, not to mention a couple of technical details. According to CEO Stian Aldrin, the device is based on ultrasound, tracks the hand itself (no reflector or sensor necessary), has a range of one foot, and has been designed to be either embedded in any electronic device (including a cellphone) or to connect to devices via USB. The company’s current demo shows the technology being used to flip through photos in an Opera widget. Sure, a couple of simple one-gesture commands aren’t exactly “pulling out all the stops” as far as proofs of concept go, but we’re looking forward to seeing what this company comes up with in the future. Peep for yourself after the break.
The sound of a fingernail raking across a table or a board may be enough to drive most people crazy. But get past that annoyance and it could become a way to answer your phone, silence a call or turn up the volume.
Scratch Input, a computer input technique developed by researchers at the Human-Computer Interaction Institute at Carnegie Mellon University, uses the sound produced when a fingernail is dragged over the surface of any textured material such as wood, fabric or wall paint. The technology was demonstrated at the Siggraph graphics conference this year.
“It’s kind of a crazy idea but a simple one,” says Chris Harrison, one of the researchers on the project. “If you have a cellphone in your pocket and want to silence an incoming call, you don’t have to pull it out of your pocket. You could just drag your fingernail on your jeans.”
As researchers study how people can interact in simpler and more innovative ways with computers and gadgets, going beyond the traditional keyboard, mouse and keypad has become important. Earlier this year, Harrison and his team demonstrated a touchscreen where pop-up buttons and keypads can dynamically appear and disappear. That allows the user to experience the physical feel of buttons on a touchscreen.
Scratch Input is another way to explore how we can interact with devices, says Harrison. Harrison, along with colleague Julia Schwarz and his professor, Scott Hudson, started working on the idea a year ago. Scratch Input works with almost any kind of surface except glass and a few other materials that are extremely smooth.
“With this we can start to think of every flat surface as a potential input area,” says Daniel Wigdor, user experience architect at Microsoft and curator of the emerging technology demos at Siggraph. “Imagine a cellphone with a mini projector. You can now turn an entire surface into a screen for the projector and use the surface to control it.”
Scratch Input works by isolating and identifying the sound of a fingernail dragging on an area.
“All the sound happening in the environment like people putting coffee cups on the table, cars going by or children screaming, we know what frequencies they are in,” says Harrison.
A fingernail on a surface produces frequencies between 6,000 Hz and 13,000 Hz. Compare that to the human voice, which typically falls in the range of 90 Hz to 300 Hz, or the hum of a refrigerator compressor or air conditioner, which sits around 50 or 60 Hz.
“It makes it easy for us to throw away all the other acoustic information and just listen to what your nail sounds like,” says Harrison.
Harrison and his team used that principle to rig up a system for Scratch Input. They attached a modified stethoscope to a microphone that converts the sound into an electrical signal. The signal is amplified and fed into a computer through the audio-input jack.
“If mass produced, this sensor could cost less than a dollar,” says Harrison.
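The band-isolation trick Harrison describes can be sketched as a simple spectral-energy check: transform each audio frame, then ask whether most of the energy sits in the 6–13 kHz fingernail band rather than down where voices and appliance hum live. This is only a minimal illustration of the idea, not the researchers' actual implementation; the sample rate, frame size, threshold, and function names here are all assumptions.

```python
import numpy as np

SAMPLE_RATE = 44100            # assumed mono capture rate
SCRATCH_BAND = (6000, 13000)   # fingernail-on-texture band reported by the researchers

def band_energy_ratio(frame, band=SCRATCH_BAND, rate=SAMPLE_RATE):
    """Fraction of a frame's spectral energy that falls inside the scratch band."""
    spectrum = np.abs(np.fft.rfft(frame)) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / rate)
    total = spectrum.sum() + 1e-12  # avoid divide-by-zero on silence
    in_band = spectrum[(freqs >= band[0]) & (freqs <= band[1])].sum()
    return in_band / total

def is_scratch(frame, threshold=0.6):
    """Flag a frame as a scratch when most of its energy sits in the 6-13 kHz band."""
    return band_energy_ratio(frame) > threshold

# Synthetic check: a 9 kHz tone (inside the band) vs. a 200 Hz hum (voice range)
t = np.arange(2048) / SAMPLE_RATE
scratch_like = np.sin(2 * np.pi * 9000 * t)
voice_like = np.sin(2 * np.pi * 200 * t)
print(is_scratch(scratch_like), is_scratch(voice_like))
```

Because coffee cups, traffic, and conversation all live well below 6 kHz, a threshold on this single ratio is enough to "throw away all the other acoustic information," as Harrison puts it.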
Scratch Input also supports simple gesture recognition. Tracing the letter ‘S,’ for instance, produces an acoustic imprint that the system can be trained to identify. The idea has its limitations: many letters that are written differently sound very similar, such as M, W, V, L, X or T, and Scratch Input cannot accurately distinguish between these gestures. Still, Harrison says the system can respond with about 90 percent accuracy.
Another problem is that the system cannot determine the spatial location of the input, says Wigdor. “For instance, with volume control, it can hear your finger spin in the appropriate gesture but the system can’t see it so sometimes it does not have enough information to react.”
Despite the limitations, the technology holds enough promise to make it into the hands of consumers, says Wigdor. “It is exciting because it is so low cost,” he says. “This idea has the potential to go beyond just a research project.”
While the world patiently awaits the release of the first Windows Mobile 6.5 device, it seems like the devs behind the software are warming to the fact that folks love those touchscreens. While existing versions of WinMo — not to mention early builds of WinMo 6.5 — have focused on switching between screens via clickable tabs, a new ROM pictured over at PPCGeeks shows a subtle but significant change. If you’ll notice, the screen on the right would prefer that you swipe left or right to get from ‘Version’ to ‘Copyrights’ or ‘Device ID,’ which should absolutely delight fans of the OS who also prefer touchscreen-based phones. Now, if only we could get Microsoft to push this stuff out onto a shipping handset, we’d really have a reason to cheer.
Okay, you know the drill by now: just because it’s in a patent doesn’t mean it’s happening anytime soon, if ever. With that said, we’d love to see what Nokia had in mind when they concocted this one. As Unwired View recently unearthed, the Finnish phone maker has drawn up a design doc / patent application for comfortable, stretchable material that fits over your skin and is used for device interaction. Gestures and stretches are detected and signaled to nearby computers, phones, or, interestingly enough, “near-eye displays” — sounds like we’re getting into a bit of virtual / augmented reality territory here — and the bands are also tailored to provide feedback via vibration. Again, don’t hold your breath on seeing this come to fruition at any point in the near (or even distant) future, but still, we know what you’re thinking: Nokia’s gonna have to think of a ton of kooky color descriptions to accentuate any future lineup of input wristbands / fingerbands.
Always on the lookout for bigger and better ways to faux-scratch a record with your PC, these students at Northeastern University have developed a human-computer interface that utilizes copper pads and our beloved theory of electrostatics. This little devil is able to track the position of a user’s hand in three dimensions, without attaching markers to the body or requiring the user to hold some sort of controller. We can think of a couple theremin players that would love to get their hands on one of these things (Mike Love, we’re looking at you). But don’t take our word for it — peep the video below to groove along with these dudes as they literally rock the (virtual) bells, play some organ, and even do a little fingerpainting.
With NAB just about to get started in Las Vegas, CoolCameraGear is getting out ahead of the crowd with a newfangled adapter sure to please those who find themselves offloading gobs of RED footage. The R2E LEMO to eSATA cable essentially takes the burden away from your FireWire 800, FireWire 400 or USB 2.0 bus by enabling bits and bytes to flow over eSATA. RED camera users simply plug in their RED-Drive or RED-RAM using the original power adapter, then plug the LEMO end of the R2E cable into the drive and the other end into a standard eSATA port. Boom. Just like that, RED owners have instant access to eSATA transfers. For those unaware, eSATA support on camcorders is still a rarity, though the benefits are obvious for pros shuffling through multiple takes. Interested consumers can check this one out when the CoolCameraGear website goes live on April 20th for $230.
Looking to have the best of both worlds in terms of virtual and physical interfacing, Media Computing Group’s developed the Silicon Illuminated Active Peripherals (SLAP) which, as the name suggests, consists of tangible widgets that can be placed anywhere on a surface computer and used for context-specific controls. Examples used are an Optimus-esque keyboard, a slider similar to those found on audio boards, and a knob for video editing. It’s a clever approach, sure, but here’s hoping future implementations will be able to include a sharper, higher resolution screen. Kindly direct yourself to the links below for video demonstration.
It’s pretty clear by watching the demonstration video (which is lurking in the read link, just so you know) that this stuff is still pretty preliminary, but we could definitely see it going places with the right people behind it. The Interface Database Concept was dreamed up by Alan Sien Wei Hshieh, and by utilizing a relatively simple set of JavaScript routines, he was able to overcome traditional platform incompatibilities that can so often hamstring medical hardware / software in day-to-day usage. The creation aims to enable “seamless and intuitive data transfer” and to “define a set of gesture and multitouch commands that will override controls and input devices that may be difficult to use on medical devices.” The aforementioned vid shows off gesture-based transfers and even an accelerometer-based cross-platform transfer, both of which make you forget that we’re just talking about X-rays and blood tests.
This site is run by Sascha Endlicher, M.A., during ungodly late-night hours. Wanna know more about him? Connect via Social Media by jumping to about.me/sascha.endlicher.