Fujitsu Tablet Prototype Lets You Feel Rough and Smooth Textures on Screen

Fujitsu has rolled out a prototype tablet with a cool bit of tech inside: haptic sensory technology that lets you feel images on the screen. Users can feel smooth or rough textures depending on what is displayed.


The tablet uses ultrasonic vibrations to create a layer of high-pressure air between the finger and the screen. That air acts like a cushion and makes the screen feel very smooth. Depending on what image is on the screen, the vibrations can be rapidly cycled to create the feeling of a rough surface.
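For the curious, here's a rough idea of how that kind of texture rendering could work in software. This is just a toy sketch under our own assumptions (the function, map format, and cycling frequency are all invented for illustration, not Fujitsu's actual firmware): a grayscale "roughness map" of the on-screen image gates the ultrasonic drive amplitude under the finger, so smooth pixels keep the air cushion steady while rough pixels rapidly cycle it on and off.

```python
# Toy sketch only -- not Fujitsu's firmware or API. A grayscale roughness map
# (0.0 = smooth, 1.0 = rough) gates the ultrasonic drive amplitude under the
# finger: smooth pixels hold the air cushion steady, rough pixels cycle it.
import math

def drive_amplitude(roughness_map, x, y, t, cycle_hz=240.0):
    """Return an ultrasonic drive amplitude (0.0-1.0) for finger position (x, y) at time t."""
    roughness = roughness_map[y][x]
    # Square-wave gating: the rougher the pixel, the deeper the on/off cycling,
    # which periodically collapses the air cushion and raises perceived friction.
    gate = 1.0 if math.sin(2 * math.pi * cycle_hz * t) > 0 else 0.0
    return (1.0 - roughness) + roughness * gate

# A tiny 2x2 map: one smooth pixel, one rough pixel.
texture = [[0.0, 1.0],
           [0.5, 0.2]]
print(drive_amplitude(texture, 0, 0, 0.001))  # smooth -> steady full cushion
print(drive_amplitude(texture, 1, 0, 0.001))  # rough  -> amplitude being cycled
```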


Fujitsu is showing the tech off at Mobile World Congress 2014 with several demos that allow the user to feel the skin of an alligator, pluck the virtual strings of a harp, or feel the sensation of opening a combination lock.

Touch or Tickle Your Beloved Even When You’re Far Away with Bond

Now you can caress or even tickle your significant other from afar with Bond, a wearable system that lets you reach out and “touch” people remotely.


The Bond device works with an iOS or Android device via Bluetooth to deliver the virtual caresses. It’s pretty versatile too, as the sensor can be worn as either a bracelet or a pendant.

Once activated, users can “touch” and tickle others remotely anywhere in the world where there’s cell service coverage.

Touch it for one second and your friend will get a one-second tickle. Tickles can be up to five seconds long and any colour of the rainbow; the colour depends on how long you touch it. Swipe Bond and you will send a rainbow tickle.
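As a back-of-the-napkin illustration of that mapping (the linear hue scale here is our own guess; Bond hasn't published an exact colour formula), the duration-to-tickle rule could look something like this:

```python
# Rough sketch of the touch-duration mapping described above. The hue scale is
# an assumption -- Bond hasn't published its exact colour formula.
import colorsys

MAX_TICKLE_S = 5.0  # tickles cap out at five seconds

def tickle_from_touch(touch_seconds):
    """Return (duration_s, rgb_colour) for a touch of the given length."""
    duration = min(touch_seconds, MAX_TICKLE_S)
    hue = 0.83 * duration / MAX_TICKLE_S  # sweep red through violet, not back to red
    rgb = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return duration, tuple(round(c * 255) for c in rgb)

print(tickle_from_touch(1.0))  # one-second touch -> one-second tickle
print(tickle_from_touch(9.0))  # long touch -> capped at five seconds
```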

Bond is currently up for funding on Indiegogo through December 3rd, where a minimum pledge of $170 (USD) will get you a pair: one for you to keep and one to give to your significant other.

Disney Research Simulates 3D Geometry on Touch Surfaces: Touch & Feel Screen

The geniuses at Disney Research are obsessed with touch-based input. One of their latest breakthroughs is an algorithm that can “simulate rich 3D geometric features (such as bumps, ridges, edges, protrusions, texture etc.) on touch screen surfaces.” In other words, it provides the feeling of touching a 3D object even though the user is only touching a flat surface. Someday we’ll know what an Angry Bird feels like.


To prove that their algorithm works, Seung-Chan Kim, Ali Israr and Ivan Poupyrev of Disney Research Pittsburgh used an “electro-vibration based friction display.” The display emits a voltage that simulates the friction that our hands would feel if we were actually touching the object shown in the image or video. The researchers say that they can get depth maps from 3D models or from a depth sensor such as Kinect.
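To give a flavour of how a depth map could drive such a display (this is our own simplified sketch, not the researchers' code; the gain and voltage cap are invented numbers), you can scale the friction voltage by how steeply the virtual surface rises under the moving finger:

```python
# Minimal gradient-based tactile rendering sketch: the friction voltage sent to
# an electro-vibration display is scaled by how steeply the depth map rises
# under the moving finger. Function names and constants are illustrative
# assumptions, not the Disney Research implementation.
import numpy as np

def friction_voltage(depth_map, x, y, vx, vy, gain=120.0, v_max=150.0):
    """Voltage for an electro-vibration display given finger position and velocity."""
    gy, gx = np.gradient(depth_map)  # local surface slope along rows (y) and columns (x)
    # Project the slope onto the direction of finger travel: climbing a bump
    # should feel "stickier" than sliding down it.
    speed = np.hypot(vx, vy) or 1.0
    slope_along_motion = (gx[y, x] * vx + gy[y, x] * vy) / speed
    return float(np.clip(gain * max(slope_along_motion, 0.0), 0.0, v_max))

# Toy depth map: a single raised square in the middle of a flat plane.
depth = np.zeros((64, 64))
depth[24:40, 24:40] = 1.0
print(friction_voltage(depth, 24, 32, vx=1.0, vy=0.0))  # sliding into the bump's edge
print(friction_voltage(depth, 5, 5, vx=1.0, vy=0.0))    # flat region -> 0 V
```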

Combine this with the Oculus Rift, and adult films (er, videogames) will attain a higher level of realism.

[via Disney Research via Reddit]

Touchy-Feely Speakers Hint at the Future of Haptic Interfaces


Touchscreens are flat and hard by necessity—thanks to their dense layers of glass, conductive metal, and capacitors. But as haptic interfaces start to appear in commercial gadgets, touchscreen devices are poised to become even more… touchy. Enter Eunhee Jo, a Korean designer who’s spending the next year as a designer in residence at London’s Design Museum, and who specializes in haptic interfaces.

XCM X1 Plus Controller Shell Adds Xbox One Vibration Triggers to Xbox 360

One of the coolest features of the new Xbox One is the extra vibration motors in the triggers. These add a new sensation to gameplay which can provide feedback directly to your fingertips. Now, there’s a mod available for the Xbox 360 controller which adds a similar feature.


The new XCM X1 Plus controller shell not only replaces the outside of your Xbox 360's stock wireless controller with something much cooler looking, but also adds a pair of rumble motors in the triggers.


It’s not clear at this point how the motors are activated, though, as current Xbox 360 games don’t pass along independent data for the triggers the way Xbox One games do. From what I can tell in the video below, they’ve got them set up to vibrate automatically whenever you press the triggers, so I’m not sure how that would feel, or if it would just be annoying.
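If that guess is right, the logic could be as simple as mapping each trigger's analog value straight to the duty cycle of the motor inside that trigger. A purely speculative sketch (nothing here comes from XCM), just to show the idea:

```python
# Speculative sketch of a trigger-driven rumble: since the Xbox 360 protocol
# carries no per-trigger rumble data, the analog trigger value itself could
# drive the PWM duty cycle of the motor in that trigger.
def trigger_motor_duty(trigger_value, deadzone=0.1):
    """Map a 0.0-1.0 trigger pull to a 0.0-1.0 PWM duty cycle for its rumble motor."""
    if trigger_value < deadzone:  # ignore light touches so it isn't constantly buzzing
        return 0.0
    return (trigger_value - deadzone) / (1.0 - deadzone)

print(trigger_motor_duty(0.05))  # finger resting on the trigger -> no rumble
print(trigger_motor_duty(0.8))   # firm pull -> strong rumble
```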

There’s no word yet on pricing or a release date for the XCM X1 Plus controller shell, but keep an eye on their website for more info.

Ford Vibrating Shift Knob Tells Drivers When to Shift: Semi-Automatic Transmission

Earlier this month we saw a car with a joystick shift lever. It looks cool, but it doesn’t have any additional function. Ford engineer Zachary Nelson made a more high-tech shift lever mod that’s geared towards newbie drivers. It’s a shift knob that vibrates to tell you when to shift gears.


The shift knob is based on the Arduino Pro Mini microcontroller. Using an Android app and the OpenXC Vehicle Interface, the knob “monitors the vehicle’s speed, RPM and accelerator pedal position. Based on this information, the application calculates and then indicates to the driver when he or she should shift by vibrating the shift knob.” Additionally, the knob can be set to prioritize speed or fuel economy. Zach used a motor from an Xbox 360 controller to make the knob vibrate and then designed and 3D printed the knob’s case. It was then installed onto the manual shift lever from a Ford Mustang.
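The heart of it is a simple decision rule. Here's an illustrative sketch of that kind of shift logic (the RPM thresholds and mode names are our own assumptions, and this doesn't use the real OpenXC API):

```python
# Illustrative shift-indication logic only; thresholds and signal names are
# assumptions, not the actual OpenXC/Android application.
def should_vibrate(rpm, accelerator_pct, mode="economy"):
    """Decide whether to buzz the knob, given engine RPM and accelerator pedal position."""
    # Under harder acceleration, let the engine rev further before suggesting a shift.
    base_rpm = 2500 if mode == "economy" else 4500
    shift_point = base_rpm + 1500 * (accelerator_pct / 100.0)
    return rpm >= shift_point

# Cruising gently in economy mode: shift early.
print(should_vibrate(rpm=2700, accelerator_pct=10))                      # True
# Performance mode with the pedal down: hold the gear longer.
print(should_vibrate(rpm=2700, accelerator_pct=80, mode="performance"))  # False
```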

Start your browser’s engine and head to OpenXC to find out how to make a vibrating shift knob. Or not.

[via Wired via Gearfuse]

Hands-on with Disney Research’s AIREAL haptic feedback technology (video)


If you’re hoping to get some more tactile feedback out of augmented reality environments, the folks at Disney Research have devised the AIREAL system that could end up doing just that. The team is showing off the project at SIGGRAPH’s Emerging Technologies space, so we made sure to stop by for a look and feel. As a quick refresher, the technology reacts to the user’s gestures by churning out a vortex of air to provide tactile feedback in real space — thanks to an almost entirely 3D-printed enclosure and a smattering of actuators and depth sensors. In the demo we saw, hovering our hand just over a display summoned a butterfly.

Once it landed, that small bit of air offered up the physical sensation that it was actually touching us. As we moved closer to a virtual open window, the butterfly’s wings went aflutter and the whole sensation increased a bit. Sure, what we saw was a fairly simple use scenario, but there are aspirations for this to enhance gaming experiences and other augmented environments (likely within the confines of a Disney park, of course) with the addition of haptic feedback. Looking for a bit more info? Consult the video after the break for just that.


Disney Research’s AIREAL creates haptic feedback out of thin air


Disney Research is at it again. The arm of Walt’s empire responsible for interactive house plants wants to add haptic feedback not to a seat cushion, but to thin air. Using a combination of 3D-printed components — thank the MakerBots for those — with five actuators and a gaggle of sensors, AIREAL pumps out tight vortices of air to simulate tactility in three dimensional space. The idea is to give touchless experiences like motion control a form of physical interaction, offering the end user a more natural response through, well, touch.
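The aiming side of such a system is straightforward geometry: given a hand position from the depth sensor, compute the pan and tilt needed to point the vortex nozzle at it. A minimal sketch, with coordinates and conventions that are our own assumptions rather than Disney's:

```python
# Illustrative aiming sketch, not Disney's implementation: given a hand position
# reported by a depth camera (in metres, relative to the nozzle), compute the
# pan/tilt angles needed to fire an air vortex at it.
import math

def aim_vortex(hand_x, hand_y, hand_z):
    """Return (pan_deg, tilt_deg) to point the nozzle at the hand at (x, y, z)."""
    pan = math.degrees(math.atan2(hand_x, hand_z))                        # left/right
    tilt = math.degrees(math.atan2(hand_y, math.hypot(hand_x, hand_z)))   # up/down
    return pan, tilt

print(aim_vortex(0.2, 0.1, 0.5))  # hand slightly right of and above the nozzle
```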

Like most of the lab’s experiments, this has been in the works for a while, and the chances of it being used outside of Disney World anytime soon are probably slim. AIREAL will be on display at SIGGRAPH in Anaheim from Sunday to Wednesday this week. Didn’t register? Check out the video after the break.


Via: Gizmodo (Australia)

Source: Disney Research

Disney’s AIREAL Creates Tactile Feedback in Mid-Air

The technical magicians at Disney Research are at it once more. This time, they’re working on a technology which allows users to feel sensations without actually having to touch a surface.


AIREAL is a combination of hardware and software which can create tiny air vortexes in 3D space. It was developed by researchers Rajinder Sodhi, Ivan Poupyrev, Matthew Glisson, and Ali Israr. A set of these small haptic-feedback devices can be used in combination with gesture-based control devices to let users feel sensations and virtual textures while interacting with their computers and video game systems. This is truly some science fiction stuff made real.

Check out some examples of AIREAL in action in the clip below:

Pretty amazing concept, no? Wouldn’t it be cool to combine this with a head-mounted display like the Oculus Rift? The wind could blow in your hair as you run through a virtual world, or you could feel bullets whizzing by when you’re being shot at. Crazy stuff. Or it might just turn up in the next generation of Disney’s Haunted Mansion, where you can actually feel the ghosts surrounding you. Hopefully the Disney Research guys talk to the Imagineers.

You can read the entire research paper on AIREAL here. [PDF]

Putting Your Finger in this Japanese Robot is a Step Toward Actual Virtual Reality

Haptic system from NHK

Welcome to Touchable TV!
In addition to showcasing their 8K, 7680×4320, Ultra-High-Def (Ridiculous-Def?) TV broadcasting kit last weekend, Japan’s NHK also demoed a haptic feedback device that simulates virtual 3D objects in real time. And the thing is, it’s really just a robot that, when you touch it, kinda touches you back.

NHK (Nippon Hōsō Kyōkai/Japan Broadcasting Corporation) is a public media organization somewhat analogous to the American PBS. However, entirely not at all like its American counterpart, the J-broadcaster’s got this: NHK Science & Technology Research Laboratories. Which is nice, because in cooperation with various corporate partners, NHK seriously delivers the tech.

Okay fine… so where’s the robot?

Haptic Virtual Reality that’s Actually Virtual – Just Put Your Finger in This Robotic Thingy!
In the image above, a brave test pilot is placing his index finger into the locus of a five-point artificial haptic feedback environment. The system analyzes and models a virtual 3D object, and that model in turn drives the movements and relative resistances of the five robotic arms controlling the five feedback points, generating a focused area of stimulus and response. That’s a complicated way to explain a “robotic, artificial sense of touch,” but conceptually the idea is quite simple:

#1. Put your finger in here and strap on the velcro:

#2. It’ll feel like you’re touching something that doesn’t physically exist, like Domo-kun (Dōmo-koon) here:

Each of those shiny round points is the terminus of a robotic arm that either gives way or holds steady based on the relative position of the finger to the contours of the object being simulated. Each point’s position-resistance refreshes every 1/1000th of a second. Not bad.
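In control-loop terms, the core idea is tiny: a thousand times a second, check each of the five contact points against the virtual object’s surface and decide whether its arm holds steady or gives way. Here’s a minimal sketch with a sphere standing in for the virtual object (the stiffness values and geometry are our own illustrative assumptions, not NHK’s controller):

```python
# Minimal 1 kHz control-loop sketch of the idea described above, with a sphere
# standing in for the virtual object. Stiffness values and geometry are
# illustrative assumptions, not NHK's actual controller.
import math

SPHERE_CENTER = (0.0, 0.0, 0.0)
SPHERE_RADIUS = 0.04  # a 4 cm virtual ball

def point_resistance(px, py, pz, hold=1.0, give=0.0):
    """Stiffness for one of the five contact points at its current position (metres)."""
    dist = math.dist((px, py, pz), SPHERE_CENTER)
    # Hold steady if the point has reached the virtual surface, otherwise give way.
    return hold if dist <= SPHERE_RADIUS else give

def update_all(points):
    """One 1 ms tick of the loop: recompute resistance for all five points."""
    return [point_resistance(*p) for p in points]

finger_points = [(0.00, 0.00, 0.05), (0.01, 0.00, 0.03), (0.00, 0.02, 0.02),
                 (-0.01, 0.00, 0.06), (0.00, -0.03, 0.00)]
print(update_all(finger_points))  # points at or inside the surface push back
```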

For practical, full-immersion VR to exist in a physical sense (that is, before VR becomes a direct neural interface a la The Matrix), our low-to-medium-resolution interactive haptic feedback interfaces will, for now and for a while, be intrinsically robotic. And for virtualizing entirely digital, non-real artifacts, NHK’s device is a step in that direction.

Of course five points of interactivity might not sound like much, but mindful of the generally leapfroggy nature of technological advancement, effectively replicating and surpassing the haptic resolution we now experience via the estimated 2,500 nerve receptors/cm² in the human hand doesn’t seem too tall an order.

If that does seem too tall, if that does sound too far out and overly optimistic, if it seems impossible that we’d ever be able to cram 2,500 sensory & feedback robots into a square centimeter – well, then your robo-dorkery score is low and you need to pay more attention. Because dude, we’re already building nanorobots atom-by-atom. Not an “if” question, this one.

Neat… But Anything Really New Here?
Of course, a wide variety of teleoperated force-feedback systems are either already in use or in development (the da Vinci Surgical System; NASA’s Robonaut 2; etc.), so it’s important to emphasize here that NHK’s device is novel for a very particular reason: nearly all of the force-feedback haptic systems currently in use or in development are based on an ultimately analog physicality. That is to say, whether it’s repairing a heart valve from another room, or, from a NASA building in Texas, tele-pushing a big shiny button on the International Space Station, what’s being touched from afar is ultimately a physical object.

So, what we might consider contemporary practical VR is more accurately a kind of partial VR. As the sense of touch is essential to our experience as human beings, incorporating that sense is a step toward interactive, actual factual, truly virtual virtual reality. Modeling and providing haptic feedback for non-physical objects, i.e., things that don’t really exist, in concert with other virtualization technologies – that’s a big step.

So What Can/Does/Will it Do?
NHK is kind of talking up the benefits for the visually impaired – which is good and noble and whatnot – but perhaps focusing on that is a bit of a PR move, because at least in theory this technology could go way, way beyond simple sensory replacement/enhancement.

An advanced version, incorporating the virtual touching of both simulated and/or real objects, could add layers of utility and interactivity to almost any form of work, entertainment, or shopping… from afar we might discern how hard it is to turn a valve in an accident zone (partial VR), how bed sheets of various thread counts feel against the skin (partial or full VR), the rough surface of the wall one hides behind in a videogame (proper VR), or even petting the dog, or petting… ummm, a friend (partial and/or proper VR – choose your own adventure)!

That’s a ways off, but in the short-to-near-term, here’s how NHK envisions functionality for their touchable TV tech:

Matchmaker, Matchmaker, Make Me a Full-Immersion Omni-Sensory VR System!
Okay, so to get this ball rolling: NHK, meet VR upstart Oculus Rift. NHK & Oculus Rift, meet VR/AR mashup Eidos. NHK, Oculus Rift, and Eidos, meet UC Berkeley’s laser-activated pseudo-robotic hydrogels.

We’re all waiting for your pre-holodeck lovechild.

• • •

Reno J. Tibke is the founder and operator of Anthrobotic.com and a contributor at the non-profit Robohub.org.

Via: MyNavi (Japanese/日本語); DigInfo

Images: DigInfo; NHK