3M Touch Systems 84-inch Projected Capacitive Display hands-on

We knew we’d be seeing 3M Touch Systems’ monster capacitive display once again, but we had no idea just how massive this year’s iteration would be. Taking up residence in a corner of CES Unveiled 2013, the company’s latest multi-touch prototype now measures in at 84 inches, far surpassing its 46-inch predecessor, with 100-inch versions waiting in the wings. The touch table now supports Ultra HD (4K) resolution and was shown running a software demo currently in use at Chicago’s Museum of Science and Industry. As you may be able to tell from the accompanying gallery, those floating images aren’t of the crispest quality, but that’s because the files themselves aren’t fully high-res. Of course, tech of this kind isn’t necessarily intended for households (not yet, anyway); it makes for a more natural fit in commercial environments (think: airports, car dealerships or wireless retailers). At present, the table here on the show floor is calibrated to support 40 individual touch points, but a company rep assured us it could be configured for up to 60, allowing large groups of people to interact simultaneously. While touch tabletops of this kind are still quite rare in the wild, expect to see them crop up more commonly in the near future. Check out our gallery below and stay tuned for a video demo.

Sarah Silbert contributed to this report.

DIY Google Glass puts iOS in front of your eyes

Google may be beavering away on the last stages of Project Glass before the Explorer version arrives with developers, but in the meantime DIY wearable computers are springing up, some with Apple’s iOS at their core. A straightforward combination of an iPod touch, an off-the-shelf wearable display, a Bluetooth camera and a set of safety goggles was enough for AI researcher Rod Furlan to get a glimpse at the benefits of augmented reality, he writes at IEEE Spectrum, though the headset raised as many questions as it answered.

Furlan’s hardware falls roughly in line with what we’ve seen other projects piece together in earlier AR attempts. He opted for a MyVu eyepiece – a 0.44-inch microdisplay culled from a cheap Crystal headset, as used in this UMPC-based wearable back in 2009 and this Beagleboard version in 2010 – hooked up to the composite video output of a 4th-gen iPod touch; that way, he can see a mirror of the iPod’s UI floating in his line of sight.

Meanwhile, a Looxcie Bluetooth video camera – stripped of its casing and attached to the goggles – streams video to the iPod touch wirelessly. Furlan says he’s cooking up a second-gen version running off a Raspberry Pi, another approach we’ve seen wearables experimenters take. That, Furlan says, will allow for more flexibility with the Looxcie’s input, as well as greater support for other sensors such as accelerometers.
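
Furlan hasn’t shared that sensor code, but reading an accelerometer from a Raspberry Pi is well-trodden territory. Here’s a minimal sketch assuming a common part and library of our own choosing – an ADXL345 on the Pi’s default I2C bus, read via smbus2 – rather than whatever Furlan actually wires up:

from smbus2 import SMBus  # pip install smbus2

ADXL345_ADDR = 0x53   # default I2C address of the ADXL345 (our assumed sensor)
POWER_CTL    = 0x2D
DATAX0       = 0x32
SCALE_G      = 0.0039  # ~3.9 mg per LSB at the default +/-2g range

def read_acceleration(bus):
    """Return (x, y, z) acceleration in g from an ADXL345."""
    raw = bus.read_i2c_block_data(ADXL345_ADDR, DATAX0, 6)  # 6 bytes, little-endian pairs
    def to_int16(lo, hi):
        value = lo | (hi << 8)
        return value - 65536 if value & 0x8000 else value
    return tuple(to_int16(raw[i], raw[i + 1]) * SCALE_G for i in (0, 2, 4))

with SMBus(1) as bus:                                    # I2C bus 1 on most Pi models
    bus.write_byte_data(ADXL345_ADDR, POWER_CTL, 0x08)   # enable measurement mode
    print(read_acceleration(bus))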

The interesting part is how Furlan’s experience of the wearable evolved, from initial discomfort and a sense of information overload – the feeling of needing to keep up with every notification, server status, stock price, and message that pops up – to a less conscious consumption of the data flow:

“When I wear my prototype, I am connected to the world in a way that is quintessentially different from how I’m connected with my smartphone and computer. Our brains are eager to incorporate new streams of information into our mental model of the world. Once the initial period of adaptation is over, those augmented streams of information slowly fade into the background of our minds as conscious effort is replaced with subconscious monitoring.” – Rod Furlan

That fits with what we’ve heard from Google itself; Glass project chief Babak Parviz said recently that part of the company’s work on the software has been to deliver a pared-back version of the usual gush of information that hits our smartphone and tablet displays. Developers, for instance, will be able to use a set of special cloud APIs to prioritize specific content that gets delivered to the Android-based wearable.
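
Google hasn’t detailed those APIs yet, but the underlying idea is simple to illustrate: score incoming items and let only the highest-priority few reach the display. A hypothetical sketch, with names and thresholds entirely ours rather than Google’s:

from dataclasses import dataclass

@dataclass
class TimelineItem:
    source: str
    text: str
    priority: float  # 0..1, as scored by the (hypothetical) cloud service

def pared_back_feed(items, threshold=0.7, limit=3):
    """Keep only the few highest-priority items; the rest never reach the display."""
    urgent = [i for i in items if i.priority >= threshold]
    return sorted(urgent, key=lambda i: i.priority, reverse=True)[:limit]

items = [
    TimelineItem("email", "Newsletter: 10 gadget deals", 0.2),
    TimelineItem("calendar", "Flight boards in 25 minutes", 0.95),
    TimelineItem("chat", "Dinner still on?", 0.75),
]
for item in pared_back_feed(items):
    print(f"[{item.source}] {item.text}")  # only the flight and the dinner question survive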

Furlan concludes that the biggest advantage of wearables won’t be overlaying data on top of the real world – what we know as augmented or mediated reality – but being able to persistently record (and recall) all of our experiences. That does differ from Google’s perception, where capturing photos and videos is only seen as a subset of Glass, and the headset is gradually being positioned as a way to access a curated feed of the digital world, whether that be from Google Now prompts or something else.

[via 9to5Mac]


DIY Google Glass puts iOS in front of your eyes is written by Chris Davies & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

PowerVR Series6 mobile GPUs are almost here, we go eyes-on with a test chip (video)

Imagination Technologies is on a high right now. Throughout 2012, the company’s PowerVR graphics processors continued to monopolize the iPhone and iPad, while also appearing in (late 2011) Android flagships, the PlayStation Vita and even the first Clover Trail-powered Windows 8 tablets. But you know what? That’s old news, because all those devices run current-gen PowerVR Series5 silicon. Most new top-end devices in 2013 and 2014 will either contain the latest Mali GPUs from rival ARM, or they’ll pack PowerVR Series6, aka Rogue. This latter chip is currently being developed by at least eight different smartphone and tablet manufacturers and is expected to make a good bit of noise at CES next week.

But who’s going to wait that long if they don’t absolutely have to? To get a fuller understanding of what awaits us in the coming weeks and months, we scoped out a Rogue test chip at Imagination’s sparkly new HQ just outside of London, UK. The test silicon doesn’t represent the true power of Series6, because it’s running on an FPGA board that severely limits its bandwidth, but it’s still able to show off one crucial advantage: the ability to run OpenGL ES 3.0 games and apps. This API is all about improving mobile graphics by making smarter use of GPU compute, without annoying the battery, and the three demos after the break show just how it pulls that off.

Google Glass spotted in wild with prescription lenses

Google’s Glass wearable computer has been spotted in the wild in New York City, complete with what appear to be integrated prescription lenses. The bright red augmented reality headset – set to ship to developers in $1,500 Explorer Edition form early in the new year – was spotted by a Road to Virtual Reality tipster on what’s presumably a lucky Googler testing Glass while out and about.

Google’s Sergey Brin sported a set of Glass with sunglasses lenses back at Google IO, with the tinted sections apparently clipping into the brow frame. Meanwhile, Google had also confirmed that it was looking at prescription lens support.

Google is also exploring the potential to integrate the Glass display cube into the prescription lenses themselves, rather than using a separate display altogether. That would require more precise optical work, of course, and could prove significantly more expensive when it comes to changing your prescription.

Exactly how well the Explorer version will handle lenses remains to be seen; Google has described it as a test kit for developers to begin coding augmented reality-compatible apps, rather than the final form-factor of the hardware. It’s also believed to feature a bone-conduction earpiece for sound inaudible to anyone but the wearer. Judging by what look to be discrete metal lens rims, however, it’s an altogether slicker system than the large black glasses Google showed photos of at I/O 2012.


Google Glass spotted in wild with prescription lenses is written by Chris Davies & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

Google Glasses rapid prototype built in just two hours

Rapid prototyping isn’t anything new, but making prototypes for future technologies in under a couple of hours is pretty incredible. In what almost seems like something you would see MacGyver do, a team of rapid prototypers have come up with working prototypes of several different technologies, including Google Glasses and the touch interface featured in Minority Report.

At Mind The Product 2012, Google’s Tom Chi demonstrated that anyone can build these incredible products and ideas using everyday materials and a bit of ingenuity. For example, Chi’s team built a fully-working prototype of Google Glasses from a coat hanger, a piece of plexiglass, a pico projector, a wire harness, and a netbook.

Chi’s team also built a prototype of the gestural interface as seen in the movie Minority Report. Unlike the Google Glasses rapid prototype, this only took 45 minutes to throw together, and it uses materials that you would normally find in any office or home, including a coat hanger, a whiteboard, fishing wire, a couple of hairgrips, a chopstick, and a presentation clicker.

Obviously, these rapid prototypes aren’t that practical, but the important thing, Chi notes, is that they get you to think and to act on the first ideas that pop into your mind. The first thing that pops into your head is “the right thing” only about 5% of the time, he says, but that 5% is roughly the rate of success for most startups. Once you begin rapid prototyping, you cycle through ideas much faster, which compounds into a far higher chance of success:

“By the time you try 20 things, even if each individual thing only has a 5% chance of success, by the time you try 20 things, your chance of success goes up to 64%. By the time you try 50 things, it goes up to 92%. It’s almost like you can’t fail!”
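
Chi’s arithmetic checks out: if each attempt has an independent 5% chance of succeeding, the probability of at least one success after n tries is 1 – 0.95^n. A quick check in Python:

def chance_of_success(attempts, p_single=0.05):
    """Probability of at least one hit across independent tries."""
    return 1.0 - (1.0 - p_single) ** attempts

for n in (1, 20, 50):
    print(f"{n:2d} tries -> {chance_of_success(n):.0%}")
#  1 tries -> 5%
# 20 tries -> 64%
# 50 tries -> 92%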

[via Mind the Product]


Google Glasses rapid prototype built in just two hours is written by Craig Lloyd & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

Japan Display shows low-power reflective LCD that does color, video

Seen any color video on your e-reader lately? Us neither, and Japan Display wants to change all that with a new reflective, paper-like LCD that can pull off the feat while burning very little juice, to boot. The prototype uses a so-called light control layer, allowing it to collect ambient rays and bounce them toward your eyes, much like plain old analog paper. The consortium developed a low-color-fidelity version with five percent NTSC coverage and a bright 40 percent reflectance, along with a dimmer version carrying a third less reflectivity but a more faithful 36 percent color gamut. The latter still needs some tweaking, according to Japan Display, but the more reflective version is now good to go for production, meaning it might start popping up in new readers imminently. For more info, check the video after the break.

[Image credit: Diginfo]

Moog’s LEV-96 sensoriactuator prototype wields touch control of 96 simultaneous harmonics, we go eyes-on (video)

Late last week, Moog outed its LEV-96 sensoriactuator prototype and offered a glimpse at its latest R&D unit. Even though it’s still in the early phases of beta testing, we were able to stop by the Moog Music factory for a closer look and a brief glimpse of the gear in action ahead of its appearance at Moogfest. While the unit is installed on acoustic guitars for the time being, the company says that similar tech can be used on other acoustic instruments and eventually on other surfaces; this is just the current manifestation. Since the tech modifies the guitar’s natural harmonics and string vibrations, the LEV-96 is, in its present state, equally at home on traditional acoustic guitars and on those outfitted with pickups.

As far as controls go, the entire unit is capacitive touch-enabled from the moment a finger swipe powers it on. Sliders allow for adjusting the intensity, harmonics and note duration, while the other buttons enable arpeggio presets and modulation that includes tremolo and random harmonic tweaks. Those sliders remain in play when a preset is activated, serving to enable further adjustments on the selected preset. There is a lock button, too, so that you don’t accidentally make a switch mid-strum. All of these finger-friendly surfaces work alongside two pairs of electromagnetic pickup channels per string to wrangle the 96 simultaneous harmonics. Magnets work to either increase or decrease the string’s motion, bringing out vibrational modes that have always been in play on acoustic instruments but have never been given the power needed to make ’em sing. The folks at Moog are quick to remind us that the LEV-96 is still in its infancy, but you can rest assured we’ll be keeping an eye out for what develops. For a peek at the tech in action, head on past the break for a really quick demo that we kept brief due to the fact that this is an early prototype.
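
To get a feel for what controlling 96 simultaneous harmonics means, recall that a string’s tone is a stack of harmonics at integer multiples of its fundamental, and weighting each one independently reshapes the timbre; 16 weighted harmonics per string across six strings would give 96 controls. A toy additive-synthesis sketch of that idea in Python, purely our conceptual illustration rather than Moog’s electromagnetic approach:

import numpy as np

def string_tone(fundamental_hz, harmonic_gains, duration_s=2.0, sample_rate=44100):
    """Sum the harmonics of one string, each with its own 0..1 gain."""
    t = np.linspace(0.0, duration_s, int(sample_rate * duration_s), endpoint=False)
    tone = np.zeros_like(t)
    for n, gain in enumerate(harmonic_gains, start=1):
        tone += gain * np.sin(2.0 * np.pi * n * fundamental_hz * t) / n  # natural 1/n rolloff
    peak = np.max(np.abs(tone))
    return tone / peak if peak > 0 else tone  # normalize to avoid clipping

# 16 weighted harmonics per string x 6 strings = 96 independent controls
gains = [1.0] + [0.0] * 15
gains[4] = 0.8          # bring up the 5th harmonic of the low E string
low_e = string_tone(82.41, gains)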

Robotic wheelchair concept adds leg-like movement, tackles stairs with ease (video)

Why choose between legs and wheels when you can have both? Well, that’s the theory behind a robotic wheelchair concept from the Chiba Institute of Technology, which uses leg-like motion to conquer obstacles a run-of-the-mill wheelchair can’t. The key is the five axes its base rotates on, allowing individual wheels to be lifted off the ground and moved in a walking style. It can tackle steps and various other obstacles whilst remaining stable, and can even turn 360 degrees around its center with the help of some onboard stabilizers. A gang of sensors on the chair detect incoming obstructions and deal with them automatically, but changes in wheel torque can act as substitute triggers, should the sensors fail. Judging from the video below, it’s pretty advanced for a concept, but its creator wants a bunch of people to try it out so he can “fine-tune the user experience.” It may not be quite as cool as Stompy or the mighty Kuratas, but it’s definitely more practical for a trip to the shops.
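
That torque-based fallback is easy to picture: if a wheel’s drive torque suddenly jumps well above its recent baseline, something is probably blocking it, so the chair can lift that wheel and step over. A toy sketch of the idea in Python, with thresholds and readings that are purely illustrative rather than Chiba’s actual control code:

from collections import deque

class TorqueObstacleDetector:
    """Flag a likely obstacle when wheel torque jumps well above its recent average."""
    def __init__(self, window=50, spike_ratio=1.8):
        self.history = deque(maxlen=window)  # recent torque samples (N*m)
        self.spike_ratio = spike_ratio       # hypothetical trigger threshold

    def update(self, torque_nm):
        baseline = sum(self.history) / len(self.history) if self.history else torque_nm
        self.history.append(torque_nm)
        return torque_nm > self.spike_ratio * max(baseline, 0.1)

detector = TorqueObstacleDetector()
for sample in [2.1, 2.0, 2.2, 2.1, 5.6]:    # final reading: wheel meets a step
    if detector.update(sample):
        print("obstacle suspected -> lift wheel and step over")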

LG Nexus Prototype detailed in full

The next Nexus family device set to be released by Google and LG appears to have been revealed essentially top to bottom by a man with a prototype. The LG Nexus prototype shown off by Onliner gives quite a bit of detail as to how the device will look and feel, but it doesn’t settle final details, since it’s running software that’s already out on the market and doesn’t seem to perform as a shipping unit would in benchmark tests of several types. The hands-on published today is therefore best read as a look at the ideation process behind Google and LG’s collaboration.

The photographs above and below appear very much to show an LG device based on the recently revealed LG Optimus G, a device with a quad-core Qualcomm processor and a massive camera. The USA versions of the Optimus G come with two different cameras: one model has a 13 megapixel sensor, the other an 8 megapixel one. This LG Nexus device quite likely has an 8 megapixel camera, an assessment based on the apparent size of the module compared to the G and our understanding of how Google advances its Nexus line one step at a time – that is, it wouldn’t skip 8 and go right to 13.

The Galaxy Nexus, a Samsung device, surprised the masses last year with a release that included a 5 megapixel camera. We expect that this LG device – if indeed it does pan out – will include an 8 megapixel camera at least. This prototype carries with it the code LG E960 as well as the name “Mako”. It’s also been suggested that this device is code-named LG Nexus 4.

The LG Nexus 4 name comes from the idea that it is the 4th major Nexus smartphone on the market – or will be in the near future. The prototype review we’re seeing today shows the device to be extremely similar to the Galaxy Nexus in size and shape with a slightly more flat front and back, a glittery sort of back panel not unlike the Optimus G, and a front panel with glass that curves downward near its edges.

The display is shown here to be an IPS LCD with the same number of pixels as the Galaxy Nexus: 720 x 1280 across a 4.7-inch panel. The image below shows the device next to the iPhone 5 and its 4-inch panel, with both apparently displaying the same wallpaper to make it clear which of the two is brighter and sharper. We do not know whether the reviewer set both devices to maximum brightness.

Keep your eyes peeled for a real look at this device as it leaves its prototype stages and ushers in the next generation of Google’s vanilla-flavored Android system. We’re expecting a very clean version of whatever Google has to offer next for Android, with no carrier additions. Beyond that, we suspect a worldwide release – or something close to it – with Google’s own Google Play store offering the device for sale to everyone all at once.

We shall see soon, and very soon, if the rumors are correct!

[Thanks Gene for the tip!]


LG Nexus Prototype detailed in full is written by Chris Burns & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.


Vibrating gloves help you locate items in stores

Have you ever gone to a hypermarket like Tesco or Carrefour, only to realize that the place is so darn vast that it can be difficult to locate your can of baked beans – especially if it’s your first visit to such a huge place instead of your regular grocery store down the street? Researchers at the Helsinki Institute for Information Technology can certainly identify with that feeling, and have come up with a pair of conceptual vibrating gloves that help direct you to the specific item you’re looking for.

The prototype glove relies on vibration feedback as it guides you towards a predetermined target. Your hand is steered towards the object on a “hot/cold” principle: the closer you are to the object and the more directly you head towards it, the stronger the vibration, and the further you stray from the target, the weaker it gets. Of course, this could also be used in other situations, such as finding your car in a vast parking lot or tracking down a book of your choice. So far, initial tests seem promising. I am quite sure that wearing this will also help you find those droids you were looking for…
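
That hot/cold mapping is simple enough to sketch. A toy version in Python, where the distance range and the use of heading error are our own illustrative assumptions rather than details from the Helsinki prototype:

import math

def vibration_strength(distance_m, heading_error_deg, max_range_m=20.0):
    """Toy 'hot/cold' mapping: stronger when close to the target and pointed at it."""
    proximity = max(0.0, 1.0 - distance_m / max_range_m)             # 1 at the target, 0 at range limit
    alignment = max(0.0, math.cos(math.radians(heading_error_deg)))  # 1 when aimed dead-on
    return proximity * alignment                                     # drive the vibration motor with this

print(vibration_strength(2.0, 10.0))   # near and well-aimed -> strong buzz
print(vibration_strength(15.0, 80.0))  # far and off-course  -> barely a tickle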

By Ubergizmo.