Microsoft Has Invented a Way To Embed Data in 3D Printed Objects

Say you have a 3D-printed item you’re selling. Maybe you slap a barcode on it that identifies what it is, where it came from, and how much it costs. Microsoft, however, has developed a new technology called InfraStructs that offers a better alternative: with it, you could embed information directly inside an item.


The Daily Roundup for 07.25.2013

You might say the day is never really done in consumer technology news. Your workday, however, hopefully draws to a close at some point. This is the Daily Roundup on Engadget, a quick peek back at the top headlines for the past 24 hours — all handpicked by the editors here at the site. Click on through the break, and enjoy.

How Disney Is Using 3D Printing To Give Robots Soul-Piercing Eyes

Some toys look like they’re about ready to come to life and stab you. And if Disney has anything to do with it, we’re going to see even more toys with creepy, lifelike faces in the future, thanks to a new technology called Papillon that 3D-prints eyes onto toys, robots, and other interactive characters.


NVIDIA Tegra 5 Graphics, Release Date

The first Tegra 4 handsets have not yet reached the market, yet NVIDIA is already showing its next-generation mobile chip, codenamed “Logan” (the given name of the superhero Wolverine), at SIGGRAPH. Details on the SoC itself are pretty scarce, since NVIDIA […]

Original content from Ubergizmo.

Disney Research’s AIREAL creates haptic feedback out of thin air

Disney Research is at it again. The arm of Walt’s empire responsible for interactive house plants wants to add haptic feedback not to a seat cushion, but to thin air. Using a combination of 3D-printed components — thank the MakerBots for those — with five actuators and a gaggle of sensors, AIREAL pumps out tight vortices of air to simulate tactility in three-dimensional space. The idea is to give touchless experiences like motion control a form of physical interaction, offering the end user a more natural response through, well, touch.

Like most of the lab’s experiments, this has been in the works for a while, and the chances of it being used outside of Disney World anytime soon are probably slim. AIREAL will be on display at SIGGRAPH in Anaheim from Sunday to Wednesday this week. Didn’t register? Check out the video after the break.

Via: Gizmodo (Australia)

Source: Disney Research

A Revolutionary All-Seeing Camera Lens That Puts the Lytro To Shame

It hasn’t exactly been a runaway hit with consumers, but on a technical level the Lytro camera introduced some brilliant innovation to the world of digital photography. Its revolutionary optics capture an almost infinite depth of field, letting you adjust focus to whatever’s in the frame during post-processing. But as researchers from Saarland University in Saarbrücken, Germany, have demonstrated with a new camera accessory, the Lytro is just the tip of the iceberg.


A Sneak Peek At the Mind-Boggling Future of Computer Graphics

Computer graphics have come a long way since a T-Rex ate that lawyer in Jurassic Park. But if these glimpses of what the next generation of CG has in store are any indication, we ain’t seen nothing yet. Cloth simulations with hyper-realistic wrinkling, modelling complex human hair using thermal imaging, and new approaches to smoke rendering will make our future blockbusters even more blockbustery.

Fabricated: Scientists develop method to synthesize the sound of clothing for animations (video)

Developments in CGI and animatronics might be getting alarmingly realistic, but the accompanying audio often still relies on manual recordings. A pair of associate professors and a graduate student from Cornell University, however, have developed a method for synthesizing the sound of moving fabrics — such as rustling clothes — for use in animations and, potentially, film. The process, presented at SIGGRAPH but only reported to the public today, involves examining two components of the natural sound of fabric: cloth moving on cloth, and crumpling. After creating a model for the energy and pattern of these two aspects, an approximation of the sound can be generated, which acts as a kind of “road map” for the final audio.

The end result is created by breaking the map down into much smaller fragments, which are then matched against a database of similar sections of real field-recorded audio. The team even included binaural recordings to give a first-person perspective for headphone wearers. The process is still overseen by a human sound engineer, who selects the appropriate type of fabric and supervises how sounds are matched, meaning it’s not quite ready for prime time. That’s understandable, really, as this is still a proof of concept, with real-time operation and other improvements penciled in for future iterations. What does a virtual sheet being pulled over an imaginary sofa sound like? Head past the break to hear it in action, along with a presentation of the process.
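The matching step described above resembles what audio folks call concatenative synthesis: chop the synthesized “road map” into short fragments and swap each one for the closest-matching piece of real recorded audio. Here is a minimal, illustrative sketch in Python — the fragment length, the energy-value representation, and the function name are all assumptions for the example, not Cornell’s actual implementation:

```python
# Illustrative sketch of fragment matching against a database of recordings.
# Each fragment and database entry is a short list of energy values; the
# closest entry is chosen by smallest sum of squared differences.

def match_fragments(target, database, frag_len=4):
    """Split `target` into fragments of `frag_len` values and replace each
    with the best-matching same-length fragment from `database`."""
    out = []
    for i in range(0, len(target), frag_len):
        frag = target[i:i + frag_len]
        # Only compare against database entries of the same length.
        best = min(
            (entry for entry in database if len(entry) == len(frag)),
            key=lambda entry: sum((a - b) ** 2 for a, b in zip(frag, entry)),
        )
        out.extend(best)
    return out

# A synthesized "road map" of energy values, matched against three
# hypothetical field recordings:
road_map = [1, 1, 1, 1, 0, 0, 0, 0]
recordings = [[0, 0, 0, 0], [1, 1, 1, 1], [0, 1, 0, 1]]
print(match_fragments(road_map, recordings))  # → [1, 1, 1, 1, 0, 0, 0, 0]
```

In the real system the “fragments” would be windows of audio features rather than bare numbers, and a sound engineer supervises the matches, but the nearest-neighbor lookup is the core idea.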

Fabricated: Scientists develop method to synthesize the sound of clothing for animations (video) originally appeared on Engadget on Wed, 26 Sep 2012 23:40:00 EDT. Please see our terms for use of feeds.

Via: PhysOrg

Source: Cornell Chronicle

SIGGRAPH 2012 wrap-up

Considering that SIGGRAPH focuses on visual content creation and display, there was no shortage of interesting elements to gawk at on the show floor. From motion capture demos to 3D objects printed for Hollywood productions, there was plenty of entertainment at the Los Angeles Convention Center this year. Major product introductions included ARM’s Mali-T604 GPU and a handful of high-end graphics cards from AMD, but the highlight of the show was the Emerging Technologies wing, which played host to a variety of concept demonstrations, gathering top researchers from institutions like the University of Electro-Communications in Tokyo and MIT. The exhibition has come to a close for the year, but you can catch up with the show floor action in the gallery below, then click on past the break for links to all of our hands-on coverage, direct from LA.

SIGGRAPH 2012 wrap-up originally appeared on Engadget on Fri, 10 Aug 2012 13:00:00 EDT.

Colloidal Display uses soap bubbles, ultrasonic waves to form a projection screen (hands-on video)

If you’ve ever been to an amusement park, you may have noticed ride designers using some non-traditional platforms as projection screens — the most common example being a steady stream of artificial fog. Projecting onto transparent substances is a different story, however, which makes this latest technique a bit baffling, to say the least. Colloidal Display, developed by Yoichi Ochiai, Alexis Oyama and Keisuke Toyoshima, uses bubbles as an incredibly thin projection “screen,” regulating the substance’s properties, such as reflectance, with ultrasonic waves from a nearby speaker. The bubble liquid is made from a mixture of sugar, glycerin, soap, surfactant, water and milk, which the designers say is not easily popped. Still, during their SIGGRAPH demo, a motor dunked the wands in the solution and replaced each bubble every few seconds.

A standard projector directed at the bubble creates an image that appears to float in the air. And because the bubbles are transparent, they can be stacked to simulate a 3D image. You can also use the same display to project completely different images that fade in and out of view depending on your angle relative to the bubble. There is a tremendous amount of distortion, however, because the screen is a liquid that remains in a fluid state. Given the need to constantly refresh the bubbles, and the unstable nature of the screen itself, the project — which is merely a proof of concept — wouldn’t be implemented without significant modification. Ultimately, the designers hope to create a film that offers similar transparent properties but with a more solid, permanent composition. For now, you can sneak a peek at the first iteration in our hands-on video after the break.

Colloidal Display uses soap bubbles, ultrasonic waves to form a projection screen (hands-on video) originally appeared on Engadget on Fri, 10 Aug 2012 12:24:00 EDT.

Source: Yoichi Ochiai