This Contact Lens Puts a Display Right On Your Eye

It seems like everyone and their brother is working on some sort of smart glasses, but how about smart contact lenses? It turns out those might actually be closer than you think.

Google Glass bone conduction earpiece tipped for private audio

Google has used bone conduction for its Project Glass wearable computer, it’s claimed, promising discreet notifications that only the wearer can hear. The headset makes contact with the mastoid process, which conducts sound straight to the inner ear, insiders tell Geek, meaning any audio output – such as new messages, Google+ alerts, or other notifications – is piped in directly, completely inaudible to those around the Glass owner, yet still perceptible over high background noise.

Bone conduction has been implemented in a number of wearable audio devices, from Bluetooth headsets – Jawbone’s headsets use speech vibrations picked up through the upper cheek to perform noise cancellation, for instance – to stereo headphones. As well as cutting through loud background noise more efficiently, the technology keeps the user’s ears open rather than plugging them up with earbuds.

That’s particularly useful for an AR device like Google Glass, which is intended to be worn semi-permanently. Google has yet to give any specific hardware details about audio from the headpiece – in fact, all specifications publicly shared to date are subject to change as Google tweaks the design ahead of the initial “Explorer” developer versions set to ship early in the new year – but it was assumed that a small speaker was embedded in the oversized arm-piece.

Such a speaker would have drawbacks, however. Controlling volume would require repeated stabbing at buttons on the Glass device itself, unless automatic volume levels were implemented, and a volume set too high could distract those around the wearer. Meanwhile, some notifications might be private, or the audio might be a hands-free call where discretion is preferred.

Audio quality from bone conduction systems tends to fall short of traditional headphones, but the technology’s other advantages may well outweigh any shortcomings there. It’s possible that the oblong pad in the image above – shared by Google back in May – is the bone conduction assembly.


Google Glass bone conduction earpiece tipped for private audio is written by Chris Davies & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

Minecraft Reality for iOS Takes In-Game Creations to the Real World, No 3D Printing Needed

There are already a handful of ways to bring your Minecraft masterpieces into the real world – or at least replicate them there. But if you’re looking for a cheap route, check out this Mojang-approved iOS app made by 13th Lab. It’s called Minecraft Reality, and it uses augmented reality to make the two worlds meet.


I haven’t tried the app yet, but apparently it can even scale the objects you upload relative to the real-world scenery. You can also save the location of the object(s) you place so that other users of the app can see your work, and vice versa.
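
For a sense of what such a placement might involve under the hood (purely a hypothetical sketch, not 13th Lab’s actual data model), each placed creation only really needs a location, an orientation, and a scale factor mapping Minecraft blocks to real-world metres:

```python
from dataclasses import dataclass

@dataclass
class Placement:
    """Hypothetical record for one placed Minecraft creation."""
    model_id: str       # which uploaded creation to render
    latitude: float     # where it was placed (degrees)
    longitude: float
    heading_deg: float  # which way the model faces
    scale: float        # metres per Minecraft block

def apparent_height_m(placement: Placement, model_height_blocks: float) -> float:
    """Real-world height of the rendered model, given its scale factor."""
    return model_height_blocks * placement.scale

# Example: a 20-block-tall statue placed at 2 m per block appears 40 m tall.
statue = Placement("weeping_angel", 40.7128, -74.0060, heading_deg=90.0, scale=2.0)
print(apparent_height_m(statue, model_height_blocks=20))  # -> 40.0
```

Something along those lines is all another user’s device would need to re-render your creation in the right spot, at the right size.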

Minecraft Reality is already on the App Store and sells for $1.99 (USD). Sadly, it’s for newer iOS devices only: it’s not compatible with the 4th-gen iPod Touch or any device that came before it, and the developer also notes that “most functionality is not available on iPhone 4.” Fortunately, our imagination has no such system requirements. I just placed a Weeping Angel behind you.

[via The Verge via Joystiq & Minecraft Reality]


Mojang launches Minecraft Reality for iOS

If you’ve ever caught yourself wishing that real life was like Minecraft, then boy does Mojang have the app for you. Developed primarily by studio 13th Lab, Minecraft Reality is a new augmented reality iOS app that allows you to drop your most cherished Minecraft creations into the real world. Do you particularly like that towering Pikachu pixel art you made with Minecraft blocks? Why not place it into the real world and see what it would look like if a giant Pikachu invaded New York City?


It’s definitely a cool idea, but there’s more to Minecraft Reality than simply seeing your Minecraft creations show up in the world around you. You can actually walk around them to view them from all different angles, and if your friends have Minecraft Reality installed on their iDevice, they’ll be able to track down your creations and check them out too. That’s made possible by the app’s use of GPS tracking to remember where in the world you placed your creations, which is an awesome feature if you ask us.
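
As a rough illustration of how that GPS lookup could work (an assumption-laden sketch with made-up field names, not the app’s real code), a client might simply filter the shared placements down to those within some radius of the player’s current fix:

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in metres (haversine)."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_placements(placements, lat, lon, radius_m=50.0):
    """Return placements close enough to the player to be worth rendering."""
    return [p for p in placements
            if distance_m(lat, lon, p["lat"], p["lon"]) <= radius_m]

# A friend standing roughly 30 m from your creation would see it listed:
world = [{"model": "pixel_pikachu", "lat": 40.71280, "lon": -74.00600}]
print(nearby_placements(world, 40.71305, -74.00600))
```

A real service would presumably do this filtering server-side against a spatial index rather than scanning every placement, but the distance check is the heart of the feature.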

You can resize the object before you place it, letting you make sure it fits in with its surroundings well enough before plopping it down in the real world. The app also comes with a few pre-made models ready to be placed, so you can start using it right away even if you don’t have anything of your own waiting to be uploaded. Speaking of uploading, players can head to http://minecraftreality.com to upload their own Minecraft worlds for use in the app. Check out Minecraft Reality in action below.

Minecraft Reality is available now on the iTunes App Store for $1.99 [download link]. The listing on the App Store says it isn’t compatible with the iPod Touch 4G or earlier and that most of the app’s functionality isn’t available on the iPhone 4, so keep that in mind if you’re planning to buy. Do you think you’ll be picking up Minecraft Reality?

[via Mojang]


Mojang launches Minecraft Reality for iOS is written by Eric Abent & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.


Microsoft Patent Hints at Google Project Glass Competition

In the past, we’ve talked about the interesting and odd project from Google called Project Glass. The technology is basically a small wearable augmented reality system with a display embedded into a pair of glasses. A patent application has surfaced from Microsoft that shows the company is at least considering a competing product.


Artwork included with the patent application shows one example use: a view of a baseball game with factoids about the players, including stats, hovering above them as you watch. Another example shows subtitles displayed over an opera performance. Here’s the abstract from the patent application:

A system and method to present a user wearing a head mounted display with supplemental information when viewing a live event. A user wearing an at least partially see-through, head mounted display views the live event while simultaneously receiving information on objects, including people, within the user’s field of view, while wearing the head mounted display. The information is presented in a position in the head mounted display which does not interfere with the user’s enjoyment of the live event.

Microsoft’s offering also isn’t meant to be worn at all times, while Google is hoping we’ll wear its glasses everywhere we go. It appears that Microsoft is tying its device to live events like sports and concerts. The Microsoft product would be able to overlay text and audio onto whatever the wearer is viewing.


The patent application was originally filed in May of 2011 but was updated this week. There’s no indication of the project’s current status at Microsoft; it could be significantly further along, considering it’s been over a year since the application was filed, or it could just be a concept. Whereas Google already has usable prototypes of its system, Microsoft’s appears to be in the planning stages.

You can view Microsoft’s complete patent application here.

[via UnwiredView]


Microsoft Has Plans For Its Own Project Glass

A patent application published yesterday reveals that Microsoft is sitting on plans for its own version of Google’s Project Glass.

Microsoft’s Google Glass rival tech tips AR for live events

Microsoft is working on its own Google Glass alternative, a wearable computer which can overlay real-time data onto a user’s view of the world around them. The research, outed in a patent application published today for “Event Augmentation with Real-Time Information” (No. 20120293548), centers on a special set of digital eyewear with one or both lenses capable of injecting computer graphics and text into the user’s line of sight, such as to label players in a sports game, flag up interesting statistics, or even identify objects and offer contextually-relevant information about them.

The digital glasses would track the direction in which the wearer was looking and adjust the on-screen graphics accordingly; Microsoft also envisages a system whereby eye-tracking is used to select areas of focus within the scene. The information shown could follow a preprogrammed script – Microsoft uses the example of an opera, where background detail about the various scenes and arias could be shown in order – or be generated on an ad-hoc basis, according to contextual cues from the surrounding environment.
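
To make those two modes concrete, here is a minimal sketch (with entirely hypothetical data and function names, not anything taken from the filing) of how an overlay could be chosen either from a timed script or from whatever the eye tracker says the wearer is looking at:

```python
# Hypothetical overlay selection for a head-mounted display, illustrating the
# two modes described in the filing: a timed script vs. contextual lookup.

OPERA_SCRIPT = [          # (start_seconds, caption) in performance order
    (0.0,   "Act I: the square outside the tavern"),
    (180.0, "Aria: 'Che gelida manina'"),
]

PLAYER_FACTS = {          # facts keyed by whichever object the wearer views
    "player_24": "RF, batting .312 this season",
    "player_07": "SS, 21 home runs",
}

def scripted_caption(elapsed_s: float) -> str:
    """Preprogrammed mode: show the latest cue the performance has reached."""
    current = ""
    for start, caption in OPERA_SCRIPT:
        if elapsed_s >= start:
            current = caption
    return current

def contextual_caption(gaze_target: str) -> str:
    """Ad-hoc mode: label whatever the eye tracker says the wearer is viewing."""
    return PLAYER_FACTS.get(gaze_target, "")

print(scripted_caption(200.0))          # -> "Aria: 'Che gelida manina'"
print(contextual_caption("player_24"))  # -> "RF, batting .312 this season"
```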

Actually opting into that data could be based on social network check-ins, Microsoft suggests, or the headset could simply use GPS and other positioning sensors to track the wearer’s location. The hardware itself could be entirely self-contained within the glasses, as per what we’ve seen of Google’s Project Glass, or the display section could be split off from a separate “processing unit” carried in a pocket or worn on the wrist, with either a wired or wireless connection between the two.

In Microsoft’s cutaway diagram – a top-down perspective of one half of the AR eyewear – there’s an integrated microphone (910) and a front-facing camera for video and stills (913), while video is shown to the wearer via a light guide (912). That (along with a number of lenses) works with standard eyeglass lenses (916 and 918), whether prescription or otherwise, while the opacity filter (914) helps improve light guide contrast by blocking out some of the ambient light. The picture itself is projected from a microdisplay (920) through a collimating lens (922). There are also various sensors and outputs, potentially including speakers (930), inertial sensors (932) and a temperature monitor (938).

Microsoft is keeping its options open when it comes to display types: as well as generic liquid crystal on silicon (LCOS) and LCD, there’s the suggestion that the wearable could use Qualcomm’s mirasol or a Microvision PicoP laser projector. An eye-tracker (934) could be used to spot pupil movement, whether via IR projection, an internally-facing camera, or another method.

Whereas Google has focused on the idea of Glass as a “wearable smartphone” that saves users from pulling out their phone to check social networks, get navigation directions, and shoot photos and video, Microsoft’s interpretation of augmented reality takes a slightly different approach in building around live events. One possibility we could envisage is that the glasses might be provided by an entertainment venue, such as a sports ground or theater, just as movie theaters loan 3D glasses for the duration of a film.

That would reduce the need for users to actually buy the (likely expensive) glasses themselves, and – since they’d only be required to last the duration of the show or game – the battery demands would be considerably less than a full day. Of course, a patent application alone doesn’t mean Microsoft is intending a commercial release, but given the company’s apparently increasing focus on entertainment (such as the rumored Xbox set-top box) it doesn’t seem too great a stretch.


[via Unwired View]


Microsoft’s Google Glass rival tech tips AR for live events is written by Chris Davies & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.


Sphero’s New Augmented Reality App Allows You To Walk A Beaver Around Your House


Meet Sharky the Beaver, Sphero’s first augmented reality character for its little robotic ball. Over the past few months, Orbotix has been developing many new ways to use the toy. With this new app, the company is taking its first step into the world of augmented reality.

As a reminder, Sphero is a ball with an internal motor. You can control it using a Bluetooth-enabled smartphone or tablet. It also has a built-in gyroscope, accelerometer and compass.

Now Orbotix is tackling augmented reality with Sharky the Beaver, which transforms your sphere into a cute walking 3D character that you control with your phone or tablet. To see the character, you have to keep the sphere in sight and look at it through your device’s camera.

You can throw virtual cupcakes, and the little beaver will run toward them; in the real world, the sphere rolls toward an invisible cupcake. In the demo, the virtual character was impressively smooth, and everyone seemed to love the idea, if not the toy.
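
The “invisible cupcake” trick boils down to simple geometry. Here is a minimal sketch, assuming the app already knows the tracked ball’s position and the cupcake’s virtual position in the same ground plane (this isn’t Orbotix’s SDK, just an illustration of the idea):

```python
import math

def roll_command(ball_xy, cupcake_xy):
    """Heading (degrees clockwise from 'north'/+y) and distance from the
    tracked ball to a virtual cupcake, both in the camera's ground plane."""
    dx = cupcake_xy[0] - ball_xy[0]
    dy = cupcake_xy[1] - ball_xy[1]
    heading = math.degrees(math.atan2(dx, dy)) % 360.0
    distance = math.hypot(dx, dy)
    return heading, distance

# Ball at the origin, cupcake 1 m to its right: roll at heading 90 degrees.
heading, dist = roll_command((0.0, 0.0), (1.0, 0.0))
print(round(heading), round(dist, 2))  # -> 90 1.0
```

Feed that heading and distance to the ball’s drive controls and the physical sphere chases the same cupcake the on-screen beaver does.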

Sharky is just the first augmented reality application, with new characters and games potentially coming soon. With regular firmware updates, more than 20 apps, and an SDK that fuels Sphero hackathons – and sometimes even results in a Kickstarter campaign – it looks like Sphero is here to stay.


Sphero’s Augmented Reality Engine gets fully realized in Sharky the Beaver (video)


Been keeping up to date with the quirky robotic ball named Sphero? We’ve been wondering when its Augmented Reality Engine would become a full-fledged app ever since we first witnessed it at E3 as a simple 2D tech demo. Well, today is the day that this Android- and iOS-controlled ball takes its first official-release steps into the world of AR — the engine has grown up, powering Orbotix’s latest free app, Sharky the Beaver. While the game itself is still admittedly silly and demo-like, little changed since we saw an early adaptation in August, there’s no question that the AR engine is now in a polished state.

As a refresher, unlike other implementations that require a stationary marker, Sphero serves as one that can move around your area while also relaying information about its position. The 3D character on screen rotates as you spin Sphero and, as you can see above, you can even pick the ball up while it’s being tracked. The frame rate of the tracking in the app looked very smooth, and it does an admirable job keeping track of the ball, even if it ends up off-screen. At this point, gameplay is limited to flicking cupcakes onto the ground that Sharky goes to automatically, and there’s no word on if and when we’ll see the features shown off in the early version (namely, the part where the “Sharky” half of the name was actually a key element, as you chased people on-screen to get their cupcakes). All in all, we’re more curious than anything to see what else the folks at Orbotix will come up with in the realm of AR — for more in the meantime, check out our video hands-on after the break.



Sphero’s Augmented Reality Engine gets fully realized in Sharky the Beaver (video) originally appeared on Engadget on Fri, 16 Nov 2012 11:59:00 EDT.


Google Ingress revealed as massive augmented smartphone adventure

Those of you with Android devices should get mega pumped-up at the word we’re seeing today from Google: a massive game played by everyone with a device that’s ready to take on their environment straight through its back-facing camera. The game goes by the name Ingress and had, until now, only been teased through a viral ad campaign from Niantic Labs. It’s still a bit cryptic, but it appears to have users walking around their cities, finding clues and solving mysteries by tapping locations where they discover hidden properties through their smartphone’s camera.

Today’s update gives a relatively lengthy look at what the game will be all about, keeping with the whole “the world around you is not what it seems” theme the campaign has pushed from the start. The game puts you on one of two teams, both aiming to work with “the power” – energy that will either be cultivated or destroyed by you and your team – sort of like tagging mode in Tony Hawk Pro Skater. No mention has been made as of yet on whether skateboarding will be involved – likely not.

You’ll be grabbing a world energy called “XM” that exists at real locations around your city. We have to assume the game won’t work everywhere in the world – at least at first – unless Google has made it so that every single location on the planet can be tagged. At the moment it appears that heavily populated cities will be handled first and foremost.
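
Purely as a guess at how the location side could work (hypothetical names and numbers, nothing Google has published about Ingress), the client might simply check whether the player is within scanner range of an XM deposit before letting them drain it:

```python
import math

# Hypothetical XM pickup check, assuming deposits sit at real coordinates
# and the scanner has a fixed collection radius.
SCANNER_RANGE_M = 40.0

def metres_between(lat1, lon1, lat2, lon2):
    """Small-distance approximation: treat latitude/longitude as a flat grid."""
    dy = (lat2 - lat1) * 111_320.0                        # metres per degree latitude
    dx = (lon2 - lon1) * 111_320.0 * math.cos(math.radians(lat1))
    return math.hypot(dx, dy)

def collect_xm(player, deposits):
    """Return the XM the player can pick up from deposits within scanner range."""
    gained = 0
    for d in deposits:
        if metres_between(player["lat"], player["lon"], d["lat"], d["lon"]) <= SCANNER_RANGE_M:
            gained += d["xm"]
            d["xm"] = 0                                   # deposit is drained
    return gained

deposits = [{"lat": 37.7793, "lon": -122.4193, "xm": 50}]
print(collect_xm({"lat": 37.7794, "lon": -122.4192}, deposits))  # -> 50
```

That range check is also roughly why the game would roll out city by city: deposits only matter where someone has actually seeded them.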

John Hanke, the man behind Niantic Labs and the project lead on Ingress, spoke this week with AllThingsD about the app, comparing it to World of Warcraft in the way it pulls players into an alternate reality: “The concept is something like World of Warcraft, where everyone in world is playing the same game.” He also added that his team was “definitely inspired by JJ Abrams, but we don’t want to leave people in ‘Lost’ situation where they get into fiction of world but then it never ends.”

Expect great things in the near future – feel free to take a peek at the game right this minute on the Google Play store and let us know what you find! Note, though, that you’ll need to bust past the Closed Beta status – seeya there soon!



Google Ingress revealed as massive augmented smartphone adventure is written by Chris Burns & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.