Maybe it’s all the beautiful clothes and models, but this short film, shot through Google Glass by Diane von Furstenberg and her various Fashion Week models and stylists, is wonderful.
Google Glass’s early luxury-brand pricing appears to have stood it in good stead, with the elite at New York Fashion Week getting an early close-up look at Google’s wearable-camera future. Diane von Furstenberg, who’s no stranger to a tech tie-in, has added the lightweight frames to her latest show, using them to make a documentary about fashion’s creative process. The project is set to appear on von Furstenberg’s Google+ page later this week, but if you’re not a world-renowned fashion designer (or model), we’d be paying more attention to that two-year wait.
Filed under: Wearables
Google Glass makes catwalk debut at New York Fashion Week originally appeared on Engadget on Mon, 10 Sep 2012 04:29:00 EDT. Please see our terms for use of feeds.
Permalink | AllThingsD | DVF (Google+)
There are still quite a few months before those lucky early adopters can get their eager hands (and eyes) on Google’s Explorer Edition set of wearables, but in the meantime the company isn’t wasting any time, building up its team to have the frames as loaded as can be. One of the latest additions to Mountain View’s Project Glass squad is former Rdio and Danger software engineer Ian McKellar, who’d previously worked on the streaming service’s API, among other things. Mum’s the word on what exactly he’ll be tinkering with at the Project Glass laboratories, though we can’t imagine it’ll be anything short of amazing. In case you’d like to dive into his thoughts a little more, you can check out his tweet on the matter at the link below.
Filed under: Misc, Wearables, Software
Former Rdio software engineer joins Google’s Project Glass team originally appeared on Engadget on Tue, 28 Aug 2012 07:44:00 EDT.
Permalink | SlashGear | Ian McKellar (Twitter)
Google gets patent for eye tracking-based unlock system, shifty looks get you access
Look up. Now down. Back up here again? Imagine having to do that every time you wanted to unlock your phone, as this granted Google patent for "Unlocking a screen using eye tracking information" possibly suggests. Okay, it actually looks more like it’s intended for the firm’s super spectacles, which, given their general hands-free nature, makes more sense. The claims are fairly straightforward: unlocking of a device would be granted based on "determining that a path associated with the eye movement substantially matches a path of the moving object". As long as those moving objects aren’t moving too fast, we think we can work with that.
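To make that claim language concrete, here’s a minimal sketch of what "substantially matches" could mean in practice. To be clear, everything in it is our own illustration, not taken from the patent: the function name, the normalized screen coordinates, and the tolerance value are all invented assumptions.

```python
import math

def paths_match(target_path, gaze_path, tolerance=0.15):
    """Return True when the gaze path substantially follows the target path.

    Both paths are equal-length lists of (x, y) points in normalized
    screen coordinates. "Substantially matches" is taken here to mean
    the average point-to-point distance stays under `tolerance` -- an
    assumed interpretation, purely for illustration.
    """
    if len(target_path) != len(gaze_path):
        return False
    total = sum(math.dist(t, g) for t, g in zip(target_path, gaze_path))
    return total / len(target_path) < tolerance

# A gaze track that closely follows a moving dot would unlock the device;
# a wandering one would not.
target = [(0.1, 0.1), (0.3, 0.2), (0.5, 0.4), (0.7, 0.6)]
steady_gaze = [(0.12, 0.11), (0.31, 0.22), (0.49, 0.41), (0.68, 0.58)]
wandering_gaze = [(0.9, 0.1), (0.1, 0.9), (0.5, 0.5), (0.2, 0.2)]

print(paths_match(target, steady_gaze))     # True
print(paths_match(target, wandering_gaze))  # False
```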
Filed under: Wearables
Google gets patent for eye tracking-based unlock system, shifty looks get you access originally appeared on Engadget on Tue, 07 Aug 2012 09:59:00 EDT.
Permalink | USPTO
Google posts video highlights of I/O 2012, for those craving one last sugary fix
Weren’t able to fill up on all the Jelly Bean-flavored geekery that was Google I/O 2012? No matter: you can catch all the highlights, from Project Glass to the Nexus 7, in Google Developers’ latest video, provided you’ve got about four minutes to spare to reminisce. You’ll find the clip after the break, and naturally, we’d suggest landing at our hub for the event if you’re hungry for another fixin’ of our extensive coverage. No parachute required.
P.S. Don’t forget to see if you can spot any Engadget editors in the clip while you’re at it!
Google posts video highlights of I/O 2012, for those craving one last sugary fix originally appeared on Engadget on Thu, 26 Jul 2012 18:22:00 EDT.
Editorial: Engadget on EyeTap, Project Glass and the future of wearable cameras
Summer in Paris: you can’t walk a block on the Champs-Élysées without locking eyes with at least one camera-equipped tourist. But Steve Mann’s shooter wasn’t dangling from his shoulder and neck; it was mounted on his head, with a design strikingly similar to Google’s Project Glass. Unlike that mainstream Mountain View product, however, Mann’s version has reportedly been around in one form or another for 34 years, and was designed with the objective of aiding vision rather than capturing stills and video or providing a bounty of database-aided readouts. It’s also street-ready today. While on vacation with his family, the Ontario-based "father of wearable computing" was sporting his EyeTap as he walked down the aforementioned French avenue, eventually entering a McDonald’s to refuel after a busy day of sightseeing. He left without his ranch wrap, but with seriously damaged hardware.
What allegedly occurred inside the restaurant is no doubt a result of the increasing presence, and subsequent awareness, of connected cameras, ranging from consumer gear to professional surveillance equipment. As Mann sat down to eat, he writes, a stranger approached him and attempted to pull off his glasses, which, oddly, are permanently affixed to his skull. The man, at that point joined by another patron and someone who appeared to be a McDonald’s employee, then pushed Mann out of the store and onto the street. The attack caused the eyewear to malfunction, which resulted in the three men being photographed. It wouldn’t be terribly difficult for police to identify those involved, but this encounter may have greater implications. McDonald’s has since launched an investigation into the matter and seems to be denying most of the claims, but it’ll be some time yet before the full truth is uncovered. Still, the whole ordeal got us at Engadget thinking: is the planet ready for humans to wear video recorders, and will it ever shake a general unease over the threat of a world filled with omnipresent cameras? Join us past the break for our take.
Filed under: Digital Cameras, Displays
Editorial: Engadget on EyeTap, Project Glass and the future of wearable cameras originally appeared on Engadget on Wed, 18 Jul 2012 13:41:00 EDT.
Permalink | Signs of the Times, SlashGear
Watch out, Google: here comes Olympus with the MEG4.0. And don’t dismiss this as a Google Glass knockoff; Olympus has been researching and developing wearable displays for more than 20 years. The MEG4.0 concept, and with it its eventual production counterpart, has been a long time coming and could be a serious competitor in the space.
Olympus made it clear in today’s announcement that the 30g MEG4.0 is both a prototype and a working name. The stem-like system sits on one side of the glasses and connects to a tablet or phone over Bluetooth. A 320 x 240 virtual screen floats above the wearer’s eye line. The MEG4.0 is designed for all-day use and should last eight hours on a charge, although Olympus states the glasses are meant for bursts of use, 15 seconds at a time.
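For a sense of how an eight-hour figure and 15-second bursts can coexist, here’s a back-of-envelope duty-cycle model. Only the burst length comes from Olympus’ announcement; the battery capacity and power draws below are purely hypothetical placeholders we made up for illustration.

```python
# Rough duty-cycle battery math. All three constants are invented
# illustrative numbers, NOT Olympus specs.
BATTERY_MWH = 400.0  # hypothetical battery energy (milliwatt-hours)
IDLE_MW = 40.0       # hypothetical draw with the display dark
ACTIVE_MW = 250.0    # hypothetical draw during a display burst

def runtime_hours(bursts_per_hour, burst_seconds=15.0):
    """Estimated hours of runtime at a given rate of display bursts."""
    active_frac = bursts_per_hour * burst_seconds / 3600.0
    avg_mw = ACTIVE_MW * active_frac + IDLE_MW * (1.0 - active_frac)
    return BATTERY_MWH / avg_mw

# A glance every two minutes keeps the display off ~96% of the time.
print(f"{runtime_hours(30):.1f} h")
```

The point of the sketch is just that short bursts keep the average draw close to idle, which is how a small head-mounted battery can stretch across a working day.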
Google isn’t the only player in the augmented reality game; in fact, several companies, including Olympus, have toyed with the concept for the last few years. The company introduced a working set of AR glasses back in 2008. Called the Mobile Eye-Trek (shown above), the glasses were designed to be worn on a daily basis, feeding information like email to the wearer on a virtual screen placed 50cm in front of the eyes and appearing as large as a 3.8-inch display.
While the Mobile Eye-Trek never hit the retail market, Olympus indicated at the time that the prototype would lead to a production version by 2012.
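That "3.8-inch screen at 50cm" claim is easy to sanity-check with basic geometry: the apparent size of a display is just the angle it subtends at the eye. The arithmetic below is our own, using only the two figures quoted above.

```python
import math

# How big does a 3.8-inch-diagonal screen look from 50 cm away?
diagonal_cm = 3.8 * 2.54   # 3.8-inch diagonal converted to centimeters
distance_cm = 50.0

# Angle subtended at the eye by the screen's diagonal.
angle_deg = math.degrees(2 * math.atan((diagonal_cm / 2) / distance_cm))
print(f"apparent diagonal: {angle_deg:.1f} degrees")
```

That works out to roughly 11 degrees of visual angle, about what a smartphone held at arm’s length covers, which is a plausible size for a glanceable email readout.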
However, much like Google, Olympus is not revealing the user interface yet. If the MEG4.0 is to be a success, the interface, and more importantly the depth of the information available, needs to be as mature as Google Glass’s. Price and availability were not announced.
Google patent filing would identify faces in videos, spot the You in YouTube
Face detection is a common sight in still photography, but it’s a rarity in video outside of certain research projects. Google may be keen to take some of the mystery out of those clips through a just-published patent application: its technique uses video frames to generate clusters of face representations that are attached to a given person. By knowing what a subject looks like from various angles, Google could then attach a name to a face whenever it shows up in a clip, even at different angles and in strange lighting conditions. The most obvious purpose would be to give YouTube viewers a Flickr-like option to tag people in videos, but it could also be used to spot people in augmented reality apps and pull up their details; imagine never being at a loss for information about a new friend as long as you’re wearing Project Glass. As a patent filing, it’s not a definitive roadmap for where Google is going with any of its properties, but it could be a clue to the search giant’s thinking. Don’t be surprised if YouTube can eventually prove that a Google+ friend really did streak across the stage at a concert.
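For the curious, the clustering idea at the heart of the filing can be sketched in a few lines: group per-frame face representations by similarity, so one person accumulates views from many angles. The toy embeddings, the threshold, and the greedy strategy below are all our own illustrative assumptions, not anything Google describes.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def cluster_faces(embeddings, threshold=0.9):
    """Greedily group per-frame face embeddings into identity clusters.

    Each cluster keeps every view assigned to it, so the same person can
    later be matched from a new angle against any stored representation.
    """
    clusters = []  # each cluster is a list of embeddings for one person
    for emb in embeddings:
        for cluster in clusters:
            if any(cosine(emb, member) >= threshold for member in cluster):
                cluster.append(emb)
                break
        else:
            clusters.append([emb])
    return clusters

# Two noisy views of one face, plus one very different face.
frames = [(1.0, 0.1, 0.0), (0.95, 0.15, 0.05), (0.0, 0.2, 1.0)]
print(len(cluster_faces(frames)))  # 2
```

A real system would use learned face embeddings rather than hand-made vectors, but the grouping step, matching a new detection against every stored view of each cluster, is the part that lets a tag survive angle and lighting changes.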
Google patent filing would identify faces in videos, spot the You in YouTube originally appeared on Engadget on Tue, 03 Jul 2012 15:11:00 EDT.
Permalink | USPTO
Last week’s awesome Google Glass demo proved that strapping a decent-quality camera to a person’s face makes for compelling first-person viewing. Funnily enough, the porn industry thought the exact same thing.