Google places strict NDA on Project Glass Hackathon events

There’s understandably a huge amount of buzz surrounding Google’s Project Glass, and the fact that the company is holding two developer events at the end of the month isn’t helping to quell the excitement. Sadly, if you were waiting for all kinds of details to come out of these developer events, we’ve got some sour news for you. Google has placed a rather strict non-disclosure agreement on the events, and everyone who attends will have to sign one.


Obviously, this means that those who are in attendance won’t be able to talk about what they saw while they were at the event, but Google has placed a few additional restrictions we don’t always see in non-disclosure agreements. Any photos or video attendees take while at the events become the property of Google, so there probably aren’t going to be very many pictures floating around after the events take place.

ReadWriteWeb points out that developers at the show won’t even get to use their own Google+ profiles to test out Project Glass. Instead, Google will supply special developer accounts for them to use, and you can bet that those will be all locked up as well. In short, Google wants to keep as much information about Project Glass under wraps as it can, and we’re thinking there won’t be many developers willing to leak information while their names are on these strict NDAs.

Interestingly enough, it seems that some people attending the “Glass Foundry” might be able to take a Google Glass device home with them for additional testing. Of course, that doesn’t invalidate the NDA in any way, but it’s exciting to think that some people might be leaving the event with Project Glass in-hand. In any case, it seems that those craving more details on Project Glass will have to wait even longer, as you’ll be hard-pressed to find someone willing to share new information after these events wrap up.

[via Mashable]


Google places strict NDA on Project Glass Hackathon events is written by Eric Abent & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

Google Glass bone-conduction increasingly possible with indirect audio patent

Signs that Google is using bone-conduction for private audio from its Project Glass headset continue to mount, with a new patent application from the company describing exactly how the discreet system might work. The patent filing, “Wearable computing device with indirect bone-conduction speaker,” uses the same basic Google Glass diagrams we’ve seen in other recent wearables patents, but this time details “at least one vibration transducer,” the movements of which are passed through the headset and into the wearer’s bone structure.


That way, unlike other bone-conduction systems we’ve seen – such as in Bluetooth headsets – there’s no need for a direct point of contact between the conduction speaker itself and the wearer. Instead, since at least the ear-hooks and nose pad will be touching the user, they’ll be relied upon to pass any vibration-based audio through to the wearer.


The advantage of bone-conduction as a technique rather than, say, earphones or a small, traditional speaker, is privacy and convenience. Earpieces block out external sounds, while speakers – even at low volume – can be heard by those nearby; with bone-conduction, only the wearer can hear the audio, and it won’t distract even those standing right beside them.

Chatter of such a system being intended for Google Glass began back in December, when insiders claimed the headset funnelled audio directly to the mastoid process. At the time, we speculated that the protruding lozenge of plastic seen near the battery section on the inner surface of Glass in the photo below might be the contact point of the bone-conduction system.

[Image: Project Glass prototype, showing the protruding plastic lozenge near the battery section on the headset’s inner surface]

However, judging by Google’s patent application, the speaker needn’t actually make direct contact at all (though it’s possible that, if Glass does indeed use bone-conduction, the prototype shown in the image doesn’t have the indirect system installed). Google outlines multiple arrangements for the speaker system, in fact, so there are various ways it could implement it.

We’ll presumably know more when the first developers get to play with Glass Explorer Edition headsets at Google’s Glass Foundry events, the developer workshops kicking off in New York at the end of the month.

[via Engadget]


Google Glass bone-conduction increasingly possible with indirect audio patent is written by Chris Davies & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

Google’s Acquisition of Motorola Is Still a Bust And Will Be For a While

During Google’s earnings call today, executives made it clear that the acquisition of Motorola comes with quite a bit of baggage. More specifically, Google has “inherited a 12-18 month product pipeline” that Google CFO Patrick Pichette says the company is “still working through.” Yikes.

Brin: Google Glass Explorer Edition will ship “in a couple of months”

Google’s Project Glass Explorer Edition, the $1,500 limited-edition developer version of the wearable computer, will ship “in a couple of months,” Sergey Brin has confirmed, after being spotted wearing a prototype headset in NYC this week. Brin, who has been a significant motivator for Google’s augmented reality and wearables R&D, revealed the rough timescale to Noah Zerkin, who recognized the Google co-founder on the NYC subway. The exec also touched upon how many Glass prototypes are in the wild.


According to Brin, who acknowledged that he was a part of the core Google X Lab responsible for developing Project Glass in addition to other high-concept research such as self-driving cars, around one hundred people outside of the X team currently have wearable prototypes. Exactly how many people are employed in Google’s R&D-centric X division is unclear.

Google took preorders for Project Glass Explorer Edition back at Google IO 2012, offering keen developers the opportunity to secure a unit in return for the not-inconsiderable sum of $1,500. However, select developers on the wait-list for a Glass headset will get an early opportunity to play with the wearable, as Google kicks off its Glass Foundry developer events later this month.

The first of these events, in New York City (a second, in San Francisco, will take place at the start of February) will see Google outline its new Mirror API, which will bridge the cloud and Glass headsets and allow developers to feed information from their apps to the wearable. However, attendees won’t be able to take away the Glass prototypes themselves.
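Google hadn’t published documentation for the Mirror API when this was written, so any specifics are guesswork; still, as a rough sketch of the cloud-to-headset push described above, a developer’s web service might insert a “card” into the wearer’s timeline with an authenticated HTTP request. The endpoint URL, payload fields, and token handling below are assumptions for illustration only, not confirmed details of Google’s API.

```python
# Illustrative sketch only: the Mirror API's real endpoints and fields had not
# been published at the time. Endpoint, payload fields, and OAuth handling
# here are assumptions for demonstration purposes.
import json
import requests

MIRROR_TIMELINE_URL = "https://www.googleapis.com/mirror/v1/timeline"  # assumed endpoint


def push_card_to_glass(access_token: str, message: str) -> dict:
    """Push a simple text card from a cloud app to a Glass timeline (hypothetical)."""
    headers = {
        "Authorization": f"Bearer {access_token}",  # OAuth 2.0 bearer token (assumed)
        "Content-Type": "application/json",
    }
    payload = {
        "text": message,                      # card body shown on the headset
        "notification": {"level": "DEFAULT"}  # ask the headset to alert the wearer (assumed field)
    }
    response = requests.post(MIRROR_TIMELINE_URL, headers=headers, data=json.dumps(payload))
    response.raise_for_status()
    return response.json()


# Example: a weather app nudging the wearer's timeline.
# push_card_to_glass(token, "Rain expected in 20 minutes")
```

Whatever the exact fields turn out to be, the architecture is the notable part: the app never talks to the headset directly, it talks to Google’s cloud, which relays cards down to Glass.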

Google previously said that it expects to have consumer versions of Glass on the market less than twelve months after the Explorer Edition units start shipping. Pricing for the mass-market version is unknown, though Brin has indicated that it would be “significantly” cheaper than the $1,500 developer kit.


Brin: Google Glass Explorer Edition will ship “in a couple of months” is written by Chris Davies & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

Will Wearables Fuel – or Fracture – Convergence?

The candid snapshot of Google exec Sergey Brin, riding the subway on a $2.25 fare while sporting a Glass prototype worth thousands of dollars, has reignited questions around ubiquitous computing. That sighting of Brin is a timely one. Not only is Google’s Glass Foundry developer schedule kicking off at the end of January, but several other wearables projects have reached milestones this month; Vuzix brought out prototypes of its Glass rival a few weeks back, while Kickstarter success Memoto applied some extra-sensor balm to the sting of an unexpected hardware delay today.

As each project tracks toward release, however, the ecosystem of more straightforward body-worn gadgetry – activity monitors like Jawbone’s UP – is picking up ahead of what’s predicted to be a bumper year of sales. Still, amid sensor ubiquity and the specter of power paucity, the fledgling wearables industry apparently hasn’t decided whether it’ll face this brave new augmented world hand-in-hand, or jealously guarding its data.


[Original Sergey Brin image via Noah Zerkin]

Project Glass and Memoto both take photos, but otherwise they come at the wearables space in very different ways. The Google headset shoots stills and video on-demand, but isn’t – as far as we know – intended for permanent streaming. Memoto’s camera, however, is intended as a life-logging tool, periodically snapping shots and tagging them with location and direction; earlier today, the team behind the project confirmed there’d now be a digital compass in there too. Other wearables, such as UP and other digital activity monitors, take their own routes to your wrist, jacket lapel, or elsewhere on the body.

Though the ethos may be different, much of the hardware is the same. Headset, wearable camera, and wrist-worn pedometer-on-steroids all have motion sensors; both Glass and Memoto have digital compasses and GPS. There’s a huge degree of overlap, even more when you factor in that most users of wearables will also be carrying a smartphone, with its own battery of sensors and radios.

So, with Memoto’s new-found digital compass, how does its hardware differ from that of an UP, or Fitbit’s Flex? All three have the ability to monitor patterns of movement and figure out if you’re running, or walking, or sleeping; all that’s missing is the software to do the crunching of that data on the camera. Why should tomorrow’s wearables enthusiast carry two, or three, or more accelerometers and magnetometers, when the data from one is sufficient?
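None of the products named here have published their classification code, so purely as a toy illustration of the point: with a single accelerometer stream, a few lines of windowed statistics are enough to make a rough still/walking/running guess. The thresholds below are invented for demonstration and bear no relation to what UP, Flex, or Memoto actually use.

```python
# Toy illustration: rough activity guess from one accelerometer stream.
# Thresholds are invented for demonstration; real products tune these carefully.
import math


def classify_window(samples):
    """samples: list of (ax, ay, az) readings in m/s^2 covering a few seconds."""
    magnitudes = [math.sqrt(ax**2 + ay**2 + az**2) for ax, ay, az in samples]
    mean = sum(magnitudes) / len(magnitudes)
    variance = sum((m - mean) ** 2 for m in magnitudes) / len(magnitudes)

    if variance < 0.05:    # almost no movement beyond gravity
        return "sleeping/still"
    elif variance < 4.0:   # regular, moderate swings: walking cadence
        return "walking"
    else:                  # large, rapid swings
        return "running"


# Example: a perfectly still device reads roughly (0, 0, 9.81) every sample.
print(classify_window([(0.0, 0.0, 9.81)] * 50))  # -> "sleeping/still"
```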

Of course, sharing sensors is only one element of what convergence demands: there’s a bigger compromise to be made, when fewer gadgets perform more tasks. Battery life continues to be the bane of the consumer electronics world, and that headache is only going to be magnified when it comes to body-worn technology. A hefty smartphone with a big screen and a 3,000mAh+ battery might be acceptable in your jacket pocket, but a power pack of that size simply isn’t going to fly when you’re wearing it on the side of your head.

“The Personal Area Network is inescapable”

In many ways, then, the PAN – or Personal Area Network – is inescapable. The early iterations of wearables are naively insular in their approach: they try to do everything themselves, with little reliance on, and few expectations of, the other gadgetry on your person. Take, for example, Vuzix’s Smart Glasses M100, a prototype of which we played with at CES earlier this month. Inside the chunky headset there’s a full Android computer, with all the connectivity you’d expect from a reasonably recent smartphone, bar the cellular data.


That makes for a wearable with impressive standalone abilities, but also one that’s greedy for power. Vuzix’s headline estimate is up to eight hours of “typical use”; however, what’s “typical” in the manufacturer’s opinion is sporadic activation summing just two hours in total, or even half that if you want to use both display and camera. All that despite the fact that your smartphone – which you’ll probably need anyway, since Vuzix supplies a remote control app to more easily navigate the M100’s apps – has a processor, battery, radios, sensors, and other hardware already.

Bluetooth 4.0, the most power-frugal iteration of the technology, may go some way to popularizing PANs. Still, that’s just the virtual cable: the glaring omission is any sort of wearables standardization, which would allow your eyepiece from manufacturer X to output information from smartphone Y, having called upon sensors Za, Zb, and Zc dotted around your body (not to mention those spread through the environment around you).

Predictions have it that the wearables market will explode over the next 4-5 years, beginning with more humble tech like activity-tracking bracelets but building to Glass-style headsets once the technology catches up with affordable pricing. That may well be the case, but it will take more than slick hardware and project execs who drink the Kool-Aid to motivate the industry. We’ve put up with siloed ecosystems in smartphones, and stomached it in tablets, but if wearables are to succeed the consumer electronics industry will need to set aside its appetite for insularity and embrace openness in augmentation.


Will Wearables Fuel – or Fracture – Convergence? is written by Chris Davies & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

Sergey Brin Spotted With Project Glass On NYC Subway

When you are one of the richest men in the US, getting around on public transport would surely get you identified quite easily, so it is a pleasant surprise for the geek world to see Sergey Brin, who normally guards his privacy well, riding the NYC subway while wearing one of his pet projects, the long-awaited Google Glass – albeit in prototype form. Project Glass was announced last year at the Google I/O conference, and it has yet to make its way to the mass market, but I guess it makes perfect sense to make sure it is perfected instead of rolling it out half-baked.

Sergey Brin apparently ran into Brooklyn resident and augmented reality enthusiast Noah Zerkin, and the latter struck up a short conversation that he is sure to treasure for the rest of his life. Well, there is still little that the public knows about Google Glass, and hopefully, when developer events for Google Glass kick off in full swing next week, more will be revealed to the world.

By Ubergizmo.

Sergey Brin spotted on NYC subway rocking Google Glass

When you’re Sergey Brin, you can afford to take a limo through the streets of NYC, though the photo opportunities for your Google Glass headset are probably more plentiful on the subway. Augmented reality enthusiast Noah Zerkin spotted Brin on the downtown 3 train, complete with a surprisingly discreet black Glass wearable, in the latest in-the-wild sighting of Google’s head-worn computer.


It’s not the first NYC sighting, either; last month, an unknown man wearing an altogether more eye-catching red Glass headset was caught on camera. That version also included what appeared to be prescription lenses, reassuring eyeglass wearers that they wouldn’t necessarily be left out of the augmented reality fun.

Exactly what Brin was using the headset for is unclear, but its cloud-based functionality is likely to have been significantly curtailed since he was out of signal range. Exactly which features Glass can carry out when isolated from a network, and which rely on access to the Mirror API, have not been explained, but uploading photos and video to Google+ would certainly be out of the question while Brin was on the subway.

The sightings come as Google prepares for a pair of developer events – one in New York, another in San Francisco – at the end of January, at which select coders will get their first chance to cook up apps and features for Glass.

Google is yet to confirm when the first Glass “Explorer Edition” headsets – the $1,500-apiece, early-access units sold to developers at Google I/O last year – will be shipped, and it seems the units at the developer events this month won’t be handed out for attendees to actually take home.

[via The Next Web]


Sergey Brin spotted on NYC subway rocking Google Glass is written by Chris Davies & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

When Sergey Brin Rides the Subway, He Does It Wearing Project Glass

Noah Zerkin spotted Sergey Brin riding New York’s downtown 3 train last night – and the man from Google happened to have a pair of the company’s glasses strapped to his face. Quite why he needs to wear them on public transport is unclear, but combined with his outfit they make him look like he should feature in the next Mission: Impossible movie. [Noah Zerkin via TNW]

Google Glass sees laser-projected keyboard possibilities

This week a patent application filed by Google has come to light, describing what very much appears to be a laser-projected set of controls emanating from a pair of smart glasses. This could mean that Google’s Project Glass is about to gain some remarkable virtual controls as the company’s two upcoming developer events take place at the start of next month. And this isn’t the first time we’ve seen wild, futuristic control possibilities for Google’s Glass, either!


The patent we’re seeing today looks at first to be rather similar to what we’ve seen in the 2012 version of Total Recall – but this phone isn’t embedded into our hand, it’s projected onto it. Here we’re shown a phone-number dial pad projected onto a human hand, as well as a series of numbers projected onto a human arm. Of course, the imagery is vague enough that it could be any sort of projection-friendly glasses making this all possible, but really, this would be a perfect fit for Glass.


The patent also covers physical interaction with such a system, meaning you’d be able to use your second hand to tap numbers on your first, with the system recognizing the interruption in the projected light and reacting accordingly. The patent was spotted by Unwired View this week, while the months leading up to it have revealed several other proposed means of control for Google’s futuristic pair of goggles.
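The patent doesn’t disclose implementation details, so purely as an illustration of the principle – a fingertip occluding part of the projected pattern shows up as a brightness drop in the camera’s view of that key – here is a minimal sketch. The key layout, threshold, and frame format are all assumptions, not anything taken from Google’s filing.

```python
# Illustration only: detect a "tap" on a laser-projected key by spotting the
# brightness drop where a finger occludes the projection. Regions, threshold,
# and frame format are assumptions, not details from Google's patent.
import numpy as np

# Hypothetical layout: key name -> (row_start, row_end, col_start, col_end) in the camera frame.
KEY_REGIONS = {"1": (10, 30, 10, 30), "2": (10, 30, 40, 60), "3": (10, 30, 70, 90)}


def detect_tapped_keys(baseline_frame: np.ndarray, current_frame: np.ndarray,
                       drop_threshold: float = 40.0) -> list:
    """Compare each key's mean brightness against an unoccluded baseline frame."""
    tapped = []
    for key, (r0, r1, c0, c1) in KEY_REGIONS.items():
        baseline = baseline_frame[r0:r1, c0:c1].mean()
        current = current_frame[r0:r1, c0:c1].mean()
        if baseline - current > drop_threshold:  # projection blocked -> finger over the key
            tapped.append(key)
    return tapped


# Example with synthetic grayscale frames: darken key "2" to simulate a fingertip.
baseline = np.full((100, 100), 200, dtype=np.float32)
frame = baseline.copy()
frame[10:30, 40:60] = 80
print(detect_tapped_keys(baseline, frame))  # -> ["2"]
```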


One of those instances showed a set of six patents covering the near-final shape of Project Glass, as well as Kinect-style motion control using rings or temporary tattoos. Another described tap controls on the sides of the glasses, along with virtual controls via swipes through the space in front of the wearer.

You’d do well to have a peek at our archive of Google Project Glass articles as we head toward our first up-close and personal experience with the developer-ready unit at the start of February. Expect more action from Google at Mobile World Congress 2013 as well!


Google Glass sees laser-projected keyboard possibilities is written by Chris Burns & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

Did You Preorder a Set of Project Glass Explorer Editions?

Later this month and early next month, Google is planning two hackathons in San Francisco (1/28-1/29) and New York (2/1-2/2) that they’re calling “Glass Foundry” for any developer who preordered a set of Glass from last year’s I/O conference.