Google Announces Project Tango

Google Maps is already one of the most widely used mapping applications around, and it looks like it is about to get better, thanks to a new Google effort called Project Tango that was recently announced by Sundar Pichai. What is Project Tango, you ask? It is a prototype phone built by Google that can learn its surroundings, such as the dimensions and layout of a room, simply by being carried and moved through the space. For example, as the user walks around their house with the device, it will learn the shape of the home, which will help create a more robust and precise map and, in turn, should make Google Maps more precise in the future.
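Google hasn't published how Tango builds these maps, but the general idea – fusing depth readings with the device's estimated pose as it moves through a room – can be sketched in a few lines. Everything below (the pose representation, the frame format, the function names) is an illustrative assumption, not Tango's actual pipeline:

```python
# Hypothetical sketch: accumulate a rough 3D map from depth readings taken
# as a device moves through a room. Names and data layout are illustrative
# only; this is not Google's actual Tango pipeline or API.
import numpy as np

def pose_matrix(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    transform = np.eye(4)
    transform[:3, :3] = rotation
    transform[:3, 3] = translation
    return transform

def accumulate_point_cloud(frames):
    """Merge per-frame depth points (given in device coordinates) into a single
    world-space point cloud, using each frame's estimated device pose."""
    world_points = []
    for device_points, rotation, translation in frames:
        transform = pose_matrix(rotation, translation)
        # Lift Nx3 points to homogeneous coordinates, then move them from the
        # device's frame into a shared world frame.
        homogeneous = np.hstack([device_points, np.ones((len(device_points), 1))])
        world_points.append((transform @ homogeneous.T).T[:, :3])
    return np.vstack(world_points)

# Two synthetic frames captured from slightly different device positions.
frame_a = (np.array([[0.0, 0.0, 2.0], [0.5, 0.0, 2.0]]), np.eye(3), np.zeros(3))
frame_b = (np.array([[0.0, 0.0, 1.5]]), np.eye(3), np.array([0.0, 0.0, 0.5]))
print(accumulate_point_cloud([frame_a, frame_b]))  # three points, one shared frame
```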

  • Google Announces Project Tango original content from Ubergizmo.

        



    A Google Now HTC Smartwatch could conquer context

HTC, so the rumor machine goes, is readying its first smartwatches, at least one of which is said to pipe Google Now directly to the wrist. If you’re going to …

    HTC One 2 (M8) specifications in short supply at FCC

It’s nearly time for the next HTC One to be released in a reboot only 2014 could handle, and the device has hit the FCC. This device will be released …

    Google Project Tango will 3D map the whole world with hiDOF

Today Google announced Project Tango, delivered first as a smartphone that’s able to scan the world around it in real time. This phone knows its position in space in real …

    Google Project Tango: 200 phones with 3D sensors for room-scanning

This week the experimental developer-aimed group known as Google ATAP – aka Advanced Technology and Projects (skunkworks) – has announced Project Tango. They’ve suggested Project Tango will appear first as a …

    And Now For Your Smartphone’s Next Trick: Seeing And Understanding, Courtesy Of Google


    Your smartphone hosts a bevy of sensors that do many things within its sleek case. But thanks to a new project, dubbed Tango, by Google’s Advanced Technology And Projects (ATAP) group, your next one might have one more superpower: visual spatial awareness. As in, your next smartphone might be able to not only see, but also to understand its surroundings.

    If that seems like something out of science fiction, that’s because it very much could be – the AI assistant in Her, for instance, manages to be so convincingly human because it shares an awareness of the user’s world, including recognizing the environment and putting that into the proper context. Google’s new prototype hardware and development kit, with its Myriad 1 Movidius Vision Processor Platform, is an attempt to give mobile devices exactly this type of “human-like understanding of space and motion,” according to ATAP Program Lead Johnny Lee.

    This Isn’t Just A Fancy Camera

    To be clear, this isn’t just a new camera for smartphones – it’s more like the visual cortex of the brain, made into a device component like any accelerometer or gyro currently found in smartphones. And to some extent, that’s the battleground for next-gen mobile technology; Apple has its M7 motion coprocessor, for instance, and Qualcomm is building advanced camera processing tech into its SoCs that will allow users to alter focus on pictures after the shot and do much more besides.

    But Google’s latest experiment has the potential to do much more than either of these existing innovations. Computer vision is a rich field of academic, commercial and industrial research, the implications of which extend tendrils into virtually every aspect of our lives. Better still, the tech from Google’s partner used to make this happen is designed specifically with battery life as a primary consideration, so it can be an always-on technology rather than something that can only be called upon in specific situations.

    Contextualizing Requests

    So what will this mean in terms of user experience? It’s trite but true to say “expect big changes,” but at this stage the push is to get developers thinking about how to make use of this new tech, so it’s still too early to say exactly what kind of software they’ll build to put it into practice. One thing’s clear, though – context will be key.

    Google Now has provided some hint of what’s possible when a smartphone has a thorough understanding of its user, and what said user’s needs might be, given time, location and inputs including email, calendar and other overt signals. Combined, those can present a pretty good picture of what we call ‘context,’ or the sum total of circumstances that make up any given situation. But ultimately, as they operate currently, your smartphones are effectively working within black boxes, with pinholes cut out sporadically across their surface, letting through shafts of light that partially illuminate but don’t necessarily truly situate.

    With an understanding of surroundings, a virtual personal assistant could know that, despite being in the general vicinity of a bus stop, you’re currently shaking hands with a business contact and dropping your bag ahead of sitting down for coffee. In that situation, providing local bus arrival times isn’t as important as calling up the email chain confirming the meeting in question.
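    The bus-stop example is essentially a prioritization problem: several plausible suggestions compete, and visually inferred context should decide which one wins. A minimal sketch of that idea, with entirely hypothetical signal names and scoring, might look like this:

```python
# Hypothetical sketch of the prioritization described above: rank assistant
# suggestions by how well they match the observed context, letting visually
# inferred activity outweigh raw location. All names and scores are invented.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Context:
    near_bus_stop: bool
    in_meeting: bool                    # e.g. inferred visually: handshake, sitting down
    upcoming_event: Optional[str] = None

def rank_suggestions(ctx: Context) -> List[str]:
    scored = []
    # Location alone would favor transit info...
    if ctx.near_bus_stop:
        scored.append((1, "Show local bus arrival times"))
    # ...but an observed meeting should outrank it.
    if ctx.in_meeting and ctx.upcoming_event:
        scored.append((3, f"Open the email thread for '{ctx.upcoming_event}'"))
        scored.append((2, "Silence notifications"))
    # Highest score first.
    return [action for _, action in sorted(scored, reverse=True)]

print(rank_suggestions(Context(near_bus_stop=True, in_meeting=True,
                               upcoming_event="Coffee with business contact")))
```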


    But the value of visual awareness to virtual personal assistants is just one example, and one that’s easy to grasp given recent fictional depictions of the same. Knowing where a phone is being used could also help bring more far-out concepts to life, including games that change settings, surroundings and characters depending on where they’re played; situational advertising that interacts with nearby multimedia displays and addresses/engages a user personally based on where they are and what types of products they happen to be looking at; and even macro-level settings changes that make sure your phone is prepared to suit your needs given your current circumstances.

    As for that last example, it could help usher in the type of dynamic mobile OS that Firefox, Google and others have clearly been toying with. Contextual launchers, I argued on a recent TechCrunch Droidcast, aren’t ready for primetime because too often they get the context wrong; again, they’re working inside a black box with only brief and sporadic glimpses to make sense of the world around them, and this new tech stands to bring them out into the open. The result could be devices that don’t need to be manually silenced, or that automatically serve up the right home screen or app for what you need without having to be told to do so.

    Immense Data Potential

    Of course, Google wouldn’t be Google if this project didn’t have a data angle: the type of information that could potentially be gleaned by a device, carried everywhere by a user, that can not only see but also make sense of its surroundings is tremendously powerful.

    Google’s entire business is built around its ability to know its customers, and to know what they want to see at any given time. The search engine was monetized on the back of extremely targeted, web-based text ads that are highly relevant to whatever query a user types into the engine – an almost foolproof sign that the user is interested in that topic. It sounds like a no-brainer now, but at the dawn of search this simple equation struck the entire ad industry like a lightning bolt, and it continues to drive behemoth revenue for Mountain View.

    Even Google’s famously ambitious “moonshots” all have a thread of that original goal behind their impressive facades of consumer potential, and Tango is no exception. Stated intent, like that expressed by people Googling things, is only one part of the equation when it comes to sussing out a consumer’s desires – anticipating needs before they arise, and understanding needs that a person might not even be aware they have, make up the broader blue sky opportunities for a company like Google. In that context, having a contextually aware smartphone that can observe and interpret its surroundings is almost like putting a dedicated market researcher in the room with any given shopper, at any given time.

    As with every noteworthy mobile tech development of the past decade at least, Project Tango will seek greater access to a user’s life in exchange for more and better services rendered. And once developers start showing us what’s possible when a smartphone can understand where it is and what’s going on, I’m willing to bet users will find the cost in data perfectly acceptable.


    Google’s Not Alone

    Google isn’t the only company that’s working on perceptive mobile devices, and it won’t be the only one that helps bring gadgets with visual intelligence to market. Apple acquired PrimeSense, a company that builds motion-sensing tech and helps map 3D space, last year. Qualcomm purchased GestureTek, a similar company, back in 2011.

    Location-based tech seemed sci-fi when it was first introduced to mobile devices, but now it’s de rigueur. The same will be true of contextual awareness with the devices we buy tomorrow. Google is the first one to start putting this power in the hands of developers to see what that might mean for the future of software, but it won’t be the only one. Expect to see development come fast and furious on this new frontier in mobile tech, and expect every major player to claim a seat at the table.

    Illustrations by Bryce Durbin

    Archos expands off-contract lines for MWC 2014

    This week the folks at Archos have let it be known that they’re not about to stop producing low-cost smartphones and tablets for the masses. These devices expand on Archos …

    Google Tango: An Experimental Android Phone That Maps Your Whole World

    Google’s new Project Tango is a limited-run experimental phone that’ll be handed out to 200 lucky developers next month. It’s got Kinect-like vision and a revolutionary chipset, enabling phones to see the world in a whole new way. A bit like a really sophisticated 3D scanner in a phone. It’s pretty nuts.



        



    Google Wi-Fi App Could Make Hotspot Access Easy

    The folks over at Google do seem to be on a constant quest to innovate and change things for the better, and it looks as though they are now working on a new Wi-Fi app that could make life easier when one wants to access wireless hotspots. Apparently, Google has already come up with Android and iOS versions of an app that is smart enough to authenticate and connect to its free hotspots automatically within a Starbucks store, or just about anywhere else they happen to be available. The Android app is currently undergoing trials at Google’s Mountain View headquarters before, we presume, it is released to the masses to enjoy.
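    As a rough illustration of the behavior described above – scan, recognize a supported free hotspot, connect, and complete the sign-in without user input – here is a small sketch. The network names and the connect/authenticate callbacks are placeholders; this is not Google’s app or Android’s Wi-Fi API:

```python
# Hypothetical sketch of an auto-connect flow for known free hotspots.
# SSIDs and the connect/authenticate hooks below are invented placeholders.
KNOWN_FREE_HOTSPOTS = {"Google Starbucks", "GoogleWiFi"}

def pick_hotspot(visible_ssids):
    """Return the first visible SSID we know how to authenticate against."""
    for ssid in visible_ssids:
        if ssid in KNOWN_FREE_HOTSPOTS:
            return ssid
    return None

def auto_connect(visible_ssids, connect, authenticate):
    """Join a recognized hotspot and run its sign-in step without user input."""
    ssid = pick_hotspot(visible_ssids)
    if ssid is None:
        return False
    connect(ssid)          # join the open network
    authenticate(ssid)     # e.g. complete the captive-portal sign-in
    return True

# Example with stubbed-out callbacks standing in for real radio/portal logic.
connected = auto_connect(
    ["HomeNetwork", "Google Starbucks"],
    connect=lambda ssid: print(f"Connecting to {ssid}"),
    authenticate=lambda ssid: print(f"Authenticating on {ssid}"),
)
print("Connected:", connected)
```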


  • Google Wi-Fi App Could Make Hotspot Access Easy original content from Ubergizmo.

        



    Google Wanted To Buy WhatsApp For $10 Billion [Report]


    By now we have all heard that Facebook has agreed in principle to buy WhatsApp, the most popular cross-platform messaging service out there, for an eye-watering sum of $19 billion in cash and stock. If the deal falls through for some reason, Facebook has promised a cash breakup fee of $1 billion. Since the announcement was made, many figures have been thrown around: an investor is set to rake in billions, while existing WhatsApp employees will get millions as their share of the $3 billion in restricted Facebook stock being handed out. There’s another number that’s equally interesting. Apparently Google was courting WhatsApp as well, and it even offered $10 billion to buy it.


  • Google Wanted To Buy WhatsApp For $10 Billion [Report] original content from Ubergizmo.