Leap Motion Muse turns your hands into musical instruments

Leap Motion has announced a new music app for Mac users called Muse. The software is a music creation platform that allows users to create ambient …

Leap Motion Lays Off 10% Of Its Workforce After Missing On First Year Sales Estimates

Leap Motion won a lot of buzz early on for its motion controller, which is designed to let users interact with their computers through gestures alone. The early buzz and pre-order interest led to a lot of growth, with the company swelling to 120 employees at its peak. But disappointing reviews when the hardware actually shipped took some of the wind out of the startup’s…

Meet The First 10 Companies To Take Part In Founders Fund And SOSVentures’ LEAP Axlr8r


Late last year, a couple of venture firms sought to invest in LEAP Motion’s gesture control technology by helping developers to build businesses around it with an accelerator. Today, the LEAP Axlr8r is opening for business and announcing the first 10 participating companies in the program.

LEAP Motion has built an $80 hardware device that allows any user to control what’s happening on their computer through an interface that tracks the movement of their hands. It’s had more than 70,000 developers sign up to test out and build apps for the device, but few actual apps have been launched so far.
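To make "tracks the movement of their hands" concrete, here is a minimal, self-contained sketch of turning a stream of tracked palm positions into a swipe gesture. The data format and threshold here are illustrative assumptions for the example, not the actual Leap Motion SDK, which exposes much richer per-frame hand and finger data.

```python
# Illustrative sketch of gesture detection over tracked palm positions.
# The input format and threshold are assumptions made for this example;
# they are not the real Leap Motion API.

def detect_swipe(palm_x_positions, threshold=100.0):
    """Classify a horizontal swipe from successive palm x-coordinates (mm).

    Returns "swipe_right", "swipe_left", or None if the palm did not
    travel far enough to count as a deliberate gesture.
    """
    if len(palm_x_positions) < 2:
        return None
    delta = palm_x_positions[-1] - palm_x_positions[0]
    if delta > threshold:
        return "swipe_right"
    if delta < -threshold:
        return "swipe_left"
    return None

frames = [-80.0, -30.0, 20.0, 75.0]   # palm moving left to right
print(detect_swipe(frames))           # swipe_right
```

A real tracker would also smooth the signal and gate on hand velocity, but the core idea is the same: gestures are classified from short windows of tracked positions.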

The LEAP Axlr8r seeks to change that by taking LEAP Motion’s technology to the next level. With backing from Peter Thiel’s Founders Fund, as well as SOSVentures, the firm behind hardware accelerator HAXLR8R, the incubator sought out startups doing interesting things with the next-generation gesture control platform.

Like other incubators, LEAP Axlr8r provides participating companies with a small amount of funding — in this case, $25,000 — and puts them through a three-month program that is designed to refine the products and services they’re seeking to build. Housed near LEAP Motion headquarters in San Francisco, those companies will have access to the engineers who built LEAP Motion technology, as well as a number of mentors who can help with other aspects of the design process.

The whole thing ends in a Demo Day on May 9th. We’ll be tracking their progress and are looking forward to seeing what they release. The first 10 companies participating in the accelerator include:

  • MotionSavvy – Giving voice to the deaf and hard-of-hearing through real-time American Sign Language translation
  • Diplopia – Restoring depth perception for the 5% of the population affected by amblyopia (lazy eye) through virtual reality computer games using Oculus Rift and Leap Motion
  • Sterile Air – Creating the “Operating System” to enable a computerized, sterile surgical OR
  • LivePainter – Enabling real-time DJ-ing and VJ-ing as performance art via live web collaboration
  • Ten Ton Raygun – Gamifying physical rehabilitation therapy for stroke and other injuries to make rehab fun, faster, and measurable
  • Mirror Training – Making robots an extension of your own body using Leap Motion and video. A DARPA spinoff revolutionizing robotic arm control with a natural user interface and visual feedback for the user
  • GetVu – Creating a next-gen augmented reality platform that mixes computer vision with human vision in a wearable device
  • Illuminator 4D – Easily create interactive, holographic environments for retail and in-home usage
  • Crispy Driven Pixels – Reinventing 2D and 3D creative software through a new, natural user interface
  • Paralagames – Improving hand-eye coordination through games controlled by the hand

Leap Motion Expands Into Japan Market Via Exclusive SoftBank Partnership

There comes a time in the life of a company when it makes sense to work with partners from far afield, especially in this day and age, when the world is a global village. Leap Motion has decided to take the plunge into the Japanese market, and to do so without floundering, it has enlisted the exclusive help of SoftBank in a partnership. This means the Leap Motion controller will now have retail presence in 15 countries across five continents.

Leap Motion figured that leveraging SoftBank Group’s multi-channel reach as a leading telecommunications and internet corporation would be a good place to start – after all, why reinvent the wheel when existing channels are already in place? Leap Motion’s technology already enables people to use finger and hand movements in the air to play, create and explore in more natural and dynamic ways on computers and other devices, and Japan is a key market for the company. After all, Leap Motion has received its fair share of interest from Japanese developers and consumers, making Japan the third-largest market for web sales of the Leap Motion Controller. Does this mean that we are in for some very exciting times?

  • Leap Motion Expands Into Japan Market Via Exclusive SoftBank Partnership original content from Ubergizmo.

        



    Leap Motion Gesture Control Could Debut On Smartphones And Tablets Late Next Year


    Back in March, Leap Motion released a controller for conventional computers. The controller can detect up to ten fingers and lets users control their computers with gestures. Since the initial launch, Leap Motion has gained a lot of popularity; its technology has even shipped built into HP computers. The next frontier for this gesture control technology is smartphones and tablets, and that’s exactly what the company wants to achieve by the third quarter of 2014.

    CEO Michael Buckwald revealed to TNW at a briefing in London that the company has overcome the hardware and technical challenges of incorporating the technology into devices with smaller form factors, such as smartphones and tablets. While there’s no concrete roadmap for the products likely to ship with Leap Motion technology, Buckwald says the company expects tablets and phones on the market by Q3 or Q4 2014. He sees Leap Motion in TVs, head-mounted displays and “even things like cars” in the future. It remains to be seen which manufacturers and OEMs will team up with the company to integrate its 3D gesture control technology into their mobile devices; hopefully we’ll hear more about potential partners in the next few months.

  • Leap Motion Gesture Control Could Debut On Smartphones And Tablets Late Next Year original content from Ubergizmo.

        



    HP Leap Motion integration expands to desktops and all-in-ones

    Supposing you’re in the mood for controlling your computer with a wave of your hand this upcoming winter season, HP and Leap Motion have today suggested that you’re in luck. They’ve made clear that they’re expanding beyond their original integration of Leap Motion control in the HP Envy 17 and are moving on […]

    Elliptic Labs Launches Android SDK For Its Ultrasound-Powered Mid-Air Gesture Tech – Phones With ‘Touchless’ UI Landing In 2H 2014

    Elliptic Labs

    Elliptic Labs, a startup founded back in 2006 that uses ultrasound technology to enable touchless, gesture-based interfaces, has finally pushed its tech into smartphones. It has been demoing this at the CEATEC conference in Japan this week (a demo of Elliptic’s tech running on a tablet can also be seen in this TC video, from May), but today it’s announcing the launch of its first SDK for Android smartphones.

    Elliptic’s technology is able to work with any ARM-based smartphone, confirmed CTO Haakon Bryhni in an interview with TechCrunch. “That is completely new to us, that we’re able to make the technology available on a low-powered platform,” he said. “A major part of our technology development for the past half year has been to optimise our algorithms for smartphone use.”

    Gesture-based user interfaces which turn mid-air hand movements into UI commands have pushed their way into console-based gaming, thanks to Microsoft’s Kinect peripheral, and also mainstream computing via the likes of the Leap Motion device and webcam-based alternatives. Mobiles haven’t been entirely untouched by ‘touchless’ interfaces — Samsung added limited mid-air gesture support to the Galaxy S4 earlier this year, for instance (and back in 2009 the now defunct Sony Ericsson tried its hand at motion-sensitive mobile gaming) — but most current-gen smartphones don’t have the ability to respond to mid-air swiping.

    That’s set to change in 2014, as Elliptic Labs is currently working with several Android OEMs that are building devices that will include support for a gesture-based interface. Bryhni would not confirm the exact companies but said he expects several gesture-supporting mobile devices to hit the market in the second half of next year.

    “We are currently working very closely with three OEMs, in advanced prototyping stages with the objective of getting our technology into handsets — one tablet and two smartphone manufacturers,” he told TechCrunch. “We are also talking to some laptop manufacturers. But it is the smartphone and tablet vendor that are the most aggressive.”

    As well as increased numbers of mobile devices packing gesture support next year, the technology is going to get more powerful thanks to the capabilities of ultrasound, according to Bryhni. He argues that the Galaxy S4’s gesture support is more limited because it’s powered by an infrared sensor, which requires the user to be relatively close for it to function.

    By contrast, Elliptic’s embedded ultrasound tech (which basically consists of microphones and a transducer, plus the software) can support gestures within a 180-degree sphere — in front of and around the edges of a phone, and at distances that could be customised by the user — allowing a range of “natural gestures” to be used to control the UI, interact with apps or play games.
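    One simple way to picture that 180-degree zone is as a hemisphere over the device: anything on or above the plane of the screen, out to some maximum range, is detectable. A toy sketch follows; the coordinates, units and range are assumptions made for illustration, not Elliptic Labs' actual specifications.

```python
import math

# Toy model of a hemispherical detection zone over a device lying flat.
# The coordinate system (device at the origin, z pointing up out of the
# screen) and the 50 cm range are illustrative assumptions only.

def in_detection_zone(x, y, z, max_range_cm=50.0):
    """True if a point (in cm) lies within the hemisphere above the device."""
    if z < 0:
        return False                 # behind the screen: not detectable
    return math.sqrt(x * x + y * y + z * z) <= max_range_cm

print(in_detection_zone(10.0, 0.0, 20.0))   # hand above the device
print(in_detection_zone(0.0, 0.0, -5.0))    # behind the device
```

    Note that a point near the plane but beyond the screen's edge (large x or y, small z) still falls inside the hemisphere, which matches the "around the edges of a phone" behaviour described above.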

    Ultrasound’s Backers & Challengers

    Ultrasound also contrasts favourably with camera-based gesture technologies like Leap Motion, according to Bryhni, which require the user to perform their hand movements within a relatively narrow “cone” where the camera can see them. “If you put cameras onto the screen — let’s say integrated into the bezels — then you need to hold your hand at 90 degrees so it’s super inconvenient,” he said, discussing the drawbacks of using camera-based systems to enable gestures on mobile devices. “The benefit with our technology is it works with the sensor placed flat and invisible, hidden within the bezel of the screen.

    “Using ultrasound enables a very natural way of gesturing. And also the big benefit that we can work on a smartphone and a tablet, and we’re not dependent on any high powered lights or cameras.”

    “Ultrasound uses a fraction of power in comparison to optical 3D technologies.  Even in low light or in the dark, you can use the same natural hand movements you use every day,” added Elliptic Labs CEO Laila Danielsen in a statement. “With our software SDK we are giving smartphone manufacturers a way to easily and cost effectively include consumer-friendly touchless gesturing into their phones.”

    Elliptic is not the only company looking at using ultrasound to extend user interfaces in new ways. Chipmaker Qualcomm acquired digital ultrasound company EPOS last November — perhaps with a view to pushing ultrasound tech into styluses, which would allow for a nearby mobile device to detect the position of the pen and pick up notes being made on a paper notepad, for instance. Qualcomm is also evidently interested in how ultrasound can be used to support gesture interfaces on mobile devices.

    In terms of competing with Qualcomm, Bryhni argues that the EPOS pen-tracking technology Qualcomm acquired is different from what Elliptic Labs has been focused on. “We’ve been dedicated on gesture recognition for eight years. We’ve seen this coming,” he said, adding: “We have the time and expertise in the market.” He also points out that Elliptic offers device makers who build their own processors — as Samsung and Apple do, for instance — an alternative to having to buy Qualcomm chips. “Our customers are quite interested in having an independent chipset for gesture-recognition technology,” he added. “The vendors tend to like that flexibility.”

    Another area of flexibility is that Elliptic has made its technology available within an off-board DSP — the Wolfson 5110 — which allows an OEM to create a device that supports gesture controls even when the phone’s main processor is sleeping (i.e. so that a gesture interface does not compromise other power-efficiency technologies that help improve battery longevity on a mobile device). “A trend in modern smartphones and tablets is you offload some of the heavy signal processing to a dedicated DSP,” he said. “We have done that at this point… with a very high powered and super small DSP.”

    Gesture Injection For Existing Apps

    As well as today’s Android SDK, which lets developers and OEMs build new software that takes full advantage of its ultrasound tech’s capabilities, Elliptic Labs is offering the ability to ‘retro-fit’ the tech to existing applications. It’s calling this ability to map mid-air gestures to existing apps “gesture injection”.

    “For example if you wave left to right you create an arrow left event. If you swipe from the top of the screen and down you generate a close application event, for instance. And if you detect a gesture coming in from the right into the screen we for instance engage a menu, so in this way a legacy game such as Fruit Ninja… [can be gesture-mapped],” said Bryhni.

    “It’s much more fun slashing fruits in the air than swiping on the screen,” he added.
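    Stripped to its essence, the “gesture injection” Bryhni describes is a lookup from recognised mid-air gestures to the input events a legacy app already handles. A minimal sketch of the idea follows; the gesture and event names are hypothetical, invented for this example, and are not Elliptic Labs’ actual SDK.

```python
# Hypothetical sketch of "gesture injection": translating recognised
# mid-air gestures into synthetic input events an unmodified app already
# understands. All gesture and event names here are invented.

GESTURE_TO_EVENT = {
    "wave_left_to_right": "KEY_ARROW_LEFT",   # wave left to right -> arrow-left
    "swipe_top_to_bottom": "APP_CLOSE",       # swipe down from the top -> close app
    "swipe_in_from_right": "MENU_OPEN",       # enter from the right edge -> open menu
}

def inject(gesture):
    """Map a recognised gesture to a synthetic UI event, or None if unmapped."""
    return GESTURE_TO_EVENT.get(gesture)

print(inject("wave_left_to_right"))   # KEY_ARROW_LEFT
print(inject("pinch"))                # None
```

    Unmapped gestures fall through to None, so a legacy app never receives an event it does not already know how to handle — which is what lets a game like Fruit Ninja work without modification.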

    Fruit Ninja mid-air swipes certainly sound fun but that’s just one application. Does the mobile space generally need gesture-based interfaces? As noted above, OEMs have dabbled here already with relatively uninspiring results. Smartphone touchscreens continue to engage their users with evolving on-screen gestures. So off-screen gestures are likely going to need some killer apps to get the users fired up — something that makes mid-air finger wiggling as cool as pinch to zoom was, when that first aired. But what are those gesture-powered apps going to be?

    Killer Apps: From Comms To Cars & Watches

    Bryhni sees two main use-cases for gesture-based interfaces on mobiles: first, controlling the UI, so things like changing apps, engaging menus, browsing up and down, and selecting images; and second, custom applications, such as games or mapping apps, or switching between productivity apps. He also sees potential for the tech to allow our devices to pick up on some of the unspoken communication conveyed by hand gestures and body language.

    “If you watch people communicate a good fraction of their communication is actually gestures… So gestures are actually quite an important part of expressing yourself and we think computers should detect this and include it in the general user interface,” he said. “It’s been a major change in smartphones when the touch panel was invented… and we believe that new user interfaces that can make it more natural to interact with your device actually has the potential to… strongly influence the market.”

    But perhaps the biggest pull on the technology in the mobile space at least is the need for Android OEMs to add something different to their devices so they can stand out from each other and the rest of the industry. “I would say the ones that really need this are the OEMs,” Bryhni added. “They have a very strong need to differentiate themselves.”

    Asian mobile makers are likely to continue to be at the fore of smartphone-based gesture interfaces, according to Bryhni. “We are a European and American company but the Asians are quite aggressive when it comes to introducing new technology,” he said, noting that Elliptic, which has offices in Norway and Silicon Valley, will be opening an office in the region soon, to support Asian OEMs.

    “It should be our turf,” he added, discussing how innovation is shaking out in the smartphone space, with Asia leading the charge when it comes to pushing new technologies into devices. “They are more willing to try. [The U.S. and Europe] can’t afford to let Asians completely rule this business.”

    Moving beyond mobile, Bryhni said he sees potential for ultrasound-powered gestures to elbow their way into cars — as a hands-free way to command in-car entertainment or navigation systems, for instance. “Car applications is a use-case that we are pursuing. We are working with automotive manufacturers to put ultrasonic touchless gesturing into cars. The automotive use case is highly relevant because it works in the dark and in changing light situations (such as when you are driving with the sun coming into a window or at night),” he said.

    Asked specifically about smartwatches, which have obvious screen-size constraints and could therefore benefit from a gesture-based interface that doesn’t require the user’s fingers to block on-screen content, he said it would certainly be possible to mount the tech in a wrist-based mobile device.

    “The process is feasible to make this happen and that is something we are envisioning but we are not actively working with anyone at present,” he told TechCrunch. “It’s an opportunity that could work because nobody has looked into it before because of power consumption. With our technology, it becomes feasible to command a new smart-watch and make it touchless. It could be a distinctive new feature that could differentiate one vendor from another.”

    How Gesture Control Actually Works

    Gesture control sure is cool, even if it’s still a little bit of a gimmick. But how the hell does it actually work?



        



    Don’t miss Wikimedia, OLPC, Leap Motion, Voltaic and more at Expand NY!


    We’re getting more and more impatient waiting for Expand New York with every subsequent speaker announcement — and we’ve got five more names to lay on you right now. This November, we’ll be joined by Wikimedia’s director of mobile, Tomasz Finc; Leap Motion’s director of developer relations, Avinash Dabir; the One Laptop Per Child Association’s chairman and CEO, Rodrigo Arboleda; Voltaic Systems founder and CEO Shayne McQuade; and Michael Carroll, a professor of law at American University Washington College of Law and founding member of Creative Commons.

    And, of course, we’ve already announced a number of folks who will be joining us on November 9th and 10th, including LeVar Burton, Reggie Watts, Ben Heck, Peter Molyneux, Ben Huh and speakers from companies like Google, Sony, Pebble, Adafruit and The Electronic Frontier Foundation — and we’ve still got more to come. Check out the full list below.


    Source: Engadget Expand

    First HP Computer With Embedded Leap Motion Tech Will Ship This Fall For $1,049.99


    Hardware startup Leap Motion, which managed to sell an impressive number of pre-orders for its standalone Leap Motion Controller gesture control computer accessory, today announced that the first fruits of its OEM partnership with PC-maker HP will hit shelves this Fall. The HP ENVY17 Leap Motion SE is the first shipping computer to build the startup’s tech directly in, and features a new embedded Leap Motion sensor that dramatically reduces size vs. previous embedded designs.

    “We have a new, very small embedded module, which is about 70 percent thinner than the existing components in the Leap Motion,” Leap Motion CEO and co-founder Michael Buckwald said in an interview. “But it also has the same performance as the existing motion controller. HP is the first OEM to embed this new model sensor into a device.”

    The smaller embedded sensor will be placed in the base of the computer, next to the trackpad, Buckwald says. The new sensor design not only makes that placement possible, but also makes it feasible to build sensors into tablets and the smallest, slimmest laptop designs. That’s in keeping with Leap Motion’s larger aim as a company.

    “Our goal is for the technology to be in as many devices as possible and to sort of disappear,” he explained. “Obviously we love the [Leap Motion Controller] and it’s been very successful, but it’s also great to see consumers have other ways to use the technology and obviously this makes it easier for someone to always have it with them, and makes it much more portable.”

    The standalone Leap Motion Controller may have sold in considerable numbers via pre-orders, but reviews for the device were less than enthusiastic. My own experience with the hardware definitely left a lot to be desired, but Buckwald says the company isn’t focused on replacing the keyboard and mouse, but on providing another input option better suited to software specifically designed for it.

    HP will bundle the ENVY17 Leap Motion SE with Airspace, Leap Motion’s app store software, as well as select pre-installed applications exclusive to the computer maker. Leap Motion has also had interest from other OEMs, Buckwald says, as more computer makers look for ways to differentiate, and it plans to continue expanding its retail presence with new partners, especially internationally. Some 52 percent of Leap Motion pre-order sales were to customers outside the U.S., and Buckwald sees strong demand for its tech abroad.