New NBA stats deal will put motion tracking cameras in every arena

NBA to use Stats LLC's tracking cameras for generating player data on the fly

The NBA faces a big challenge now that it offers all its player statistics to the public — how does it generate stats that hold the interest of basketball fans? The league’s solution is a multi-year agreement to use Stats LLC’s SportVU motion tracking system in every arena (15 teams had already implemented the technology on their own). As of the 2013-14 season, every NBA arena will have a six-camera setup that creates a steady stream of player data based on ball possession, distance, proximity and speed. The NBA’s website, NBA Game Time and NBA TV will all use the information to expand game stats beyond what we see today with heat maps and specific details on each possession. There’s no telling how useful that extra knowledge will be, but we won’t be shocked if it helps settle a few sports bar arguments.
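The distance and speed numbers come from straightforward geometry on the raw position stream. As a rough illustration of the idea (the 25 Hz sampling rate and foot-based court coordinates are our assumptions, not published SportVU specs):

```python
# Hypothetical sketch of deriving distance/speed stats from camera tracking
# data: player positions are sampled at a fixed rate, and per-frame
# displacements are summed. Sampling rate and units are assumed.
from math import hypot

SAMPLE_HZ = 25  # assumed frames per second of tracking data

def distance_and_speed(positions):
    """positions: list of (x, y) court coordinates in feet, one per frame.
    Returns (total distance traveled in feet, average speed in feet/sec)."""
    total = sum(
        hypot(x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(positions, positions[1:])
    )
    elapsed = (len(positions) - 1) / SAMPLE_HZ
    return total, (total / elapsed if elapsed else 0.0)
```

The same position stream, cross-referenced against ball location, is what makes possession and proximity stats possible.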

[Image credit: Rondo Estrello, Rondostar.com / Flickr]

Via: AP (Yahoo)

Source: NBA, Stats LLC

University of Tokyo’s fast-tracking camera system could revolutionize sports coverage (video)

Researchers at the University of Tokyo’s Ishikawa Oku Lab have been hard at work on a camera system that can track fast-moving objects incredibly well, and the technology may change the way sports like baseball and soccer are televised. Recently, the team building the system entered the next phase of testing: taking it outside to see if it will perform as well as it has in a lab setting. If all goes according to plan, they expect it’ll be ready for broadcast use in roughly two years.

Demos of the tech are pretty impressive, as you can see in the video below (warning: not recommended for those easily prone to motion sickness). To get the ping-pong-ball-centric shots, the system uses a group of lenses and two small mirrors that pan, tilt and move so the camera itself doesn’t have to. The mirrors rely on a speedy image tracking system that follows movement rather than predicting it. Swapping the camera out for a projector also has some interesting applications — it can paint digital pictures on whatever it’s tracking. Sounds like the perfect gadget for folks who wish their table tennis balls looked like emoji.
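The "follows movement rather than predicting it" approach can be pictured as a simple pursuit controller: each frame, the mirror is nudged a fraction of the way toward wherever the object currently appears. This is purely our illustrative sketch, not the lab's actual algorithm:

```python
# Illustrative sketch (not Ishikawa Oku Lab's real control law): the mirrors
# chase the tracked object's observed position each frame instead of
# extrapolating its trajectory, so one update is a proportional correction.

GAIN = 0.5  # fraction of the remaining error corrected per frame (assumed)

def step_mirror(mirror_angle, target_angle, gain=GAIN):
    """One control update: move the mirror part of the way toward the
    angle at which the object currently appears."""
    return mirror_angle + gain * (target_angle - mirror_angle)

def track(mirror_angle, target_angles):
    """Run the controller over a sequence of observed target angles."""
    history = []
    for target in target_angles:
        mirror_angle = step_mirror(mirror_angle, target)
        history.append(mirror_angle)
    return history
```

Because no prediction is involved, the approach only works when the update loop runs far faster than the object moves, which is exactly what the lab's high-speed tracking provides.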

Via: Diginfo

Source: Ishikawa Oku Laboratory

Eyes-on: MIT Media Lab’s Smarter Objects can map a user interface onto… anything (video)

While patrolling the halls of the CHI 2013 Human Factors in Computing conference in Paris, we spied a research project from MIT’s Media Lab called “Smarter Objects” that turns Minority Report tech on its head. The researchers figured out a way to map software functionality onto tangible objects like a radio, light switch or door lock through an iPad interface and a simple processor / WiFi transceiver in the object. Researcher Valentin Huen explains that “graphical user interfaces are perfect for modifying systems,” but operating them on a day-to-day basis is much easier using tangible objects.

To that end, the team developed an iPad app that uses motion tracking technology to “map” a user interface onto different parts of an object. The example we saw was a simple radio with a pair of dials and a speaker, and when the iPad’s camera was pointed at it, a circular interface along with a menu system popped up that cannily tracked the radio. From there, Huen mapped various songs onto different positions of the knob, allowing him to control his playlist by moving it — a simple, manual interface for selecting music. He was even able to activate a second speaker by drawing a line to it, then “cutting” the line to shut it off. We’re not sure when, or if, this kind of tech will ever make it into your house, but the demo we saw (see the pair of videos after the break) seemed impressively ready to go.
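The song-to-knob mapping Huen demonstrated amounts to dividing the knob's rotation into sectors and assigning one track to each. A toy version (the function name and equal-sector granularity are our assumptions):

```python
# Toy sketch of mapping a physical knob's angle, as read by the camera,
# onto entries in a playlist: the full rotation is split into equal
# sectors, one per song. Names and granularity are assumed.

def song_for_angle(angle_degrees, playlist):
    """Pick a playlist entry based on the knob angle seen by the camera."""
    sector = int((angle_degrees % 360) / (360 / len(playlist)))
    return playlist[sector]
```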

Sony reveals how the PlayStation 4 Eye works

PS4 Eye promises to unlock your PlayStation at a glance, tips hat to Kinect

Sony’s Shuhei Yoshida has dished the dirt on how the company’s latest camera accessory will work. The PlayStation 4 Eye comes with a pair of 1,280 x 800 cameras, four microphones and an 85-degree field of view. The two lenses are designed to be used in a variety of ways, including triangulating the 3D space, gesture recognition, Kinect-style body tracking, and in conjunction with accessories like the Wonderbook or DualShock 4 controller. “It’s not just a way to identify your player number, it also works like a PS Move,” Yoshida said of the new DualShock’s light bar. “It’s an extension of the PS Move technology that we incorporated into the DualShock so that the camera can see where it is.”

The Sony Studios chief used a PS Eye-style AR game as an example, saying that with the original camera, one lens had to do everything. With the new unit, one camera will concentrate on capturing the action and ensuring good picture quality, while the other is dedicated to motion tracking. Another reason the Move functionality was incorporated into the DualShock is to let the console know where you’re sitting in relation to the TV (and your on-screen character). The company is also aiming to let users take 3D pictures and video and store them on the console. As for the microphones in the new Eye and how they’ll affect interaction with the PlayStation 4, Yoshida wasn’t giving up many details. Though he said they’ll be incorporated into games (a la Kinect voice commands on the Xbox 360), he wouldn’t say whether you could use your voice to control the PlayStation 4 at a system level.
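The dual-lens "triangulating the 3D space" trick boils down to standard stereo geometry: a feature that appears shifted between the two views yields depth from its disparity. A textbook sketch (the focal length and baseline below are made-up numbers, not PS4 Eye specs):

```python
# Standard pinhole-stereo depth recovery for a pair of rectified cameras.
# The focal length (in pixels) and baseline (camera separation in meters)
# are invented for illustration; Sony hasn't published the Eye's optics.

def depth_from_disparity(disparity_px, focal_px=800.0, baseline_m=0.08):
    """depth = focal length * baseline / disparity.
    disparity_px: horizontal pixel offset of a feature between the views."""
    if disparity_px <= 0:
        raise ValueError("feature must appear shifted between the views")
    return focal_px * baseline_m / disparity_px
```

Note that nearer objects produce larger disparity, which is why a short-baseline camera pair works best at living-room distances.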

Ben Gilbert contributed to this report.

Leap Motion used for legitimate air drumming, authentic instrument control

Hacking a Kinect might get you access to an audible air guitar, but Stephane Berscot can do you one better — tweaking the pitch of a tangible axe via Leap Motion’s virtual workspace. Berscot configured a Leap to adjust his guitar’s pitch based on the instrument’s position over the device. That’s not all, either: the makeshift MIDI controller also functions as a keyboard equalizer and a set of functional air drums. Combining all three tricks scored Berscot a pretty mean demo track, but it’s apparently a lot harder than it looks. “It wasn’t easy to play drums with it,” he said, explaining that he had to detect beats by tracking the upward and downward velocity of the drumstick. “My method is pretty basic and still needs some work.” Even so, the demo definitely shows the device’s potential. Skip on past the break to see Berscot kick out the jams.
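Berscot calls his velocity-based method "pretty basic," and the simplest form of it is easy to picture: register a hit at the moment a fast downward stroke reverses into an upward one. This is our naive reconstruction, with an assumed threshold, not his actual code:

```python
# Naive sketch of velocity-based drum-hit detection (thresholds assumed):
# a beat is registered when the stick's vertical velocity flips from
# fast-downward to upward, i.e. the moment it "rebounds" off the drum.

HIT_SPEED = 0.5  # minimum downward speed (units/frame) to count as a stroke

def detect_beats(vertical_velocities):
    """Return the indices where a downward stroke reverses into an upward one."""
    beats = []
    for i in range(1, len(vertical_velocities)):
        prev, cur = vertical_velocities[i - 1], vertical_velocities[i]
        if prev <= -HIT_SPEED and cur > 0:
            beats.append(i)
    return beats
```

The weakness Berscot alludes to is visible even here: slow strokes below the threshold are missed, and jittery tracking can produce spurious reversals.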

Via: Make

Source: Gratoo (YouTube)

Extreme Reality’s Extreme Motion uses 2D webcams for 3D motion games (hands-on)

Extreme Reality‘s technology revolves around gestures, and its latest effort is to bring that movement to the masses: its Extreme Motion developer kit turns just about any off-the-shelf webcam or built-in camera on common platforms, including Android, iOS and Windows, into an almost Kinect-like system capable of tracking 3D motion. Despite missing depth cameras or other additional sensors, it’s theoretically quite accurate — the software tracks joints across the body in every frame, although it’s not quite so sensitive as to track fingers.

This author had the chance to make a fool of himself in front of a laptop’s camera to see how well Extreme Motion works. In short, reasonably well: while it wasn’t in perfect sync, it recognized our less-than-elegant moves in a Dance Central-style demo title and flagged whether a shimmy was right on target or evidence of two left feet. Of course, the experiment was conducted in a brightly lit hotel ballroom, where conditions for body detection are ideal, so take the results with a grain of salt. The system is still adept enough that developers with access to the (currently free) toolkit could produce motion games we’d be sincerely interested in playing.

Michael Gorman contributed to this report.

Source: Extreme Reality

Xsens teases wearable 3D body sensors that won’t cost, will track an arm and a leg (video)

When we think of full-body motion capture, we most often associate it with movie-grade equipment that demands a dedicated room, odd-looking suits and a corporate bank account to finance it all. Xsens hints that we may not have to rent a professional studio (or stand in front of a Kinect) to get complete body tracking for personal use. It’s planning to show a wearable, 3D-capable tracking system at CES that uses “consumer grade” MEMS sensors to monitor joint positions and movement — in other words, the kind of technology that might go into a phone’s accelerometer, just strapped to our arms and legs. Further details are scarce, although Xsens is pressing for uses in everything from fitness to gaming. We’d like to see partners line up so that there’s a product we can buy in a store. Until then, we’ll have to make do with the company’s skateboard-dominated teaser clip, which you can find after the break.

Source: Xsens

Mosoro releases its Bluetooth LE sensors and SDK for VIP appcessory developers

Since we last heard about Mosoro’s Lego-brick sized Bluetooth LE modules, they’ve changed their names, picked up another member and are now making their way to iOS app developers. The 3D-Motion’s got an accelerometer, gyroscope and magnetometer, while the Enviro measures temperature, humidity and barometric pressure. New to the team is Proximity, useful for triggering location-based apps and tracking motion for creating alerts. All three rechargeable Bluetooth low energy sensors have “shake-to-wake” support, an RGB “glow-cap” for notifications and a humble programmable button. They are expected to hit retail in fall 2012, but “VIP” app developers can grab them now, as well as the SDK which simplifies iOS Bluetooth integration. Got the ideas and inclination to become one of Mosoro’s “rock star app-developer partners?” Then go sign up on the website and see if you make the VIP grade.

Mosoro releases its Bluetooth LE sensors and SDK for VIP appcessory developers originally appeared on Engadget on Tue, 04 Sep 2012 21:09:00 EDT. Please see our terms for use of feeds.

Second Story uses Kinect for augmented shopping, tells us how much that doggie is in the window (video)

Second Story isn’t content to leave window shoppers guessing at whether or not they can afford that dress or buy it in mauve. A new project at the creative studio uses the combination of a Kinect for Windows sensor with a Planar LookThru transparent LCD enclosure to provide an augmented reality overlay for whatever passers-by see inside the box. The Microsoft peripheral’s face detection keeps the perspective accurate and (hopefully) entrances would-be customers. Coming from an outlet that specializes in bringing this sort of work to corporate clients, the potential for retail use is more than a little obvious, but not exclusive: the creators imagine it also applying to art galleries, museums and anywhere else that some context would come in handy. If it becomes a practical reality, we’re looking forward to Second Story’s project dissuading us from the occasional impulse luxury purchase.

Second Story uses Kinect for augmented shopping, tells us how much that doggie is in the window (video) originally appeared on Engadget on Thu, 26 Jul 2012 02:57:00 EDT.

Via: Next at Microsoft, The Next Web

Source: Second Story

RIM patent uses motion, CAPTCHAs to stop texting while driving, shows a fine appreciation of irony

More and more people understand that texting while driving is a bad idea, but RIM has just been granted a patent that would have smartphones step in before things get out of hand. Going beyond merely filtering inbound messages like some motion-based lockdown apps, the BlackBerry maker’s invention also blocks the creation of any outbound messages as long as the phone is moving within a given speed range. The override for the lock is the dictionary definition of ironic, however: the technique makes owners type out the answer to a CAPTCHA challenge onscreen, encouraging the very problem it’s meant to stop. As much as we could see the hassle being enough to deter some messaging-addicted drivers, we have a hunch that the minuscule hurdle is a primary reason why the 2009-era patent hasn’t found its way into a shipping BlackBerry. Maybe RIM should have chronic texters solve a Rubik’s Cube instead.
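The patent's logic as described above is simple to model: composition is disabled while the phone moves within a driving-speed window, and answering the CAPTCHA lifts the lock. The speed bounds and class API below are invented for illustration; they aren't taken from the patent text:

```python
# Rough model of the patented behavior (speed window and API are assumed):
# outbound message composition is blocked while the phone is moving within
# a "probably driving" speed range, and a correct CAPTCHA answer overrides
# the lock.

MIN_KMH, MAX_KMH = 15, 200  # assumed "probably driving" speed window

class TextingLock:
    def __init__(self):
        self.override = False

    def can_compose(self, speed_kmh):
        """Allow composing unless the phone is moving at driving speed."""
        if self.override:
            return True
        return not (MIN_KMH <= speed_kmh <= MAX_KMH)

    def answer_captcha(self, answer, expected):
        # Ironically, the unlock itself demands typing while in motion.
        self.override = (answer == expected)
        return self.override
```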

RIM patent uses motion, CAPTCHAs to stop texting while driving, shows a fine appreciation of irony originally appeared on Engadget on Wed, 11 Jul 2012 12:28:00 EDT.

Source: USPTO