Faceware and Vicon, two makers of motion capture technology, have announced a new partnership that will bring their respective offerings together. For now, the partnership is bringing a simple merging …
You’re sitting still, right? Wrong. When it comes to how fast you’re moving right this instant, everything is relative.
Wouldn’t it be cool if birds left visible trails behind them, like jets tracing the sky with smoke? That’s exactly the effect Rhode Island School of Design professor Dennis Hlynsky achieves in his mesmerizing videos, posted today at This Is Colossal.
If your workouts never quite gel with your soundtrack, help may soon be at hand. The Guardian is reporting that Spotify has plans to measure heart rate and motion to help choose the perfect playlist for you in any situation.
If you’re looking to make music with your iPhone or iPod Touch, check out this add-on, which looks pretty neat. The AUUG Motion Synth is a combination grip/case for your iDevice, which lets you use your hand and motion to play music.
It’s supposed to let you make your music more naturally than with a touchscreen by itself, adding tactile edges for each of its on-screen controls so you don’t have to look at the screen while playing, and allowing single-handed play.
When used with its companion app, it will transform your device into an eight-button synthesizer. The instrument is controlled via the app’s keys as well as the motion of your hand. You can change pitch and tone this way, and the buttons will play notes.
The AUUG app isn’t a synthesizer itself, but it can be used to control other iOS audio apps like GarageBand, as well as external synth hardware. Here’s a look at AUUG being used to control Ableton Live, by sending MIDI data over Wi-Fi.
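The MIDI-over-Wi-Fi idea is simple at its core: note and controller events are just a few bytes each, pushed across the network to a host running the synth. As a toy illustration only (not AUUG's actual protocol, which presumably rides on Apple's network MIDI sessions rather than raw UDP), here is how a MIDI note-on message can be built and sent in Python; the receiver address and port are hypothetical:

```python
import socket

def note_on(note, velocity, channel=0):
    """Build a 3-byte MIDI note-on message: status byte (0x90 | channel),
    then note number and velocity, each masked to the 7-bit MIDI range."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

# Hypothetical receiver; a real setup would use an RTP-MIDI session.
msg = note_on(60, 100)  # middle C at velocity 100 -> b'\x90\x3c\x64'
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 5004))
sock.close()
```

The same three-byte pattern covers note-off and control-change messages, which is why even a phone-sized device can stream expressive control data with negligible bandwidth.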
The device is being funded via Kickstarter. You can get one if you pledge at least $68 (USD) by December 19th.
We all know that humanoid robots roaming the streets aren’t that far away, but there are still plenty of kinks to be worked out of the designs. The AMBER 2 Robot does its best to emulate human foot movements, with the goal of making a machine that can walk on all sorts of terrain.
The AMBER 2 Robot from Texas A&M’s AMBER Lab has almost all of the pivot points necessary to mimic human-like locomotion, which is very complex. You’ll note the stumble at the end of the video, which was intentional, meant to show that the boom only provided lateral stability.
I want to see when these kinds of legs will be integrated into a real walking robot. Hopefully, Skynet won’t use them to help exterminate us all.
Life moves fast. Sometimes it even feels like a blur. And for this week’s Shooting Challenge, we’re celebrating the speed of life.
Sixense’s Stem motion tracker may get Android and iOS support through stretch goal (video)
Sixense has so far promised only PC compatibility for its Stem motion tracker, but the company just teased us with the prospect of a wider ecosystem. It now says that Stem’s developer kit will support Android and iOS if the crowdfunded project reaches a new $700,000 stretch goal. Mobile devices linked to a Stem tracker could serve as motion controllers, virtual cameras and even head-mounted displays. As an incentive to make a pledge, Sixense is adding a pair of programmer-friendly pledge rewards: $149 gets a one-tracker bundle with no controllers, while an early five-tracker bundle has returned at a lower $299 price. Whether or not you chip in, you can watch a conceptual demo after the break.
Filed under: Gaming, Peripherals
Source: Kickstarter
iPhone 5s packs M7 motion-sensing chip, CoreMotion API for more accurate tracking
Apple’s new flagship iPhone 5s is about to have much more detailed information about how much its users are moving, thanks to a new M7 “motion co-processor.” Unveiled during today’s live event, it works alongside the new 64-bit A7 CPU to measure motion data continuously from the accelerometer, gyroscope and compass without draining the battery as heavily. It looks like the iPhone 5s will be ready to take over for hardware extras like the Fitbit or Nike Fuel wristband, but with a new CoreMotion API, devs for those companies and others can pull the information into their apps. The CoreMotion API specifically works to identify user movement, and offers “optimizations based on contextual awareness.” Overall, it’s very similar to what we’d heard would be in the Moto X, although we haven’t seen all of these extra sensors used for activity tracking quite in this way. Nike was on hand with a new Nike+ Move app that used the M7 and GPS to track users’ activities, and we wouldn’t be surprised if others follow closely behind. Nike called the Move app an “introductory experience” to Nike Fuel in a tweet, so maybe it’s planning to upsell customers on more detailed tracking with its hardware add-ons afterward.
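The kind of always-on classification the M7 enables, labeling motion as stationary, walking or running, can be sketched in toy form. To be clear, this is purely illustrative and not Apple's algorithm: CoreMotion's pipeline is proprietary, runs continuously on the co-processor, and is far more robust. The thresholds below are made-up numbers for demonstration:

```python
def classify_activity(samples):
    """Toy motion classifier: threshold the variance of accelerometer
    magnitude readings (in g) over a short window. Thresholds are
    illustrative only, not taken from any real device."""
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / len(samples)
    if var < 0.001:
        return "stationary"   # magnitude barely deviates from ~1 g
    elif var < 0.05:
        return "walking"      # moderate periodic variation
    return "running"          # large swings in acceleration

print(classify_activity([1.0, 1.0, 1.01, 0.99]))  # prints "stationary"
```

The point of putting this on a dedicated low-power chip is that the main CPU can sleep while windows like these are evaluated, which is why the M7 can log activity all day without the battery hit of polling the sensors from an app.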
Check out all the coverage at our iPhone ‘Special Event’ 2013 event hub!
The Daily Roundup for 07.22.2013
You might say the day is never really done in consumer technology news. Your workday, however, hopefully draws to a close at some point. This is the Daily Roundup on Engadget, a quick peek back at the top headlines for the past 24 hours — all handpicked by the editors here at the site. Click on through the break, and enjoy.