An eccentric, or perhaps just egocentric, Sheikh in Abu Dhabi has carved his name into the desert of a United Arab Emirates island he owns in giant 1,000-meter letters. And it just happens to be visible from space.
Olivetti’s no carioca. It’s a bona fide Italiano electronics company, but that didn’t stop its latest round of tablets from making an appearance south of the equator. Shown off at the 2011 Eletrolar Show in Brazil, the 10-inch Olipad 110 made its second video appearance to strut its sleek NVIDIA Tegra 2-processing, Honeycomb-operating stuff. The successor to the Olipad throne also brought its little brother, the Olipad 70, to the party — rocking a 7-inch capacitive display, Android 2.3 Gingerbread, WiFi and Bluetooth. Sadly, our penchant for bossa nova does not extend to Portuguese language fluency, so you’re on your own after the break.
What better way to combine your nerdy loves of computer programming and Star Wars than with a robot that can actually battle with a lightsaber?
This is “JediBot,” a Microsoft Kinect–controlled robot that can wield a foam sword (lightsaber, if you will) and duel a human combatant for command of the empire. Or something like that.
“We’ve all seen the Star Wars movies; they’re a lot of fun, and the sword fights are one of the most entertaining parts of it. So it seemed like it’d be cool to actually sword fight like that against a computerized opponent, like a Star Wars video game,” graduate student Ken Oslund says in the video above.
The world of dynamic robotics and AI has been immensely aided by the affordable, hackable Microsoft Kinect. The Kinect pairs an RGB camera with an infrared depth sensor, which makes recognizing, analyzing and interacting with a three-dimensional moving object, namely a human, much simpler than in the past. Microsoft recently released the SDK for the Kinect, so we should be seeing increasingly useful and creative applications of the device. The KUKA robotic arm in the video above is traditionally used in assembly-line manufacturing, but you may remember it from a Microsoft Halo: Reach light sculpture video last year.
According to the course overview (.pdf) for the “Experimental Robotics” course, the purpose of the laboratory-based class is “to provide hands-on experience with robotic manipulation.” Although the other groups in the class used a PUMA 560 industrial manipulator, the JediBot design team, composed of four graduate students including Tim Jenkins and Ken Oslund, got to use a more recently developed KUKA robotic arm. This final project for the course, which they got to choose themselves, was completed in a mere three weeks.
“The class is really open-ended,” Jenkins said. “The professor likes to have dynamic projects that involve action.”
The group knew they wanted to do something with computer vision so a person could interact with their robot. Given the resources available, the group chose a Microsoft Kinect over a conventional camera for that task. The Kinect is used to detect the position of the green foam saber wielded by JediBot's opponent.
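The article doesn't include the team's vision code, but a rough sketch of how one might pick out a green saber in the Kinect's color frame with OpenCV looks like this; the thresholds, function name and use of OpenCV are illustrative assumptions, not details from the Stanford project.

```cpp
// Hypothetical sketch: locate a green foam saber in a Kinect RGB frame.
// Not the JediBot team's code; thresholds and frame source are assumptions.
#include <opencv2/opencv.hpp>

// Returns the pixel centroid of the green region, or (-1, -1) if none is found.
cv::Point2f findGreenSaber(const cv::Mat& bgrFrame)
{
    cv::Mat hsv, mask;
    cv::cvtColor(bgrFrame, hsv, cv::COLOR_BGR2HSV);

    // Keep only strongly saturated green pixels (hue roughly 40-80 on OpenCV's 0-179 scale).
    cv::inRange(hsv, cv::Scalar(40, 80, 80), cv::Scalar(80, 255, 255), mask);

    cv::Moments m = cv::moments(mask, /*binaryImage=*/true);
    if (m.m00 < 1.0)                       // no green pixels detected
        return cv::Point2f(-1.0f, -1.0f);

    return cv::Point2f(static_cast<float>(m.m10 / m.m00),
                       static_cast<float>(m.m01 / m.m00));
}
```

In a full pipeline, that pixel centroid would be combined with the Kinect's depth map to give the arm a 3D target to track.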
The robot strikes using a set of predefined attack motions. When it detects a hit, meaning its foam lightsaber has come into contact with its opponent's and is putting torque on the robotic arm's joints, it recoils and moves on to the next motion. It switches from move to move every one or two seconds.
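As a rough illustration of that attack logic (not the team's actual code; the interface, torque threshold and timing below are hypothetical), the loop might look something like this:

```cpp
// Hypothetical sketch of the attack mode described above: cycle through
// predefined attack motions and recoil whenever joint torque spikes,
// i.e. the foam sabers have clashed.
#include <chrono>
#include <cmath>
#include <functional>
#include <vector>

struct ArmInterface {                                   // assumed robot/sensor interface
    std::function<void(int)>             commandAttack; // start predefined attack motion #i
    std::function<void()>                recoil;        // pull the saber back
    std::function<std::vector<double>()> jointTorques;  // current joint torques in Nm
};

// A torque spike on any joint is taken to mean the foam sabers have made contact.
bool saberContact(const std::vector<double>& torques, double thresholdNm)
{
    for (double t : torques)
        if (std::fabs(t) > thresholdNm)
            return true;
    return false;
}

void attackLoop(ArmInterface& arm, int numAttacks)
{
    using clock = std::chrono::steady_clock;
    const auto   switchEvery    = std::chrono::seconds(2); // swap attacks every ~2 s
    const double hitThresholdNm = 5.0;                      // illustrative value only

    for (int attack = 0; ; attack = (attack + 1) % numAttacks) {
        arm.commandAttack(attack);
        const auto started = clock::now();

        // Hold this swing until a hit is felt or the interval elapses.
        while (clock::now() - started < switchEvery) {
            if (saberContact(arm.jointTorques(), hitThresholdNm)) {
                arm.recoil();
                break;
            }
        }
    }
}
```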
“The defense mechanics were the most challenging, but people ended up enjoying the attack mode most. It was actually kind of a gimmick and only took a few hours to code up,” Jenkins said.
The project utilized a secret weapon not apparent in the video: a special set of C/C++ libraries developed by Stanford visiting entrepreneur and researcher Torsten Kroeger. Normally, the robot would need to plot out the entire trajectory of its motions from start to finish (preplanned motion). Kroeger's Reflexxes Motion Libraries let you make the robot react to events, like collisions and new data from the Kinect, by simply updating the target position and velocity; the libraries compute a new trajectory on the fly in less than a millisecond.
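For a sense of what that pattern looks like in code, here is a minimal sketch assuming the open-source Reflexxes Type II API; the article doesn't say which Reflexxes version or parameters the team used, so the joint limits and targets below are placeholders.

```cpp
// Minimal sketch of on-the-fly retargeting with the Reflexxes Type II Motion
// Library (an assumption; the article does not specify which version JediBot
// used). Each control cycle the target can be overwritten, and the library
// computes a fresh, smooth trajectory toward it in well under a millisecond.
#include <ReflexxesAPI.h>
#include <RMLPositionFlags.h>
#include <RMLPositionInputParameters.h>
#include <RMLPositionOutputParameters.h>

int main()
{
    const int    dofs      = 7;       // placeholder: a 7-joint arm
    const double cycleTime = 0.001;   // 1 ms control cycle

    ReflexxesAPI                rml(dofs, cycleTime);
    RMLPositionInputParameters  ip(dofs);
    RMLPositionOutputParameters op(dofs);
    RMLPositionFlags            flags;

    for (int j = 0; j < dofs; ++j) {
        ip.CurrentPositionVector->VecData[j]     = 0.0;  // placeholder joint state
        ip.CurrentVelocityVector->VecData[j]     = 0.0;
        ip.CurrentAccelerationVector->VecData[j] = 0.0;
        ip.MaxVelocityVector->VecData[j]         = 1.0;  // placeholder limits
        ip.MaxAccelerationVector->VecData[j]     = 2.0;
        ip.MaxJerkVector->VecData[j]             = 10.0;
        ip.TargetPositionVector->VecData[j]      = 0.5;  // where the saber should go
        ip.TargetVelocityVector->VecData[j]      = 0.0;
        ip.SelectionVector->VecData[j]           = true;
    }

    int result = ReflexxesAPI::RML_WORKING;
    while (result != ReflexxesAPI::RML_FINAL_STATE_REACHED) {
        // In JediBot, a new Kinect measurement or a detected clash would simply
        // overwrite TargetPositionVector here; Reflexxes replans from the
        // current state with no precomputed trajectory.
        result = rml.RMLPosition(ip, &op, flags);

        // Feed the computed state back as the next cycle's current state.
        *ip.CurrentPositionVector     = *op.NewPositionVector;
        *ip.CurrentVelocityVector     = *op.NewVelocityVector;
        *ip.CurrentAccelerationVector = *op.NewAccelerationVector;
    }
    return 0;
}
```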
This allows JediBot to respond to sensor events in real time, and that’s really the key to making robots more interactive.
Imagine a waiterbot with the reflexes to catch a falling drink before it hits the ground, or a karate robot you can spar against for practice before a big tournament.
I doubt anyone will be buying their own KUKA robotic arm and building a sword-fighting robot like JediBot at home, but innovations like this one, built on interactive controllers, and the availability of the Reflexxes Motion Libraries in particular for real-time physical responses, could help bring about robots that interact with us better in daily life.
Apple stopped loving the MacBook a long time ago. It was obvious to everyone, perhaps, but the MacBook. And now Apple’s decided to stop even pretending. The plastic MacBook is gone.
A mother rushes to comfort her sobbing child. Choking through the sadness, he explains: “It’s… it’s tablets, mama. They have so few ports.” That’s when the Toshiba product manager wakes up and gets back to finishing his work on the Thrive.
As part of a global recycling campaign, a humanoid-robot-looking creature is showing up at random Beijing post offices to inspire people to turn in their old handsets.