Insert Coin: Linkbot modular robotic platform lets you quickly build a bot, skills

In Insert Coin, we look at an exciting new tech project that requires funding before it can hit production. If you’d like to pitch a project, please send us a tip with “Insert Coin” as the subject line.


Everybody loves robots, but the initial ardor for building one can quickly be snuffed out by the complex reality of actually programming it to do anything. That’s where Linkbot comes in, a new project from the Barobo team that brought us the Mobot. It’s designed as a modular system that can be expanded infinitely with accessories like a camera mount, gripper, and wheels, thanks to three separate mounting surfaces — which also have standard #6-32 screw attachment holes on the mounting plate to attach personality-enhancing cutouts. Despite the expansion potential, though, it can still be used right out of the box to do robotics without touching a lick of code. That’s thanks to several built-in modes like BumpConnect, which permits wireless connections between the modules by touching them together; and PoseTeach, to program complex motions by hand in a similar (but less time-consuming) manner to stop-motion animation techniques.

For those who want to step it up a notch, the system lets you go far past basic mech fun. The Linkbot itself has two rotating hubs with absolute encoding, along with an accelerometer, buzzer, multicolored LED and ZigBee wireless system with a 100m line-of-sight range. There are also optional breakout and Bluetooth boards to connect sensors like range finders, IR proximity sensors, photo detectors and thermostats. The outfit’s BaroboLink software for Mac, PC or Linux is included to program the Arduino-compatible bot in several languages as well, and can even translate previously created PoseTeach motions into computer routines. So far, the company has created working prototypes and even shipped them to local schools, so if you’re interested, you can pledge a minimum of $129 toward the company’s $40,000 target to grab one. That’ll net you a Linkbot, two wheels, the BaroboLink software, access to the MyBarobo community — and hopefully a jolt to your robotics confidence.
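If you’re curious what stepping up to code might look like, here’s a rough Python sketch of the idea: drive the two hubs to a list of recorded angles, PoseTeach-style. The LinkbotSim class and its method names are invented for illustration; Barobo’s actual BaroboLink bindings ship with their own API.

    # Illustrative sketch only -- the class below is a stand-in, not Barobo's real API.
    import time

    class LinkbotSim:
        """Toy stand-in for a two-hub, Linkbot-style module with absolute encoders."""

        def __init__(self):
            self.joints = [0.0, 0.0]  # absolute hub angles, in degrees

        def move_to(self, j1, j2):
            """Drive both hubs to absolute angles (instantly, in this toy simulation)."""
            self.joints = [j1 % 360, j2 % 360]
            print(f"hubs now at {self.joints[0]:.0f} deg / {self.joints[1]:.0f} deg")

    def replay_poses(bot, poses, dwell=0.5):
        """PoseTeach-style playback: step through hand-recorded hub angles."""
        for j1, j2 in poses:
            bot.move_to(j1, j2)
            time.sleep(dwell)

    if __name__ == "__main__":
        bot = LinkbotSim()
        # Angles you might have 'taught' by turning the hubs by hand.
        recorded = [(0, 0), (90, 45), (180, 90), (90, 45), (0, 0)]
        replay_poses(bot, recorded, dwell=0.2)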


Source: Kickstarter

Eyes-on: University of Pennsylvania’s TitanArm exoskeleton (video)


TitanArm already took home silver in a competition for senior projects at the University of Pennsylvania, and now the team behind it is visiting Orlando to compete in the Intel-sponsored Cornell Cup for embedded design. We stopped by the showroom and snagged a few minutes with the crew to take a look at their creation: an 18-pound, untethered, self-powered exoskeleton arm constructed for less than $2,000.

To wield the contraption, users attach the cable-driven mechanical appendage to themselves with straps from a military-grade hiking backpack, and guide it with a thumbstick on a nunchuck-like controller. If a load needs to be held in place, the wearer can jab a button on the hand-held control to apply a brake. A BeagleBone drives the logic for the setup, and it can stream data such as range of motion wirelessly to a computer. As for battery life, the group says the upper-body suit has previously squeezed out over 24 hours of use without having to recharge.
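To make that architecture a little more concrete, here’s a hypothetical Python sketch of the kind of loop a BeagleBone might run: read the thumbstick, latch the brake when the button is pressed, and emit a telemetry sample each cycle. The function names and limits below are assumptions for illustration, not the TitanArm team’s actual code.

    # Hypothetical control-loop sketch; not the TitanArm team's actual firmware.
    import json
    import random
    import time

    MAX_JOINT_SPEED = 30.0  # deg/s, an assumed limit for the elbow joint

    def read_thumbstick():
        """Stand-in for reading the nunchuck-style controller (-1.0 .. 1.0)."""
        return random.uniform(-1.0, 1.0)

    def brake_pressed():
        """Stand-in for the hand-held brake button."""
        return random.random() < 0.1

    def control_loop(cycles=10, dt=0.1):
        angle = 90.0  # current elbow angle, degrees
        for _ in range(cycles):
            if brake_pressed():
                velocity = 0.0  # brake applied: hold the load in place this cycle
            else:
                velocity = read_thumbstick() * MAX_JOINT_SPEED
            angle += velocity * dt
            # Telemetry a base station could log to track range of motion.
            print(json.dumps({"angle_deg": round(angle, 1),
                              "velocity_dps": round(velocity, 1)}))
            time.sleep(dt)

    if __name__ == "__main__":
        control_loop()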


Hexa drone is half-hexacopter, half-hexapod, 100% terrifying

When the robots finally come to harvest us, they’ll probably descend from the skies and then scuttle, spider-like, into our homes and shelters, just like MadLab Industries‘ terrifyingly ominous Hexa. The combined horror of a six-bladed hexacopter and a six-legged hexapod, the omnidirectional robot can either tackle terrain on foot or take to the air to avoid obstacles, then use its multipurpose legs as a grapple to snatch up objects (objects that, it has to be said, are roughly the size of a human baby’s head in MLI’s demo video).


The DIY ‘bot pairs a PhantomX Hexapod kit with a custom MLI hexacopter, using carbon-fiber and aluminum components to keep the weight down. In total, the whole thing tips the scales at 10.8 pounds, and is strong enough to transport not only its own weight, but also light objects it can grasp with its legs.

Possible future improvements could include the ability for the two sections to detach and be independently controlled, meaning Hexa could fly in, deposit the hexapod, and then fly back out again. That could eventually be useful for search & rescue operations, transporting Hexa-style hunting drones to a disaster area and then leaving them to rummage through the rubble for survivors.

The MLI team said back in December that, if demand was deemed sufficiently strong, it would consider Kickstarter for a Hexa kit. No word on what stage that project is up to, nor how much it might eventually cost.

Of course, right now there are human controllers in charge of Hexa, but AI research is doing its level best to cook up autonomous versions that are so ominous that even Google’s Eric Schmidt is calling for increased drone regulation. The situation is only likely to get more serious, too, with recent DARPA proposals suggesting potential funding for companies capable of delivering self-controlled flying gadgets.

[via Hack’n’Mod]



Driving Miss dAIsy: What Google’s self-driving cars see on the road

We’ve been hearing a lot about Google‘s self-driving car lately, and most of us are probably wondering how exactly the search giant built a car that can drive itself without hitting anything or anyone. A new photo has surfaced that demonstrates what Google’s self-driving vehicles see while they’re out on the town, and it looks rather frightening.


The image was tweeted by Idealab founder Bill Gross, along with a claim that the self-driving car collects almost 1GB of data every second (yes, every second). This data includes imagery of the car’s surroundings in order to effectively and safely navigate roads. The image shows that the car sees its surroundings through an infrared-like camera sensor, and it can even pick out people walking on the sidewalk.

Of course, 1GB of data every second isn’t too surprising when you consider that the car has to get a 360-degree image of its surroundings at all times. The image we see above even distinguishes different objects by color and shape. For instance, pedestrians are in bright green, cars are shaped like boxes, and the road is in dark blue.
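To put that figure in perspective, here’s a quick back-of-the-envelope calculation in Python, taking the 1GB-per-second claim at face value:

    # Back-of-the-envelope math for the claimed 1 GB/s sensor stream.
    GB_PER_SECOND = 1

    per_minute = GB_PER_SECOND * 60          # 60 GB per minute
    per_hour = per_minute * 60               # 3,600 GB per hour
    per_30_min_commute = per_minute * 30     # 1,800 GB for a half-hour drive

    print(f"{per_hour:,} GB/hour (~{per_hour / 1000:.1f} TB)")
    print(f"{per_30_min_commute:,} GB for a 30-minute commute")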

However, we’re not sure where this photo came from, so it could simply be a rendering of someone’s idea of what Google’s self-driving car sees. Either way, Google says we could see self-driving cars make their way onto public roads in the next five years or so, which really isn’t that far off, and Tesla Motors CEO Elon Musk is interested in developing them as well. They certainly don’t come without their problems, though, and we’re guessing that the first batch of self-driving cars probably won’t be in 100% tip-top shape.

[via BuzzFeed]



Japanese scientists build baseball-playing robot with artificial brain

Researchers and scientists at the University of Electro-Communications in Tokyo and the Okinawa Institute of Science and Technology have built a robot with quite the sports prowess, although you probably won’t see it take the field anytime soon. The robot is able to swing at and hit plastic balls, and can improve its swing over time.


The robot only stands a couple of feet tall, and it uses a giant flyswatter-like bat in order to make contact with the ball, so it essentially can’t hit like Alex Rodriguez, but maybe in the future the robot will give the all-star a run for his money. It features an artificial brain with the power of 100,000 neurons, which allows it to learn and improve its swing over time.

Here’s how the whole thing works: when a ball is pitched to the robot, an accelerometer behind the robot records information about the flight and speed of the ball, and this data is sent to a separate machine off to the side that holds the robot’s brain. The data gets processed, and the result tells the robot when to swing.

The impressive part is that if the speed of the ball changes, the robot can re-learn the swing all over again to try and hit the ball at the new speed. Hopefully the researchers will soon be able to give the robot a real bat instead of a giant flyswatter so it can hit real baseballs, but that kind of technology is probably still several years away.
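The underlying control problem is easy to sketch: estimate when the ball will arrive from its measured speed, swing at that moment, and nudge the timing after every miss. The toy Python example below illustrates that learn-and-adjust loop with made-up numbers; it is nothing like the actual 100,000-neuron model, just the general idea.

    # Toy swing-timing learner; a deliberate simplification, not the researchers' model.
    PLATE_DISTANCE_M = 2.0   # assumed pitch distance for this toy setup

    def time_to_plate(ball_speed_mps):
        """When the ball will arrive, given its measured speed."""
        return PLATE_DISTANCE_M / ball_speed_mps

    def train_swing(ball_speed_mps, swing_delay, rate=0.5, rounds=8):
        """Nudge the swing delay toward the ball's arrival time after each miss."""
        target = time_to_plate(ball_speed_mps)
        for pitch in range(1, rounds + 1):
            error = target - swing_delay       # positive means the swing was too early
            hit = abs(error) < 0.02            # within 20 ms counts as contact
            print(f"pitch {pitch}: delay={swing_delay:.3f}s "
                  f"error={error:+.3f}s {'HIT' if hit else 'miss'}")
            swing_delay += rate * error        # simple proportional correction
        return swing_delay

    if __name__ == "__main__":
        delay = train_swing(ball_speed_mps=8.0, swing_delay=0.10)
        # If the pitch speed changes, the same loop re-learns the new timing.
        train_swing(ball_speed_mps=5.0, swing_delay=delay)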

[via Wired]



Finally A Cute Robot That Doesn’t Have The Crazy Eyes

Wall-E has adorable droopy eyes and is totally compelling, but I have to remind everyone that he isn’t real. If you need a minute to let that sink in, take as much time as you need. We’re gonna talk about Romibo and you can catch up later.

Artificial sense of touch gets smarter, lets robots really feel


The verdict’s still out on whether or not androids dream of electric sheep. But their ability to feel? Well, that’s about to approach levels of human sensitivity. We’re of course talking about the sense of touch, not emotions. And thanks to work out of Georgia Tech, tactile sensitivity for robotics, more secure e-signatures and general human-machine interaction is about to get a great ol’ boost. Through the use of thousands of piezotronic transistors (i.e., grouped vertical zinc oxide nanowires) known as “taxels,” a three-person team led by Prof. Zhong Lin Wang has devised a way to translate motion into electronic signals. In other words, you’re looking at a future in which robotic hands interpret the nuances of a surface or gripped object akin to a human fingertip and artificial skin senses touch similar to the way tiny hairs on an arm do.

What’s more, the tech has use outside of robotics and can even be leveraged for more secure e-signature verification based on the speed and pressure of a user’s handwriting. And the best part? These sensors can be manufactured on transparent and flexible substrates like the one pictured above, which allows for various real-world applications — just use your imagination. Pretty soon, even robots will have the pleasure of enjoying the touch… the feel of cotton and maybe even hum that jingle to themselves, too.
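To get a feel for the e-signature angle, here’s a small illustrative Python example that compares the pressure-and-speed profile of a fresh signature against an enrolled one. The data format and threshold are assumptions for demonstration purposes, not Georgia Tech’s actual verification method.

    # Illustrative pressure/speed signature check; not the Georgia Tech method.
    import math

    # Each sample: (pen pressure in arbitrary taxel units, pen speed in mm/s)
    ENROLLED = [(0.8, 30), (1.2, 45), (0.9, 20), (1.1, 50)]

    def profile_distance(a, b):
        """Root-mean-square distance between two equal-length stroke profiles."""
        sq = [(pa - pb) ** 2 + ((sa - sb) / 50.0) ** 2
              for (pa, sa), (pb, sb) in zip(a, b)]
        return math.sqrt(sum(sq) / len(sq))

    def verify(candidate, enrolled=ENROLLED, threshold=0.3):
        return profile_distance(candidate, enrolled) < threshold

    if __name__ == "__main__":
        genuine = [(0.82, 32), (1.15, 44), (0.88, 22), (1.05, 48)]
        forged = [(0.5, 60), (0.6, 70), (0.5, 65), (0.6, 75)]  # right shape, wrong "feel"
        print("genuine accepted:", verify(genuine))
        print("forgery accepted:", verify(forged))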


Via: MIT Technology Review

Source: Georgia Tech, Science

Scientists create artificial skin capable of feeling

Robotics is an intense field of research all around the world as scientists attempt to create robots that are able to assist humans in all sorts of situations. One thing robots need if they’re going to assist humans in practical situations is the ability to feel an object. Knowing how hard to squeeze an object is something that humans take for granted.


For instance, we know how hard we can squeeze a fragile item, such as a glass, without breaking it. Without that sort of feedback, a robot could simply crush an item it is meant to handle safely. A group of scientists from the United States and China working together has created an experimental array that is able to sense pressure in the same range as the human fingertip.

The creation of the experimental array is a step forward in allowing robots and other machines to mimic the human sense of touch. The so-called “smart skin” is able to “feel” activity on its surface. The material is embedded with sensors that use bundles of vertical zinc oxide nanowires. The material also contains arrays consisting of about 8,000 transistors.

Each of those 8,000 transistors is capable of producing an electronic signal when placed under mechanical strain. These sensors are called taxels and promise sensitivity on par with the human fingertip. While there are other ways to give materials a sense of touch, the method developed by the researchers at the Georgia Institute of Technology relies on a different physical phenomenon: tiny polarization charges from piezoelectric materials, such as zinc oxide, that are produced when the material is moved or placed under strain. The scientists believe that the technology could be used in robotics, human-computer interfaces, and other areas where mechanical deformation is present.
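That grip problem boils down to a simple feedback rule: squeeze only until the skin reports the object is held, and stop well before the pressure that would crack a glass. The Python snippet below sketches that logic with invented numbers; it isn’t tied to the researchers’ hardware.

    # Sketch of pressure-feedback gripping with invented numbers.
    HOLD_PRESSURE = 0.4    # taxel reading that means "object is secure" (assumed)
    CRUSH_PRESSURE = 0.9   # reading at which a glass would crack (assumed)

    def read_taxel_pressure(grip_force):
        """Stand-in for the smart-skin readout: pressure rises with grip force."""
        return min(1.0, grip_force * 0.05)

    def close_gripper(step=0.5, max_force=20.0):
        force = 0.0
        while force < max_force:
            pressure = read_taxel_pressure(force)
            if pressure >= CRUSH_PRESSURE:
                raise RuntimeError("stopping: about to crush the object")
            if pressure >= HOLD_PRESSURE:
                print(f"holding at force {force:.1f} (pressure {pressure:.2f})")
                return force
            force += step
        raise RuntimeError("object never registered as held")

    if __name__ == "__main__":
        close_gripper()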

[via BBC]



New robots shed light on origins of sea turtles

Baby sea turtles have an interesting way of moving across sand and into the ocean, and scientists have been studying these little creatures for quite a while. So much so, in fact, that engineers are designing and building robots that replicate the movements of a baby sea turtle in order to better understand the origins of these animals.


The robot is called the FlipperBot, and it features two motor-driven flippers with flexible wrists that are similar to sea turtle wrists. The robot is designed to travel across malleable surfaces like sand, just like sea turtles, and these kinds of robots could help engineers further develop technology that will allow robots to swim through water as well as walk on land.

The FlipperBot is quite small, measuring in at about 7.5 inches long and weighing only two pounds. Scientists are using these kinds of newly-developed robots to better understand how turtle flippers work, as well as help researchers understand how sea turtles evolved to be able to walk on land, especially with limbs that were designed for swimming rather than walking.
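As a rough illustration of what a flipper gait controller might look like, here’s a toy Python sketch that alternates power strokes between the two flippers. The angles, names and the choice of an alternating gait are assumptions for this example, not FlipperBot’s actual control code.

    # Toy alternating-flipper gait; angles and names are invented for illustration.
    import time

    STROKE_DEG = 60      # how far each flipper sweeps per power stroke (assumed)
    WRIST_FLEX_DEG = 20  # wrist bend during the stroke (assumed)

    def stroke(side, cycle):
        """Print one flipper's power stroke; a real robot would command motors here."""
        print(f"cycle {cycle}: {side} flipper sweeps {STROKE_DEG} deg, "
              f"wrist flexes {WRIST_FLEX_DEG} deg, then recovers")

    def crawl(cycles=3, pause=0.1):
        for cycle in range(1, cycles + 1):
            stroke("left", cycle)
            time.sleep(pause)
            stroke("right", cycle)
            time.sleep(pause)

    if __name__ == "__main__":
        crawl()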

Daniel Goldman, a physicist at the Georgia Institute of Technology in Atlanta, says that these kinds of experiments will also work with other animals that have a long history. He says that he and his team are “working with paleontologists on studying what the first animals moving on land were like with more paleontologically realistic robots.” He notes that most animals likely encountered sand and mud, rather than concrete and hard rock, bringing up the question of how animals moved through these malleable substances.

[via Tech News Daily]



Lego Mindstorms EV3 set to invade classrooms

Lego isn’t all just about fun and games. While most kids love to play around with Star Wars Lego sets and craft their own creations out of the plastic connectors, the company wants to bring Legos into the classroom. Lego has announced that its new Mindstorms EV3 robotics sets will be available for classroom use on August 1.


The Lego Mindstorms sets are built to actively engage students and teach them about various fundamentals in the fields of science, technology, and engineering. The Lego sets come with digital workbooks, so teachers shouldn’t have a hard time learning about the new platform before handing them over to students.

We briefly got a look at the third-generation Mindstorms EV3 sets at CES 2013 back in January, and they essentially allow you to build different kinds of robots that you can control with an app on your iOS or Android device. The educational kits will come with software that will easily guide students through the process of building a Mindstorms robot.

Kits start at $340, and they’re available to pre-order right now. That may be a bit on the expensive side, and equipping an entire classroom with these sets would be quite costly, but as with most other educational tools, they should be able to hold up for a few years and pass through hundreds of students’ hands.
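For a rough sense of scale, here’s a quick ballpark in Python; the class size and sharing ratio are assumptions, not Lego’s guidance.

    # Ballpark classroom cost; class size and sharing ratio are assumptions.
    KIT_PRICE = 340        # USD, starting price quoted above
    STUDENTS = 30          # assumed class size
    STUDENTS_PER_KIT = 3   # assumed sharing ratio

    kits_needed = -(-STUDENTS // STUDENTS_PER_KIT)   # ceiling division -> 10 kits
    total = kits_needed * KIT_PRICE
    print(f"{kits_needed} kits x ${KIT_PRICE} = ${total:,}")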

