BurritoB0t 3D Printer Outputs Delicious Tex-Mex

If you ask me, 3D printing is the future – of everything. I think that within the next two decades, we’ll be using 3D printing technology (or variants thereof) to manufacture everything from parts for cars to replacement organs for humans. And robots are already used widely in food production, so why not a 3D printer that can create lunch? That’s exactly what the guy behind the BurritoB0t has in mind.


The BurritoB0t is just what it sounds like – a robotic printer/extruder that can output burritos. Interactive designer/builder Marko Manriquez came up with the idea of a robot that can fabricate burritos after realizing the overlap between 3D printing (additive assembly and interchangeable ingredients) and burrito construction.


The BurritoB0t is designed to automatically create 3D printed burritos by layering and extruding components to produce a fully customizable, edible Tex-Mex treat. Marko already has a prototype of the machine (tech details here), but it isn’t ready for prime time (or meal time) just yet. From the looks of this video clip below, there’s a plan to launch a Kickstarter campaign to raise funds to complete the BurritoB0t to feed hungry New Yorkers – but it doesn’t look like the fundraiser has started yet from what I can tell.
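To picture the layer-by-layer approach, here's a toy command planner. This is purely a sketch for illustration: the ingredient names, command strings, and pass counts are all invented, and the real BurritoB0t's control scheme surely differs.

```python
# Toy sketch of "burrito G-code": each ingredient becomes a layer
# extruded over the tortilla in several passes. All names here are
# made up for illustration, not BurritoB0t's actual commands.

DEFAULT_LAYERS = ["beans", "rice", "cheese", "salsa", "guacamole"]

def plan_burrito(layers=DEFAULT_LAYERS, passes_per_layer=3):
    """Turn an ordered ingredient list into extruder commands."""
    commands = ["HOME", "LOAD tortilla"]
    for layer in layers:
        commands.append(f"SELECT {layer}")
        for p in range(passes_per_layer):
            commands.append(f"EXTRUDE pass={p}")
    commands.append("ROLL")
    return commands

cmds = plan_burrito(["beans", "cheese"], passes_per_layer=2)
print(len(cmds))  # 2 setup + 2*(1 select + 2 extrude) + 1 roll = 9
```

Swapping the ingredient list is what makes the burrito "fully customizable" in the additive-assembly sense.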

Whether or not the BurritoB0t ever sees the light of day at your local Taco Bell isn’t really important though. This is really just the tip of the iceberg in terms of robotic food construction. I can envision a day when 3D printing is fast enough that you’ll be able to dial up a recipe from your mobile device, and 3 minutes later, your fully-assembled meal will pop out of its tray. Now whether or not it tastes good… That’s a whole other question.


Mahoro android automates hazardous lab work

Mahoro, co-developed by AIST and Yaskawa, is a general-purpose android used to automate certain lab tasks that previously had to be done by hand.
The robot can perform these tasks, such as dispensing and culturing, faster and more precisely than humans. It can therefore run clinical tests and work efficiently under biohazard conditions.
When Mahoro's precision in gene-amplification tests …

Georgia Tech scientists developing biology-inspired system to give robot eyes more human-like motion


Having difficulty getting your robot parts to work as planned? Turn to nature — or better yet, look inside yourself. After all, where better to find inspiration than the humans that the machines will one day enslave, right? Researchers at Georgia Tech have been working to develop a system for controlling cameras in robots that functions much like human muscle. Says Ph.D. candidate Joshua Schultz,

The actuators developed in our lab embody many properties in common with biological muscle, especially a cellular structure. Essentially, in the human eye muscles are controlled by neural impulses. Eventually, the actuators we are developing will be used to capture the kinematics and performance of the human eye.

The team recently showed off their work at the IEEE International Conference on Biomedical Robotics and Biomechatronics in Rome. They anticipate that, when fully developed, the piezoelectric system could be used for MRI-based surgery, rehabilitation, and research of the human eye.
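One way to picture a "cellular" actuator is as many small on/off units that get recruited in discrete numbers, much like motor units in biological muscle. The sketch below illustrates only that recruitment idea; the cell count and per-cell force are invented, and this is not the Georgia Tech team's actual controller.

```python
# Illustrative sketch (not Georgia Tech's controller): an actuator
# built from many identical on/off piezoelectric "cells", recruited
# like muscle fibers to approximate a desired force. Constants are
# invented for demonstration.

N_CELLS = 100          # hypothetical number of actuator cells
FORCE_PER_CELL = 0.01  # hypothetical force per cell, in newtons

def recruit(desired_force):
    """Return how many cells to switch on for a desired force.

    Like motor-unit recruitment, the output is quantized: the
    actuator can only produce integer multiples of one cell's force.
    """
    n = round(desired_force / FORCE_PER_CELL)
    return max(0, min(N_CELLS, n))

def output_force(n_active):
    """Total force produced by the currently active cells."""
    return n_active * FORCE_PER_CELL

# A smooth command gets approximated by discrete recruitment levels,
# and saturates once every cell is on.
for target in (0.0, 0.123, 0.5, 2.0):
    n = recruit(target)
    print(target, n, output_force(n))
```

The quantization is the interesting design trade-off: finer cells mean smoother force but more switching channels to drive.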

Georgia Tech scientists developing biology-inspired system to give robot eyes more human-like motion originally appeared on Engadget on Sat, 07 Jul 2012 04:12:00 EDT. Please see our terms for use of feeds.

Permalink | TG Daily | Source: Georgia Tech

Eye muscle replicated by piezoelectric materials

Researchers have successfully replicated the muscle motion of the human eye to control camera systems, thanks to the wonders of piezoelectric materials, and this new discovery will be used to improve the operation of robots. How so? This spanking new muscle-like action could eventually make robotic tools a whole lot safer to use, not to mention pave the way for more effective MRI-guided surgery as well as robotic rehabilitation.

Scientific advancements have always received inspiration from the natural world, and this new control system is no different. Being a piezoelectric cellular actuator, it relies on biologically inspired technology which enables a robot eye to move about in a manner that resembles an actual eye – although I am quite sure it will not land in the uncanny valley just yet. Perhaps that might change in time, who knows?

By Ubergizmo.

Mahoro lab android automates dangerous lab work

Mahoro, co-developed by AIST and Yaskawa, is a general-purpose android for automating lab work that previously had to be done manually.
The robot can do tasks, such as dispensing and culturing, faster and more precisely than people. So, it can do clinical tests and work with biohazards efficiently.
“For example, to develop influenza drugs, we do infection trials every day, using virulent strains of influenza. This work is very hazardous, so it should be done by robots. We also have to do …
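The appeal of a lab android is that a scripted protocol runs identically every run, with no human exposed to the hazard. Here's a minimal sketch of that idea; the step names and parameters are invented, since Mahoro's actual programming interface isn't described in the article.

```python
# Invented sketch of scripting a repetitive lab protocol for a robot
# like Mahoro. Step names and parameters are illustrative only.

PROTOCOL = [
    ("aspirate", {"volume_ul": 50, "source": "reagent_A"}),
    ("dispense", {"volume_ul": 50, "target": "plate_well_B3"}),
    ("incubate", {"minutes": 30}),
]

def run(protocol):
    """Execute each protocol step in order and return an audit log.

    A real controller would command arm motions and pipette hardware
    here; this sketch just records what would be done.
    """
    log = []
    for step, params in protocol:
        log.append(f"{step}: {params}")
    return log

for line in run(PROTOCOL):
    print(line)
```

The point of the audit log is the same as the robot's: exact repeatability, which is precisely what hazardous infection trials need.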

Robot Body Controlled By Human Thoughts Alone [Science]

For the first time, scientists have managed to use fMRI scans of a human to control the movements of a robot body. The link between man and machine allowed the researchers to control a robot in France from a brain scanner in Israel.

Robotic legs simulate our neural system, lurch along in the most human-like way so far


We’ve seen some pretty wonky bipedal robots before, but scientists at the University of Arizona have gone straight to the source — us — to make one with a more human-like saunter. It turns out it’s not just our skull-borne computer that controls gait: a simple neural network in the lumbar area of our spine, called the central pattern generator (CPG), also fires to provide the necessary rhythm. By creating a basic digital version of that and connecting some feedback sensors in the legs, the researchers produced a more natural human stride (without balance) — and on top of that it didn’t require the tricky processing used in other striding bots. Apparently this throws light on why babies can make that cute walking motion even before they toddle in earnest, since the necessary CPG system comes pre-installed from birth. That means the study could lead to new ways of stimulating that region to help those with spinal cord injuries re-learn to walk, and produce better, less complex walking robots to boot. Judging by the video, it’s a good start, but there’s still a ways to go before they can mimic us exactly — you can watch it after the break.
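The CPG idea is easy to sketch: one oscillator per leg, coupled so they settle half a cycle apart, with their phase mapped to muscle drive. The toy model below uses invented constants and a simple phase-oscillator form; it is an illustration of the rhythm-generation principle, not the Arizona team's network.

```python
import math

# Toy central pattern generator: two phase oscillators (left/right leg)
# coupled in anti-phase. Constants are illustrative, not from the study.

def cpg_step(phases, dt=0.01, freq=1.0, coupling=2.0):
    """Advance both oscillator phases by one Euler time step.

    Each oscillator runs at its natural frequency and is pulled toward
    being half a cycle (pi radians) out of phase with the other leg.
    """
    left, right = phases
    d_left = 2 * math.pi * freq + coupling * math.sin(right + math.pi - left)
    d_right = 2 * math.pi * freq + coupling * math.sin(left + math.pi - right)
    return (left + d_left * dt, right + d_right * dt)

def muscle_drive(phase):
    """Map an oscillator phase to a non-negative 'muscle' activation."""
    return max(0.0, math.sin(phase))

phases = (0.0, 1.0)  # deliberately start out of sync
for _ in range(5000):  # simulate 50 seconds
    phases = cpg_step(phases)

# After settling, the legs should be roughly half a cycle apart.
offset = (phases[1] - phases[0]) % (2 * math.pi)
print(round(offset / math.pi, 2))  # close to 1.0, i.e. pi radians apart
```

The anti-phase lock emerges from the coupling alone, which is the appeal: no central planner has to compute each footfall.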

Continue reading Robotic legs simulate our neural system, lurch along in the most human-like way so far

Robotic legs simulate our neural system, lurch along in the most human-like way so far originally appeared on Engadget on Fri, 06 Jul 2012 04:16:00 EDT. Please see our terms for use of feeds.

Permalink | Source: EurekaAlert!

Scientists develop ‘most realistic’ robot legs ever

A bunch of US experts have laid claim to what they deem the most biologically accurate robotic legs to date. They wrote about this particular achievement in the Journal of Neural Engineering, where they claimed that their work could help us understand just how babies learn to walk, as well as aid future spinal-injury treatments. Basically, these scientists managed to come up with their own version of the message system that generates the rhythmic muscle signals controlling walking.

One of the UK experts was excited about the work, citing the fact that this robot is able to mimic control in addition to movement, which is unprecedented. The system controls the movement of its legs after gathering and processing information from the different parts of the body involved in walking, responding accordingly to the environment. In human terms, this means we are able to walk without having to think about it – something that even the most complex and powerful computers would still need to consciously process with quite a bit of effort.

By Ubergizmo.

Brain scanner links man to robot

The whole idea of using one’s brains to control a particular object is close to our hearts – after all, who does not want seemingly supernatural or superhuman powers? Well, it seems that scientists have managed to figure out how one is able to use one’s thought power alone to control a tiny humanoid robot. Not only that, we are not talking about a local connection here – no sir, the person controlling the robot was hooked up to a brain scanner in Israel, while the robot was located a continent away in France, moving around the laboratory.

This would help nudge us in the direction of robot avatars representing us in the real world – maybe we can even send our robot avatars to the office while we recover at home from a flu! Of course, it also has practical applications, such as allowing those who are paralyzed or have locked-in syndrome to get up and about for a greater degree of independence. Not only that, the neural link also allows you to view the world through the eyes of the robot. Future improvements would see the robot gain the ability of speech – letting it say whatever you say on the other end.
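Conceptually, the loop is: decode an intent from brain activity, map it to a robot command, and send it over the network. The sketch below is a hedged stand-in where the "decoder" is just a lookup table; real fMRI decoding is a hard machine-learning problem, and the intent labels and command names here are invented.

```python
# Stand-in sketch of a brain-to-robot command loop. The decoder is a
# lookup table standing in for a real fMRI classifier; intent labels
# and command names are invented for illustration.

COMMANDS = {
    "imagine_left_hand": "TURN_LEFT",
    "imagine_right_hand": "TURN_RIGHT",
    "imagine_feet": "WALK_FORWARD",
}

def decode(scan_label):
    """Map a decoded mental-imagery label to a robot command.

    Anything the classifier cannot recognize maps to a safe STOP.
    """
    return COMMANDS.get(scan_label, "STOP")

def teleoperate(scan_stream):
    """Consume decoded intents and return the robot command sequence."""
    return [decode(scan) for scan in scan_stream]

commands = teleoperate(["imagine_feet", "imagine_left_hand", "rest"])
print(commands)  # ['WALK_FORWARD', 'TURN_LEFT', 'STOP']
```

Defaulting unknown states to STOP is the key safety choice when the "joystick" is a noisy brain scan on another continent.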

By Ubergizmo.

SHIRI robot butt actually feels fear, and responds with emotion

Strange, I was just watching Terminator 2 yesterday evening, and it was interesting to see the young punk John Connor teaching the T-800 different nuances of human emotions and responses, even asking him to smile. Robots do not feel, as they are machines, which makes them efficient killing tools (or factory workers), but inventor Nobuhiro Takahashi has decided to inject a little bit of “humanity” into the world of robotics via his creation “SHIRI” – Japanese for “butt” – a robot butt that can respond with different emotions to different human touches.

Takahashi has high hopes that this prototype could eventually help robots develop responses that can be applied to other parts of a robot’s body, with the face being his primary focus, in order to achieve a greater sense of realism where non-verbal communication is concerned. He started off with the butt because a bottom’s movements are large, making it easier to convey emotion.

In order to induce fear in the robot, Takahashi gives the robot butt a good spanking, to which it responds by quivering. Now, I wonder if pole dancing robots would get quivering butts as a future feature…
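The spank-to-quiver mapping can be caricatured in a few lines. This is illustrative only: the thresholds and reaction names are invented, and Takahashi's actual sensing and actuation are surely more involved.

```python
# Illustrative only: a crude mapping from touch intensity to a
# SHIRI-style reaction. Thresholds and reaction names are invented.

def respond_to_touch(pressure):
    """Map a normalized pressure reading (0..1) to a reaction."""
    if pressure < 0.2:
        return "idle"    # barely a touch: no visible response
    if pressure < 0.6:
        return "tense"   # a gentle touch: the cheek tightens slightly
    return "quiver"      # a spank: the fearful quivering response

for p in (0.05, 0.4, 0.9):
    print(p, respond_to_touch(p))
```

Scaling this up to a face, as Takahashi intends, mostly means many more sensors and a far richer reaction set than three thresholds.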

By Ubergizmo. Related articles: Brain scanner links man to robot , Robot to fill potholes and clean city roads? ,