Hamilton robot capable of detecting and treating breast cancer

To many people, cancer might as well be a four-letter word, and some say it is a death sentence. Do you agree? I have had family and friends fall to this scourge over the years, and they do say that detecting cancer early is one of the most effective ways of stopping the disease in its tracks. Enter this Hamilton robot, which could change the way breast cancer is detected and treated early, and could do so within months. It is currently being tested on patients, and the inventor of the Hamilton robot, Dr. Mehran Anvari, said, “Women in Hamilton will be one of the first to have access to it. Its accuracy is extremely high. We hope it will enhance care.”

Hopefully it will be able to sniff out more than just breast cancer in the future as well. After all, there are many other kinds of cancer out there, and if the Hamilton robot could pick those up before they progress to the later stages, it would definitely be a case of catching them in the nick of time.


The Writer by Jaquet Droz: Getting to Know an Over-200-Year-Old Android

Europe in the mid to late 18th century was not as backwards as some people may believe. In fact, much of the developed world was at a point of incredible intellectual advancement. In Switzerland, for example, a man named Jaquet Droz and his team were building real, honest-to-HAL robots. They didn’t call them that back then, but these automaton androids were incredibly advanced even by today’s standards.

Most of them were built to dazzle the incredibly rich, who would then fork over money and commission their creators to build more such splendid automated objects. Highly advanced androids such as Jaquet Droz’s “The Writer” were never for sale, existing only to delight kings and high lords, yet many made their way into private collections, and one has been reanimated for the first time in Switzerland by the Swatch Group.

The Writer, shown above, is a fully mechanical little boy who sits at a desk and writes out complete phrases using a quill and ink. It was the inspiration for the machine in the movie Hugo. The head and eyes even move with the writing hand and, although in operation the machine sounds a bit like a typewriter, it’s an amazing feat of mechanical design. Two other existing Jaquet Droz automatons are called The Draftsman and The Pianist. They are on display in Switzerland and even at over 200 years old they still impress.


Chinese androids wear tracksuits, play sports, but not at the same time (video)


When we last caught up with the Beijing Institute’s family of bots, their abilities extended to slow (but pretty) tai chi moves. Returning three years later, we see that they’re coming along nicely: BHR-4 is still going through the old graceful routines, but now he’s wearing a human face and fetching sportswear to look like one of his creators. The 140-pound android beats certain Japanese alternatives by having both a fully actuated body and a face that can mimic emotions, like surprise and fear when someone tries to give it a decent haircut. Meanwhile, brother BHR-5 doesn’t bother with appearances, but instead has graduated to playing ping-pong in the hope of one day taking on rivals from Zhejiang University. He uses high-speed image processing and 32 degrees of freedom to pull off rallies of up to 200 shots, and he’ll do his utmost to impress you in the video after the break.
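For the curious, here is a rough idea of the kind of math that goes into those rallies: a minimal, hypothetical Python sketch (using numpy) that predicts where an incoming ball will cross the robot's paddle plane from a few tracked camera positions. It's a generic drag-free ballistic fit, not BHR-5's actual vision pipeline, and the coordinate layout, plane position, and launch numbers are all assumptions for illustration.

# A rough sketch of one piece of a ping-pong robot's perception loop:
# predicting where the ball will cross the robot's hitting plane from a
# handful of recently tracked 3D positions. Generic illustration only,
# not BHR-5's actual pipeline (which isn't described in detail).
import numpy as np

GRAVITY = 9.81          # m/s^2, acting along -z
HIT_PLANE_Y = 0.0       # the robot's paddle plane, y = 0 (assumed layout)

def predict_interception(times, positions):
    """Fit a drag-free ballistic model to tracked ball positions and
    return (time, point) where the ball reaches the hitting plane.

    times:     shape (N,) timestamps in seconds
    positions: shape (N, 3) ball centers in meters (x, y, z)
    """
    t = np.asarray(times, dtype=float)
    p = np.asarray(positions, dtype=float)

    # Remove known gravity from z so every axis becomes a straight line,
    # then fit position = p0 + v0 * t by least squares per axis.
    p_lin = p.copy()
    p_lin[:, 2] += 0.5 * GRAVITY * t**2
    A = np.stack([np.ones_like(t), t], axis=1)          # (N, 2)
    coeffs, *_ = np.linalg.lstsq(A, p_lin, rcond=None)  # rows: p0, v0
    p0, v0 = coeffs[0], coeffs[1]

    # The ball flies toward the robot along -y (assumption); solve y(t) = plane.
    if abs(v0[1]) < 1e-6:
        return None
    t_hit = (HIT_PLANE_Y - p0[1]) / v0[1]
    hit = p0 + v0 * t_hit
    hit[2] -= 0.5 * GRAVITY * t_hit**2                   # put gravity back
    return t_hit, hit

if __name__ == "__main__":
    # Synthetic track: a ball launched toward the robot at about 5 m/s.
    t = np.linspace(0.0, 0.12, 8)
    true_p0, true_v0 = np.array([0.3, 2.0, 0.4]), np.array([-0.5, -5.0, 1.0])
    pos = true_p0 + true_v0 * t[:, None]
    pos[:, 2] -= 0.5 * GRAVITY * t**2
    pos += np.random.normal(scale=0.003, size=pos.shape)  # camera noise
    t_hit, hit = predict_interception(t, pos)
    print(f"ball reaches the paddle plane at t={t_hit:.2f}s, point={hit.round(3)}")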

[Image and video credit: CCTV-4]




Burritob0t 3D Printer Outputs Delicious Tex-Mex

If you ask me, 3D printing is the future – of everything. I think that within the next two decades, we’ll be using 3D printing technology (or variants thereof) to manufacture everything from parts for cars to replacement organs for humans. And robots are already used widely in food production, so why not a 3D printer that can create lunch? That’s exactly what the guy behind the BurritoB0t has in mind.


The BurritoB0t is just what it sounds like: a robotic printer/extruder that can output burritos. Interactive designer/builder Marko Manriquez came up with the idea of a robot that can fabricate burritos after noticing the overlap between 3D printing (additive assembly and interchangeable ingredients) and burrito construction.


The BurritoB0t is designed to automatically create 3D-printed burritos by layering and extruding components to produce a fully customizable, edible Tex-Mex treat. Marko already has a prototype of the machine (tech details here), but it isn’t ready for prime time (or meal time) just yet. From the looks of the video clip below, there’s a plan to launch a Kickstarter campaign to raise funds to complete the BurritoB0t and feed hungry New Yorkers, but it doesn’t look like the fundraiser has started yet from what I can tell.

Whether or not the BurritoB0t ever sees the light of day at your local Taco Bell isn’t really important though. This is really just the tip of the iceberg in terms of robotic food construction. I can envision a day when 3D printing is fast enough that you’ll be able to dial up a recipe from your mobile device, and 3 minutes later, your fully-assembled meal will pop out of its tray. Now whether or not it tastes good… That’s a whole other question.


A Robot Walks Exactly Like a Human For the First Time [Video]

How do babies learn to walk? And how do victims of spinal cord injuries regain the ability to do so? We might soon have a bit more insight into these phenomena, because researchers at the University of Arizona have created the first pair of robotic legs that can walk in a biologically accurate way.

Eye muscle replicated by piezoelectric materials

Researchers have successfully replicated the muscle motion of the human eye, using piezoelectric materials to control camera systems, and this new discovery will be used to improve the operation of robots. How so? This new muscle-like action could eventually make robotic tools a whole lot safer to use, not to mention paving the way for more effective MRI-guided surgery as well as robotic rehabilitation.

Scientific advancements have always drawn inspiration from the natural world, and this new control system is no different. As a piezoelectric cellular actuator, it relies on biologically inspired technology that enables a robot eye to move about in a manner resembling an actual eye, although it most probably won’t land in the uncanny valley just yet. Perhaps that might change in time, who knows?


Scientists develop most advanced robotic legs yet

Scientists from the University of Arizona have developed the most accurate robotic replication of human legs yet, one that goes beyond merely human-like movement. The robot, which can walk just like a human, should help them understand how human babies learn to walk and how to better treat spinal cord injuries.

The robot uses motors that push and pull on Kevlar straps, which stand in for human leg muscles, allowing it to achieve nearly identical human-like movement. Even more amazing is the fact that the robot also has a computerized version of the central pattern generator (CPG), the spinal circuit that produces the rhythmic signals driving walking and processes feedback from the body. The CPG is what allows people to walk without thinking about walking, adjust the strain on leg muscles, evaluate load patterns from each foot, and so on.
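To make the CPG idea a bit more concrete, here is a minimal, hypothetical sketch of a "half-center" pattern generator: two coupled oscillators, one per leg, that settle into an alternating rhythm and get nudged by a crude stand-in for foot-load feedback. It is purely illustrative, not the Arizona team's actual controller, and all the gains and the feedback rule are made-up numbers.

# Minimal half-center CPG sketch (illustrative only; not the University of
# Arizona team's actual controller). Two phase oscillators are coupled so
# they settle into anti-phase, producing the alternating left/right rhythm
# that drives stepping. A toy load-feedback term nudges the phase, which is
# roughly how a CPG lets walking adapt without conscious control.
import math

DT = 0.01                    # integration step, seconds
OMEGA = 2 * math.pi * 1.0    # intrinsic stepping rate: 1 Hz
COUPLING = 2.0               # strength pulling the two legs toward anti-phase
FEEDBACK_GAIN = 0.5          # how strongly foot load shifts the rhythm (made up)

def step(phases, loads):
    """Advance the two leg oscillators by one time step.

    phases: [left_phase, right_phase] in radians
    loads:  [left_foot_load, right_foot_load] in [0, 1]
    """
    new = []
    for i in (0, 1):
        other = phases[1 - i]
        # Coupling term drives the phase difference toward pi (anti-phase).
        dphi = OMEGA + COUPLING * math.sin(other - phases[i] - math.pi)
        # Toy stance feedback: a loaded foot slows its own swing onset.
        dphi -= FEEDBACK_GAIN * loads[i] * math.sin(phases[i])
        new.append((phases[i] + dphi * DT) % (2 * math.pi))
    return new

def muscle_activation(phase):
    """Half-rectified rhythm: positive during the 'flexor' half of the cycle."""
    return max(0.0, math.sin(phase))

if __name__ == "__main__":
    phases = [0.0, 0.1]            # start nearly in phase on purpose
    for t in range(600):           # six simulated seconds
        loads = [muscle_activation(p) for p in phases]  # crude stand-in for foot sensors
        phases = step(phases, loads)
        if t % 50 == 0:
            print(f"t={t*DT:4.1f}s  left={muscle_activation(phases[0]):.2f}  "
                  f"right={muscle_activation(phases[1]):.2f}")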

Additional enhancements to the robot are expected in the future, including visual and tactile sensors as well as the ability to pick itself up after falling down. And even though robots have replicated human movement before, this is the first one to accurately replicate the underlying human control mechanisms that actually drive that movement.

[via BBC]




Scientists develop ‘most realistic’ robot legs ever

A group of US experts has laid claim to what they deem the most biologically accurate robotic legs to date. They describe the achievement in the Journal of Neural Engineering, where they claim the work could help us understand just how babies learn to walk, as well as aid in future spinal-injury treatments. Basically, these scientists managed to come up with their own version of the message system that generates the rhythmic muscle signals which control walking.

One UK expert was excited about the work, citing the fact that the robot is able to mimic control in addition to movement, which is unprecedented. The system controls the movement of its legs after gathering and processing information from the different parts of the body involved in walking, responding accordingly to the environment. In human terms, it means we are able to walk without having to think about doing so, something that even the most complex and powerful computers would still need to consciously process and make quite an effort to do.


Brain scanner links man to robot

The whole idea of using one’s brain to control a particular object is close to our hearts; after all, who does not want seemingly supernatural or superhuman powers? Well, it seems that scientists have figured out how one can use thought power alone to control a tiny humanoid robot. Not only that, we are not talking about a local connection here: the person controlling the robot was hooked up to a brain scanner in Israel, while the robot was located a continent away in France, moving around the laboratory.

This would help nudge us toward robot avatars representing us in the real world; maybe we could even send our robot avatars to the office while we recover at home from the flu! Of course, it also has practical applications: people who are paralyzed or have locked-in syndrome could get up and about with a greater degree of independence. Not only that, the neural link also allows you to view the world through the eyes of the robot. Future improvements would see the robot gain the ability of speech, letting it say whatever you say on the other end.


SHIRI robot butt actually fears, and responds with emotion

Strange, I was just watching Terminator 2 yesterday evening, and it was interesting to see a young punk of a John Connor teaching the T-800 the different nuances of human emotions and responses, even asking him to smile. Robots do not feel, as they are machines, which makes them efficient killing tools (or factory workers), but inventor Nobuhiro Takahashi has decided to inject a little bit of “humanity” into the world of robotics via his creation “SHIRI”, Japanese for “butt”: a robot rear end that can respond to different human touches with different emotions.

Takahashi has high hopes that this prototype could eventually help robots develop responses that can be applied to other parts of a robot’s body, with the face being his primary focus, in order to achieve a greater sense of realism where non-verbal communication is concerned. He started off with the butt because a bottom’s movements are large, which makes it easier to convey emotion.

In order to induce fear in the robot, Takahashi gives the robot butt a good spanking, to which it responds by quivering. Now, I wonder if pole-dancing robots would get quivering butts as a future feature…
