Nanotech research could fit 10 trillion bits of data onto a disk the size of a quarter

Two researchers, Ting Xu and Thomas Russell, are in the midst of developing some potentially sweet nanotech that could allow storage of around 10.5 terabits (over 10 trillion bits) of data in a space the size of a quarter. The technique starts with a crystal (sapphire or silicon) sliced at a jagged angle, which is then heated to 2,700 degrees Fahrenheit, causing it to reorganize itself into a sawtooth pattern with three-nanometer features. The crystal is then sprayed with a custom polymer, dried, and treated again with a different solvent, after which the polymer settles into a hexagonal pattern on the surface of the crystal. Sound complicated? Well, it is, and all the kinks haven't quite been worked out, but the technique essentially provides a path to creating a self-assembling disk with far more storage capacity than anything currently available. The current state of the research will be detailed in an upcoming issue of Science magazine. We'll believe it when we see it, but keep up the good work, guys!
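For a rough sense of what that claim implies, here's a quick back-of-the-envelope sketch in Python; the quarter's roughly 24.26 mm diameter is our own assumption for illustration, not a figure from the article:

```python
import math

# Hypothetical back-of-the-envelope check of the claimed density.
# Assumption: a US quarter is about 24.26 mm across (not stated in the article).
bits = 10.5e12                      # ~10.5 terabits, per the researchers' claim
radius_cm = 2.426 / 2               # quarter diameter in cm, halved
area_cm2 = math.pi * radius_cm**2   # ~4.6 cm^2 of surface

bits_per_cm2 = bits / area_cm2
bits_per_in2 = bits_per_cm2 * 6.4516  # 1 in^2 = 6.4516 cm^2

print(f"~{bits_per_cm2/1e12:.1f} terabits per cm^2")  # roughly 2.3 Tb/cm^2
print(f"~{bits_per_in2/1e12:.1f} terabits per in^2")  # roughly 14.7 Tb/in^2
```

However you slice it, that areal density is well beyond any drive you could buy at the time, which is the whole point of the claim.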

Nanotech research could fit 10 trillion bits of data onto a disk the size of a quarter originally appeared on Engadget on Fri, 20 Feb 2009 11:54:00 EST.

Nokia signs €500 million loan for Symbian R&D

You’d think a company like Nokia could just finance whatever it wanted, but just to be safe, it’s signing a loan agreement with the European Investment Bank (EIB) to the tune of €500 million ($623.9 million). Why the sudden need for cash? According to Reuters, the five-year loan will be used in part to “finance software research and development (R&D) projects Nokia is undertaking during 2009-2011 to make Symbian-based smartphones more competitive.” More specifically, those R&D activities will “also benefit the work of the Symbian Foundation and its development of open-source software for mobile devices.” Sadly, that’s absolutely it for details, but we get the idea we’ll be hearing more about this soon. We hear you can accomplish some pretty wild goals with a half billion Euros.

Nokia signs €500 million loan for Symbian R&D originally appeared on Engadget on Thu, 19 Feb 2009 09:54:00 EST.

Mind reading gets closer to real thanks to Canadian scientists

Hate to break it to you, but that clairvoyant you’ve been paying daily to read you fortune cookies while blindfolded actually isn’t some sort of medium. Tough to swallow, we know. That said, researchers at Canada’s largest children’s rehabilitation hospital are getting closer to equipping entrepreneurial individuals with the tools they need to read minds. By measuring the intensity of near-infrared light absorbed in brain tissue, scientists were able to decode a person’s preference for one of two drinks with 80 percent accuracy, all without a single minute of training on the human’s behalf. This research holds promise for revealing the true feelings of those who can’t speak or move due to physical limitations, though there’s no word on how close it is to becoming viable outside of a lab. As an aside, we hear Professor X is pretty perturbed.

Mind reading gets closer to real thanks to Canadian scientists originally appeared on Engadget on Thu, 12 Feb 2009 10:12:00 EST.

Robot doctors join the fight against breast cancer

From Da Vinci robosurgeons to helpful nursebots, robots are becoming commonplace in hospitals the world over — and now researchers at Duke University have developed a rudimentary tabletop robot that uses 3D ultrasound technology to detect a ‘lesion’ in a simulated sponge breast, pinpoint its exact location, and perform a biopsy. All the calculations are performed by the device itself, using what has been described as “a basic artificial intelligence program.” The next step in the research will be an upgrade that takes the robotic arm from three-axis to six-axis capability, and a change from the old sponge-based simulated breast to one made from turkey breast, which approximates the density of human breast tissue. According to Stephen Smith, director of the Duke University Ultrasound Transducer Group, if things stay on track, robots will be performing routine breast exams and biopsies in five to ten years. Video after the break.

[Via PhysOrg]

Robot doctors join the fight against breast cancer originally appeared on Engadget on Wed, 11 Feb 2009 13:36:00 EST.

Regenerative shock absorbers developed by team at MIT

A team of undergrads at MIT — led by Shakeel Avadhany and Zack Anderson — has produced a prototype shock absorber that can harvest energy from bumps in the road and feed it back into the vehicle as electricity. The team claims that their prototype increases a vehicle’s fuel-efficiency by up to 10 percent by using a “hydraulic system that forces fluid through a turbine attached to a generator.” There is also an active electronic system that controls and optimizes the damping for a smoother ride than regular old shocks. The team is actively seeking to develop and commercialize the product, and has already seen interest in the prototype from the United States military and several manufacturers of trucks, which see the most benefit from the shocks… so look for these guys on Grave Digger any day now.

Regenerative shock absorbers developed by team at MIT originally appeared on Engadget on Tue, 10 Feb 2009 23:28:00 EST.

Contact Lens Could Bring TV Into Future Eyes

Miniaturization of circuits and displays could lead to televisions shrunk into contact lenses and powered by body heat, according to British futurologist Ian Pearson.

Channels could be changed using voice commands or gestures, Pearson told The Daily Mail. "You will just pop it into your eye in the morning and take it out at the end of the day," he said.

Pearson’s predictions are in contrast to how consumer electronics companies have been pushing bigger TVs. In the last few years, advances in plasma and LCD panel technologies, along with falling prices, have made it easy to buy TVs with screens up to 100 inches in size.

But Pearson believes that trend will change, and contact lenses that double as personal TV sets could be a reality within the next ten years.

Already scientists have taken the first steps towards making contact lenses more powerful and versatile, says LiveScience. Digital contact lenses that can zoom in on objects and display related information have been tested on rabbits for up to 20 minutes. Digital and programmable contact lenses will be the next big step, say futurists like Pearson.

Another possibility to advance your pleasure while watching Temptation Island or Flavor of Love? Digital tattoos that could let users feel the emotions of the actors on the show by provoking similar impulses in their bodies.

Photo: (fxp/Flickr)

Probabilistic logic makes microchip more energy efficient

We’ll be straight up with you — there’s a lot of fancy work going on with this one that laypeople will have a tough time grasping, but the long and short of it is this: a team from Rice University (including Krishna Palem) and Nanyang Technological University has created a microchip that “uses 30 times less electricity while running seven times faster than today’s best technology.” Already crying snake oil? Not so fast. By trashing the traditional set of mathematical rules (that’d be Boolean logic) and instead applying probabilistic logic, researchers have figured out how to deliver similar results with a fraction of the energy. The tech is being dubbed PCMOS (probabilistic CMOS), and could eventually end up in embedded systems and even cellphones. In the case of the latter, this type of chip will be able to display streaming video on a minuscule display with more artifacts than usual, but due to the small screen size and the human brain’s ability to piece together nearly-perfect images, the errors involved would be all but forgotten. Meanwhile, your battery bar would still be nearly full. We always heard there was beauty in imperfections — now, at long last, we finally get it.
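For a feel of why sloppy arithmetic can be tolerable, here's a toy Python sketch of the idea (not the actual PCMOS circuitry, just a stand-in in which the less significant output bits are allowed to flip more often, the way bits computed on a skimpier energy budget would):

```python
import random

def probabilistic_add(a, b, bits=8, base_error=0.2):
    """Toy model of probabilistic arithmetic: compute the correct sum, then
    let each output bit flip with a probability that falls off quickly with
    bit significance (standing in for low-voltage, energy-saving logic)."""
    exact = (a + b) & ((1 << bits) - 1)
    noisy = exact
    for i in range(bits):
        p = base_error / (2 ** i)   # low-order bits are far more error-prone
        if random.random() < p:
            noisy ^= (1 << i)
    return noisy

# Rough illustration: errors happen, but they land mostly in the low-order
# bits, so the average deviation from the exact answer stays small.
random.seed(0)
errs = []
for _ in range(10_000):
    a, b = random.randrange(128), random.randrange(128)
    errs.append(abs(probabilistic_add(a, b) - ((a + b) & 0xFF)))
print(f"mean absolute error: {sum(errs)/len(errs):.2f} out of a 0-255 range")
```

The real chip makes this tradeoff in hardware rather than software, but the shape is the same: the errors pile up in the bits that matter least, which is why a tiny video screen can hide them so well.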

Probabilistic logic makes microchip more energy efficient originally appeared on Engadget on Mon, 09 Feb 2009 06:09:00 EST.

Cognitive Computing Project Aims to Reverse-Engineer the Mind

Imagine a computer that can process text, video and audio in an instant, solve problems on the fly, and do it all while consuming just 10 watts of power.

It would be the ultimate computing machine, if only it were built with silicon instead of human nerve cells.

Compare that to current computers, which require extensive, custom programming for each application, consume hundreds of watts in power, and are still not fast enough. So it’s no surprise that some computer scientists want to go back to the drawing board and try building computers that more closely emulate nature.

"The plan is to engineer the mind by reverse-engineering the brain,"
says Dharmendra Modha, manager of the cognitive computing project at
IBM Almaden Research Center.

In what could be one of the most ambitious computing projects ever, neuroscientists, computer engineers and psychologists are coming together in a bid to create an entirely new computing architecture that can simulate the brain’s abilities for perception, interaction and cognition. All that, while being small enough to fit into a lunch box and consuming extremely small amounts of power.

The 39-year-old Modha, a computer science engineer born in Mumbai, India, has helped assemble a coalition of the country’s best researchers in a collaborative project that includes five universities, including Stanford, Cornell and Columbia, in addition to IBM.

The researchers’ goal is first to simulate a human brain on a supercomputer. Then they plan to use new nano-materials to create logic gates and transistor-based equivalents of neurons and synapses, in order to build a hardware-based, brain-like system. It’s the first attempt of its kind.

In October, the group bagged a $5 million grant from Darpa — just enough to get the first phase of the project going. If successful, they say, we could have the basics of a new computing system within the next decade. 

"The idea is to do software simulations and build hardware chips that would be based on what we know about how the brain and how neural circuits work," says Christopher Kello, an associate professor at the University of California-Merced who’s involved in the project.

Computing today is based on the von Neumann architecture, a design whose building blocks — the control unit, the arithmetic logic unit and the memory — are the stuff of Computing 101. But that architecture presents two fundamental problems: The connection between the memory and the processor can get overloaded, limiting the speed of the computer to the pace at which it can transfer data between the two. And it requires specific programs written to perform specific tasks.

In contrast, the brain distributes memory and processing functions throughout the system, learning through situations and solving problems it has never encountered before, using a complex combination of reasoning, synthesis and creativity.

"The brain works in a massively multi-threaded way," says Charles King, an analyst with Pund-IT, a research and consulting firm. "Information is coming through all the five senses in a very nonlinear fashion and it creates logical sense out of it."

The brain is composed of billions of interlinked neurons, or nerve cells that transmit signals. Each neuron receives input from 8,000 other neurons and sends an output to another 8,000. If the input is enough to agitate the neuron, it fires, transmitting a signal through its axon in the direction of another neuron. The junction between two neurons is called a synapse, and that’s where signals move from one neuron to another.
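That accumulate-until-threshold behavior is easy to caricature in code. Below is a minimal leaky integrate-and-fire neuron in Python; it's a textbook simplification for illustration, not anything from IBM's simulator:

```python
class LeakyIntegrateAndFireNeuron:
    """Textbook caricature of a spiking neuron: inputs accumulate on a
    'membrane potential' that leaks over time; crossing a threshold
    produces a spike and resets the potential."""

    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0
        self.threshold = threshold
        self.leak = leak  # fraction of potential retained each time step

    def step(self, synaptic_input):
        self.potential = self.potential * self.leak + synaptic_input
        if self.potential >= self.threshold:
            self.potential = 0.0   # reset after firing
            return 1               # spike sent down the axon
        return 0

# A neuron receiving weak input from many synapses fires only occasionally.
neuron = LeakyIntegrateAndFireNeuron()
inputs = [0.1, 0.2, 0.05, 0.4, 0.3, 0.1, 0.6, 0.05]
spikes = [neuron.step(x) for x in inputs]
print(spikes)  # prints [0, 0, 0, 0, 0, 0, 1, 0] for this input trace
```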

"The brain is the hardware," says Modha, "and from it arises processes such as sensation, perception, action, cognition, emotion and interaction." Of this, the most important is cognition, the seat of which is believed to reside in the cerebral cortex.

The structure of the cerebral cortex is the same in all mammals. So researchers started with a real-time simulation of a small brain, about the size of a rat’s, in which they put together simulated neurons connected through a digital network. It took 8 terabytes of memory on a 32,768-processor BlueGene/L supercomputer to make it happen.

The simulation doesn’t replicate the rat brain itself, but rather imitates just the cortex. Despite being incomplete, the simulation is enough to offer insights into the brain’s high-level computational principles, says Modha.

The human cortex has about 22 billion neurons and 220 trillion synapses, making it roughly 400 times larger than the rat-scale model. A supercomputer capable of running a software simulation of the human brain doesn’t exist yet. Researchers would need a machine with a computational capacity of at least 36.8 petaflops and a memory capacity of 3.2 petabytes — a scale that supercomputer technology isn’t expected to hit for at least three years.
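Those figures hang together on the back of an envelope, if we assume memory scales roughly linearly with the size of the model:

```python
# The article's own figures: the rat-scale cortex simulation needed 8 TB of
# memory, and the human cortex is described as roughly 400x larger.
rat_memory_tb = 8
scale_factor = 400   # human cortex vs. the rat-scale model, per the article

human_memory_pb = rat_memory_tb * scale_factor / 1000   # TB -> PB
print(f"~{human_memory_pb:.1f} PB")  # ~3.2 PB, matching the quoted requirement
```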

While waiting for the hardware to catch up, Modha is hoping some of the coalition’s partners can inch forward toward their own targets.

Software simulation of the human brain is just one half of the solution. The other half is to create a new chip design that mimics the neuron and synaptic structure of the brain.

That’s where Kwabena Boahen, associate professor of bioengineering at Stanford University, hopes to help. Boahen, along with other Stanford professors, has been working on implementing neural architectures in silicon.

One of the main challenges to building this system in hardware, explains Boahen, is that each neuron connects to others through 8,000 synapses. It takes about 20 transistors to implement a synapse, so building the silicon equivalent of 220 trillion synapses is a tall order, indeed.
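A quick tally using only the numbers quoted here shows why the economics look so bad:

```python
synapses = 220e12             # synapses in the human cortex, per the article
transistors_per_synapse = 20  # Boahen's estimate for a conventional CMOS synapse

total = synapses * transistors_per_synapse
print(f"{total:.1e} transistors")  # ~4.4e15, i.e. 4.4 quadrillion transistors
```

For comparison, the biggest single chips of that era carried on the order of a billion transistors, which is why Boahen's group wants a single nanoscale device to stand in for those 20 transistors.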

"You end up with a technology where the cost is very unfavorable," says Boahen. "That’s why we have to use nanotech to implement synapses in a way that will make them much smaller and more cost-effective."

Boahen and his team are trying to create a device smaller than a single transistor that can do the job of 20 transistors. "We are essentially inventing a new device," he says.

Meanwhile, at the University of California-Merced, Kello and his team are creating a virtual environment that could train the simulated brain to experience and learn. They are using the Unreal Tournament videogame engine to help train the system. When it’s ready, it will be used to teach the neural networks how to make decisions and learn along the way.

Modha and his team say they want to create a fundamentally different approach. "What we have today is a way where you start with the objective and then figure out an algorithm to achieve it," says Modha.

Cognitive computing is hoping to change that perspective. The researchers say they want to create an algorithm that will be capable of handling most problems thrown at it.

The virtual environment should help the system learn. "Here there are no instructions," says Kello. "What we have are basic learning principles so we need to give neural circuits a world where they can have experiences and learn from them."

Getting there will be a long, tough road. "The materials are a big challenge," says Kello. "The nanoscale engineering of a circuit that is programmable, extremely small and that requires extremely low power requires an enormous engineering feat."

There are also concerns that the $5 million Darpa grant and IBM’s largess (researchers and resources), while enough to get the project started, may not be sufficient to see it through to the end.

Then there’s the difficulty of explaining that mimicking the cerebral cortex isn’t exactly the same as recreating the brain. The cerebral cortex is associated with functions such as thought, computation and action, while other parts of the brain handle emotions, co-ordination and vital functions. These researchers haven’t even begun to address simulating those parts yet.

Also see:
Pentagon Begins Fake Cat Brain Project
IBM Joins in Pentagon Quest for Fake Cat Brains
DARPA: Fake Brains, ASAP
DARPA 2009: Brains-on-a-Chip, Transparent Displays

MIT concocts wearable “sixth sense” device, Bruce Willis is like “what?”

We’ve nothing but respect for the researchers, engineers and all-around brainiacs that call MIT home, but unless our minds are simply too feeble to grasp the connection here, we can’t figure out how this “sixth sense” device actually relates to one of Bruce Willis’ best known films. At any rate, what we do have here is a wearable device comprising a mobile projector, a webcam and a cellphone — a package that was thrown together for around $300. Once strapped on, signals from the webcam and projector are relayed to internet-connected smartphones in order to project data onto basically any backdrop. Somehow, the device can even “take photographs if a user frames a scene with his or her hands, or project a watch face with the proper time on a wrist if the user makes a circle there with a finger.” The actual hows and whys seem to be a mystery, but if we had just developed a gizmo as ripe for commercialization as this, we’d probably keep most of the secrets under wraps as well.

[Via Blorge]

MIT concocts wearable “sixth sense” device, Bruce Willis is like “what?” originally appeared on Engadget on Thu, 05 Feb 2009 13:32:00 EST.

IBM develops computerized voice that actually sounds human

If there’s one thing that still grates our nerves, it’s automated calling systems. Or, more specifically, the robotic beings that simply fail to understand our slang and incomprehensible rants. IBM’s working hard and fast to change all that, with a team at the company’s Thomas J Watson research division developing and patenting a computerized voice that can utter “um,” “er” and “yes, we’re dead serious.” The sophisticated system adds in the minutiae that makes conversation believable to Earthlings, and it’s even programmed to learn new nuances and react to phrases such as “shh.” The technology has been given the unwieldy name “generating paralinguistic phenomena via markup in text-to-speech synthesis,” and while exact end uses have yet to be discussed publicly, we can certainly imagine a brave new world of automated CSRs.

IBM develops computerized voice that actually sounds human originally appeared on Engadget on Mon, 02 Feb 2009 08:46:00 EST.