PS3 Slim is cheaper, yes, and new Cell processor makes it faster, maybe

If you’ve been on the fence about that new console purchase, then maybe this bit of information will push you over. Not only is the $299 PS3 Slim a skinnier version of its fat bro, it also features an upgraded Cell processor (jointly developed by IBM, Toshiba, and Sony) that, according to an IBM spokesman, uses the smaller, more efficient, and less costly 45-nm process first hinted at back in February of 2008. IBM doesn’t specify the clock speed. According to earlier accounts, the 45-nm Cell is 34 percent smaller and requires 40 percent less power than the original 65-nm processor. Any changes to the graphics in the PS3 Slim are still unknown — the GPU is simply listed as the NVIDIA RSX, just like the ol’ chubster before it. Nevertheless, the IDG News Service says the PS3 Slim “adds hardware enhancements that make it speedier.”
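For the arithmetically inclined, those percentages work out like this. This is a normalized sketch using only the figures cited above; the 100-unit baselines are placeholders, not real die-size or wattage numbers:

```python
# Normalized comparison of the 65-nm Cell vs. the 45-nm shrink, using only
# the percentages cited above (34 percent smaller, 40 percent less power).
# Treat the original chip's die area and power draw as 100 units each.
area_65nm, power_65nm = 100.0, 100.0

area_45nm = area_65nm * (1 - 0.34)    # "34 percent smaller"
power_45nm = power_65nm * (1 - 0.40)  # "40 percent less power"

print(f"45-nm die area: {area_45nm:.0f}% of the original")
print(f"45-nm power draw: {power_45nm:.0f}% of the original")
```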

What’s odd is that Sony didn’t make any claims of the PS3 Slim being faster at launch, and the “boost” in processing speed in the IDG article quoting IBM doesn’t seem to come from the IBM spokesman himself. As such, we’re not sure if this is just an improvement in performance-per-watt or something the gamer will actually notice during play. We’re still working on the review but rest assured, that’s one question we’re determined to answer.


PS3 Slim is cheaper, yes, and new Cell processor makes it faster, maybe originally appeared on Engadget on Thu, 20 Aug 2009 04:49:00 EST. Please see our terms for use of feeds.


DNA May Help Build Next Generation of Chips


In the race to keep Moore’s Law alive, researchers are turning to an unlikely ally: DNA molecules that can be positioned on wafers to create smaller, faster and more energy-efficient chips.

Researchers at IBM have made a significant breakthrough in their quest to combine DNA strands with conventional lithographic techniques to create tiny circuit boards. The breakthrough, which allows for the DNA structures to be positioned precisely on substrates, could help shrink computer chips to about a 6-nanometer scale. Intel’s latest chips, by comparison, are on a 32-nanometer scale.
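As a rough sense of what that shrink would buy, here is a naive geometric sketch. Real transistor density depends on far more than the headline feature size, so treat this as an order-of-magnitude illustration only:

```python
# Naive scaling estimate: the area of a minimum-size feature shrinks with
# the square of the linear feature size, so going from a 32-nanometer scale
# to a 6-nanometer scale would fit roughly (32/6)^2 more features in the
# same area.
current_scale_nm = 32.0
dna_assisted_scale_nm = 6.0

area_ratio = (current_scale_nm / dna_assisted_scale_nm) ** 2
print(f"Roughly {area_ratio:.0f}x more features per unit area")
```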

“The idea is to combine leading edge lithography that can offer feature size of 25 nanometers with some chemical magic to access much smaller dimensions,” says Robert Allen, senior manager of chemistry and materials at IBM Almaden Research. “This allows us to place nano objects with 6-nanometer resolution. You don’t have a hope of doing that with lithography today.”

To keep pace with Moore’s Law, which postulates that the number of transistors on an integrated circuit will double every two years, chip makers have to squeeze an increasing number of transistors onto every chip. One way to describe how densely transistors are packed is the smallest geometric feature that can be produced on a chip, usually designated in nanometers. Current lithographic techniques use either an electron beam or optics to etch patterns on chips in what is known as a top-down technique.
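The doubling rule above is easy to put into code. A small sketch of the projection (the one-billion-transistor starting point is an arbitrary example, not a figure from the article):

```python
# Moore's Law as stated above: transistor counts double every two years.
def projected_transistors(start_count: float, years: float) -> float:
    """Project a transistor count forward, doubling every 2 years."""
    return start_count * 2 ** (years / 2)

# Example: a 1-billion-transistor chip, projected a decade out.
# Ten years is five doubling periods, i.e. 2^5 = 32x growth.
print(f"{projected_transistors(1e9, 10):.2e} transistors")
```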

“You pattern, mask and etch material away,” says Chris Dwyer, assistant professor in the department of electrical and computer engineering at Duke University. “It is very easy to make big structures, but tough to create molecular-scale chips using this.” Dwyer compares it to taking a block of marble and chipping away at it to create the required pattern.

Newer techniques attempt to take small components and fuse them together into the required larger pattern, in what is called molecular self-assembly.

“What the IBM researchers have shown is a good demonstration of where top-down and bottom-up techniques meet,” says Dwyer.

At the heart of their research is an idea known as DNA origami. In 2006, Caltech researcher Paul Rothemund explained a method of creating nanoscale shapes and patterns using custom-designed strands of DNA. It involves folding a single long strand of viral DNA and smaller ‘staple’ strands into various shapes. The technique has proven very fruitful, enabling researchers to create self-assembling nano machines, artworks and even tiny bridges.

Greg Wallraff, an IBM research scientist working on the project, says the technique has a lot of potential for creating nano circuit boards. But the biggest challenge so far has been getting the DNA origami nanostructures to align precisely on a wafer. Researchers hope the DNA nanostructures can serve as scaffolds, or miniature circuit boards, for components such as carbon nanotubes, nanowires and nanoparticles.

“If the DNA origami is scattered around on a substrate, it makes them difficult to locate and use to connect to other components,” says Wallraff. “These components are prepared off the chip, and the origami structure would let you assemble them on the chip.”

The advance is important for the kind of work Dwyer and his colleagues at Duke have been doing. They see IBM’s breakthrough as laying the groundwork for their research on molecular sensors. “With this development we can look to integrate the sensors onto a chip and help build hybrid systems,” says Dwyer.

Still, there are some big hurdles to clear before circuit boards based on DNA nanostructures can hit commercial production. Researchers have to achieve extremely precise alignment, with no room for error.

Even with the latest demonstration of alignment techniques, there is still some angular dispersion, points out Dwyer.

“If you put a transistor down on a circuit board, there is no dispersion,” says Dwyer. “Our computing systems cannot deal with that kind of randomness.”

That’s why commercial production of chips based on the DNA origami idea could be anywhere from five years to a decade away, says Allen.

“If you are going to take something from the bench-top scale to a fab, there are enormous barriers,” he says. “You really need to understand the mechanisms of defect generation. What we don’t want to imply is that this is ready to go into a factory and make Star Trek–like chips.”

Photo: Low concentrations of triangular DNA origami bind to wide lines on a lithographically patterned surface.
Courtesy IBM.


IBM studying ‘DNA origami’ to build next-gen microchips, paralyze world with fear

IBM is already making a beeline to 28nm process technology, but it looks like the train may deviate a bit before it even reaches the bottom. Reportedly, the company responsible for PowerPC, the original business laptop and all sorts of underground things that we’ll never comprehend is now looking to use DNA as a model for crafting the world’s next great processor. DNA origami, as it’s so tactfully called, can supposedly provide a cheap framework “on which to build tiny microchips,” with IBM research manager Spike Narayan proclaiming that this is “the first demonstration of using biological molecules to help with processing in the semiconductor industry.” Sir Spike also noted that “if the DNA origami process scales to production-level, manufacturers could trade hundreds of millions of dollars in complex tools for less than a million dollars of polymers, DNA solutions, and heating implements.” The actual process still seems murky from here, but we’re told to expect real results within ten years. Which should be just in time for the robot apocalypse to really hit its stride — awesome.

[Via HotHardware]


IBM studying ‘DNA origami’ to build next-gen microchips, paralyze world with fear originally appeared on Engadget on Mon, 17 Aug 2009 10:51:00 EST. Please see our terms for use of feeds.


Water-cooled Aquasar supercomputer does math, heats dorm rooms

Not that we haven’t seen this trick pulled before, but there’s still something magical about the forthcoming Aquasar. Said supercomputer, which will feature two IBM BladeCenter servers in each rack, should be completed by 2010 and reach a top speed of ten teraflops. Such a number pales in comparison to the likes of IBM’s Roadrunner, but it’s the energy factor here that makes it a star. If all goes well, this machine will suck down just 10KW of energy, while the average power consumption of a supercomputer in the top 500 list is 257KW. The secret lies in the new approach to chip-level water cooling, which will utilize a “fine network of capillaries” to bring the water dangerously close to the processors without actually frying any silicon. While it’s crunching numbers, waste heat will also be channeled throughout the heating system at the Swiss Federal Institute of Technology, giving students and dorm room crashers a good feel for the usefulness of recycled warmth.


Water-cooled Aquasar supercomputer does math, heats dorm rooms originally appeared on Engadget on Thu, 25 Jun 2009 05:27:00 EST. Please see our terms for use of feeds.


Water-Cooled Supercomputer Doubles as Dorm Space Heater


Massive supercomputers that devour electricity to keep them humming are not exactly the poster children for green technology. But IBM hopes to change that with its plans to build a supercomputer that will use water to keep the system cool and even recycle some of the waste heat to help heat the university where it’s housed.

The technology could lead to a reduction in overall energy consumption by at least 40 percent, when compared to similar air-cooled machines, says the company.

“Energy is arguably the number one challenge humanity will be facing in the 21st century,” says Dimos Poulikakos, lead investigator of the project. “We cannot afford anymore to design computer systems based on the criterion of computational speed and performance alone.”

Supercomputers are used in defense research labs such as Argonne National Laboratory, in space research by NASA and at universities for scientific research, all applications with a nearly insatiable demand for processing power. The new supercomputer, called Aquasar, will be housed at the Swiss Federal Institute of Technology (ETH) Zurich and will have a top speed of 10 teraflops. (A teraflop is a trillion floating point operations per second, a measure of computing capacity.) While that’s a lot of computing power — a Core 2 Duo processor is capable of about 20 gigaflops, or 1/500 the speed of Aquasar — it’s a fraction of what some of the fastest supercomputers today can deliver. For instance, IBM’s Blue Gene/L supercomputer, which ranks among the fastest machines on the Top 500 list, has a peak speed of 596 teraflops. Meanwhile, IBM has moved on to create its first supercomputer in Europe capable of one petaflop, or one thousand trillion operations per second.
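The performance figures above are easy to sanity-check with a few lines of arithmetic, all expressed in flops:

```python
# Sanity-checking the performance figures quoted above (all in flops).
aquasar = 10e12       # 10 teraflops
core_2_duo = 20e9     # ~20 gigaflops
blue_gene_l = 596e12  # 596 teraflops peak
one_petaflop = 1e15   # one thousand trillion operations per second

print(aquasar / core_2_duo)   # a Core 2 Duo is 1/500th of Aquasar
print(blue_gene_l / aquasar)  # Blue Gene/L is roughly 60x Aquasar
print(one_petaflop / aquasar) # a petaflop machine is 100x Aquasar
```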

Keeping these massive machines running isn’t as much of a challenge as keeping them in an optimal temperature band. Aquasar, however, hopes to offer more bang for the buck in terms of its energy consumption. Many of the chips used in supercomputing systems dissipate about ten times as much heat as a typical kitchen hotplate, says Thomas Brunschwiler, a researcher at IBM Zurich Research Lab. For optimal performance, the chips must be kept below 185 degrees Fahrenheit (85 degrees Celsius).

Accomplishing that much cooling across a huge data center means a significant strain on electricity consumption. Researchers estimate that about 50 percent of an average air-cooled data center’s energy consumption stems from powering the cooling systems to keep the processors from overheating. Reducing that would be a big step towards energy efficiency.
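A rough illustration with assumed round numbers (not IBM's actual accounting) shows why the "at least 40 percent" figure is plausible:

```python
# Assumed round numbers, not IBM's accounting: if roughly half of an
# air-cooled data center's energy goes to cooling, and water cooling cuts
# that overhead to a small residual for pumps and plumbing, the overall
# savings approach the "at least 40 percent" cited above, before even
# counting the value of the reused waste heat.
compute_share = 50.0      # percent of total energy spent on computation
air_cooling_share = 50.0  # percent spent on air-cooled HVAC
water_cooling_share = 5.0 # assumed residual overhead for pumps, etc.

old_total = compute_share + air_cooling_share
new_total = compute_share + water_cooling_share
savings_pct = (old_total - new_total) / old_total * 100
print(f"~{savings_pct:.0f}% overall reduction")
```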

The power consumption of one rack of the Aquasar will be around 10 KW, IBM officials say. By comparison, the Blue Gene/P supercomputer consumes about 40 KW of power per rack, and the average power consumption of a supercomputer in the top 500 list is 257 KW. Aquasar, set to be commissioned in 2010, will have two IBM BladeCenter servers in each rack.

Aquasar’s breakthrough lies in how it has successfully managed chip level water cooling, says Brunschwiler.

“One way to do it is to cool the air in a data center to 40 degrees Celsius (104 degrees Fahrenheit), which means air conditioning units that take space and energy,” he says. “Or you can use liquid cooling to get there.”

In the Aquasar system, high-performance micro-channel coolers are attached directly to the back side of each processor. Inside them, cool water is distributed through a fine network of capillaries that spreads across the back of the chip.

It’s different from the water-cooled modules used in other supercomputers, says Brunschwiler. Water cooling on a module level brings the liquid between the processors, but not right up against them via micro capillaries.

“The breakthrough in our special package design lies in how we can bring the water as close as possible to the chips without letting it affect the chips’ performance,” says Brunschwiler.

The water-cooled supercomputer will require only a small amount of water for cooling, just about 2.64 gallons. A pump keeps the water flowing at a rate of roughly 7.9 gallons per minute.
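Those two figures imply a brisk turnover for such a small volume of coolant, which a quick calculation makes concrete:

```python
# Quick check on the coolant figures above: with about 2.64 gallons in the
# loop and a pump moving roughly 7.9 gallons per minute, the entire water
# volume circulates through the system in about 20 seconds.
loop_volume_gal = 2.64
flow_gal_per_min = 7.9

cycle_time_s = loop_volume_gal / flow_gal_per_min * 60
print(f"Full circulation roughly every {cycle_time_s:.0f} seconds")
```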

For overall efficiency, the entire cooling system is a closed circuit. The heated water from the chips is cooled as it passes through a passive heat exchanger, and the removed heat is recycled. In this case, it is channeled into the university’s heating system.

“Heat is a valuable commodity that we rely on in our everyday lives,” says Bruno Michel, manager at IBM’s Zurich Research Laboratory. “If we capture and transport the waste heat from the active components in a computer system as efficiently as possible, we can reuse it as a resource.”

Photo: Aquasar/IBM Research


Nanometer wars heat up, Toshiba and Intel enter unofficial race

Think the megapixel race is bad? Now we’ve another to worry about, with both Toshiba and Intel hastily chasing ever-smaller process nodes in order to make chips faster, more nimble and smaller. According to undisclosed sources at Digitimes, Intel has actually canned production plans for its 45nm Havendale processors, which were originally slated to slip into machines later this year. The cause? It’s heading straight to 32nm, reportedly hoping to ship its Clarkdale line in Q1 2010 with entry-level prices ranging from $60 to $190. In related news, Toshiba is joining the likes of IBM, Samsung and Globalfoundries in an effort to dish out chips based on 28nm process technology. Needless to say, the move is being made in an effort to “stay relevant in an area dominated by the likes of Intel Corp and Texas Instruments.” Now, if only we could get one of these potent, low-power chips inside of a netbook, we’d be pleased as punch.

Read – Intel cans Havendale in move to 32nm
Read – Toshiba speeds to 28nm


Nanometer wars heat up, Toshiba and Intel enter unofficial race originally appeared on Engadget on Mon, 22 Jun 2009 10:12:00 EST. Please see our terms for use of feeds.


NEC and Toshiba hop on IBM’s Semiconductor Alliance train for the ride to 28nm

IBM seems seriously intent on beating Intel to the tiny, 28nm processor punch, and has enlisted even more help to get there first. After securing deals with Samsung, Globalfoundries, and a few other merry chipmakers in April, NEC and Toshiba are now joining in on the Semiconductor Alliance fun to create next-generation processors before the biggest name in current-generation processors does. Goals are smaller footprints, lower power consumption, and of course greater performance. Mind you, that greater performance is still likely two years away from anything we can hope to buy.


NEC and Toshiba hop on IBM’s Semiconductor Alliance train for the ride to 28nm originally appeared on Engadget on Thu, 18 Jun 2009 07:39:00 EST. Please see our terms for use of feeds.


WinFast HPVC1100 is world’s first external SpursEngine encoder

Toshiba’s Cell-based SpursEngine HD video co-processor has made plenty of appearances within monstrous gaming machines, but this marks the very first time it has stepped out of the laptop chassis and into a portable enclosure. Granted, the language barrier is killing us here, but it seems as if the Leadtek WinFast HPVC1100 wraps a SpursEngine encoder into an on-the-go solution that can be lugged around with a standard laptop in order to churn through video while on set, in the field or on the road. Other specs include 128MB of RAM, a PCI-Express slot and a weight of 1.54 pounds; there’s no word just yet on pricing or availability. One more shot is after the break.

[Via Akihabara News]



WinFast HPVC1100 is world’s first external SpursEngine encoder originally appeared on Engadget on Tue, 12 May 2009 08:53:00 EST. Please see our terms for use of feeds.


Apple Quietly Recruits Chip Designers for In-House Tech


Apple’s recent hiring spree of chip designers reveals the company may be about to exert even more control over the components that go into its products.

The company may go so far as manufacturing computer processors in-house, according to The Wall Street Journal, which cites only anonymous sources to bolster its claim that the internally designed chips will appear in products no sooner than 2010.

The publication also cites profiles on the professional networking site LinkedIn, which list more than 100 Apple employees with past chip expertise at companies such as Intel, Samsung and Qualcomm.

These recruitments, coupled with Apple’s 2008 acquisition of P.A. Semi, serve as strong evidence that the company is moving toward chip design for its hardware, including iPhones and iPods and possibly Macs. Such a move would reduce Apple’s dependence on Intel, which manufactures processors for current Mac computers, and Samsung, which provides an ARM-based microprocessor for the iPhone.

Apple has always kept a tight rein on its suppliers, going so far as acquiring them when necessary to ensure consistent access to critical components. Apple has enough clout that it was even able to negotiate with Intel — a far bigger company — to develop a smaller version of the Core 2 Duo processor for the MacBook Air.

By acquiring in-house semiconductor talent, Apple opens several options: It could more easily customize chips and chipsets from suppliers like Intel, giving Apple hardware unique features (and perhaps raising additional, hardware-based barriers to hackintosh clones — generic PCs running OS X). It could develop its own graphics processors for the iPhone and iPod touch, giving them more serious gaming chops. It could create more compact system-on-a-chip processors that would enable future iPhones (or iPhone-like devices) to be even smaller. Or, in the most ambitious case, it could develop its own CPUs.

In November, Wired.com also speculated that Apple was moving toward in-house chip manufacturing when the company hired former IBM executive Mark Papermaster. Papermaster was a key player in developing the PowerPC chips used in previous-generation Macs.

With control over processor production, Apple will be able to design exclusive features for its gadgets and better guard its secrets from rivals.

Though in-house chip manufacturing would enable Apple to tighten control over its products, technology strategist Michael Gartenberg said it’s unlikely the corporation will produce its own processors for Mac computers. He explained the move would be risky for Apple, as it would cost billions of dollars, and it would be difficult to compete with Intel.

“People have lost fortunes competing with Intel,” Gartenberg said. “It doesn’t make sense [for Apple]. You’d have to get to a point where Intel simply wasn’t able to meet Apple’s needs in any shape or form.”

Rather than producing computer chips, it’s more likely Apple is hiring chip designers to produce custom chipset variants for future products, which could offer special audio and graphic enhancements exclusive to Apple gadgets, Gartenberg speculated. He added that chip experts can also lend advice on manufacturing and design processes to create smaller, thinner and lighter gadgets.

Updated 12 p.m. PDT: Added comment from technology strategist Michael Gartenberg.

See Also:

In Major Shift, Apple Builds Its Own Team to Design Chips [WSJ]

Photo: blakie/Flickr


IBM’s Watson to rival humans in round of Jeopardy!

IBM’s already proven that a computer from its labs can take on the world’s best at chess, but what’ll happen when the boundaries of a square-filled board are removed? Researchers at the outfit are obviously excited to find out, today revealing that its Watson system will be pitted against brilliant Earthlings on Jeopardy! in an attempt to further artificial intelligence when it comes to semantics and searching for indexed information. Essentially, the machine will have to be remarkably nimble in order to understand “analogies, puns, double entendres and relationships like size and location,” something that robotic linguists have long struggled with. There’s no mention of a solid date when it comes to the competition itself, but you can bet we’ll be setting our DVRs whenever it’s announced. Check out a video of the progress after the break.

[Via The New York Times]



IBM’s Watson to rival humans in round of Jeopardy! originally appeared on Engadget on Mon, 27 Apr 2009 09:56:00 EST. Please see our terms for use of feeds.
