ATI Radeon Eyefinity unveiled: up to six monitors on a single card

At a press event today the gang at AMD unleashed its newest graphics technology on the world. To be incorporated in the next generation of ATI Radeons, Eyefinity can rock up to six displays (DisplayPort, DVI, HDMI, etc.) with a single card, thanks to a new 40nm graphics chip that packs 2 billion transistors and is capable of 2.5 trillion calculations every second. Monitors can be configured as either one contiguous display or six separate ones, and the card can create images of up to 268 megapixels. That means, according to VentureBeat, that it will deliver games with “12 times the high-definition resolution.” The gang at Hot Hardware, who report that the new graphics cards will come with either three or six display outs, put a prototype through its paces and found that playing Left 4 Dead on three 30-inch displays “absolutely changes the experience for the better.” No word yet on a release date, but apparently Acer, Dell, HP, MSI and Toshiba already have Eyefinity notebooks in the works. We’ll take two! More shots after the break.
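
For what it’s worth, the “12 times” claim checks out if we assume six 2560 x 1600 panels (our assumption; the configuration isn’t spelled out above):

$$6 \times 2560 \times 1600 \approx 24.6\ \text{MP}, \qquad \frac{24.6\ \text{MP}}{2.07\ \text{MP (1080p)}} \approx 11.9 \approx 12.$$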

Read – AMD introduces a graphics chip that can power six computer displays at once
Read – AMD Eyefinity Multi-Display Technology In Action

ATI Radeon Eyefinity unveiled: up to six monitors on a single card originally appeared on Engadget on Thu, 10 Sep 2009 19:03:00 EST. Please see our terms for use of feeds.

AMD announces ‘VISION’ guide to buying PCs

Well, Intel may have been on a bit of a processor rebranding kick as of late, but it looks like AMD is now trying to one-up ’em in a fairly big way with its new “VISION” branding strategy, which promises to do nothing short of change the way people buy PCs — or so AMD hopes. The short of it is that AMD is looking to take the focus off the processor and instead connect “the needs of the consumer to the PC,” which, of course, calls for some new logos. As you can see above, new AMD-based PCs (starting with laptops and extending to desktops early next year) will now be branded primarily as either Vision, Vision Premium, or Vision Ultimate, with the processor and other specs tucked away for folks who want to go looking for them. Not ones to keep things too simple, AMD will also later introduce a Vision Black edition for “high-end, top of the line systems” which, ironically, are aimed mostly at folks primarily concerned with specs.

[Via Technologizer]

AMD announces ‘VISION’ guide to buying PCs originally appeared on Engadget on Thu, 10 Sep 2009 16:46:00 EST. Please see our terms for use of feeds.

AMD Tigris and Congo mobile platforms focus on multimedia, longer battery life

Stop the presses! AMD has kept to its roadmap. Alright, start the presses up again. The Tigris laptop platform, announced today, is all set to become AMD’s “mainstream” weapon of choice, its headline features being full 1080p playback, DirectX 10.1 support and the offloading of video encoding to the Radeon HD 4200 GPU. Add in the new 45nm dual-core Caspian CPUs, with speeds ranging up to 2.6GHz, and AMD claims a substantial 42 percent improvement in multimedia performance to go along with 25 percent longer battery life. Alas, that’ll still only net you an hour and 55 minutes of “active use” and just under five hours at idle, according to AMD. Congo, offering the same HD video and DX10.1 support, does a little better at two hours and 26 minutes of active use, thanks to the HD 3200 graphics and dual-core Neo chips inside. That’ll hardly trouble Intel’s CULV range of marathon runners, but then Intel’s processors don’t pack quite as much grunt. AMD’s own Pat Moorhead got to test drive laptops based on the two new platforms and was enraptured by their raw, snarling power. Of course, he would be. The majority of OEMs have signed up for this party, with models expected to arrive in time for the release of Windows 7.
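
Taken at face value, those two AMD numbers let you back out the previous generation’s baseline (our arithmetic, not a figure AMD quoted):

$$\frac{115\ \text{min active use}}{1.25} = 92\ \text{min} \approx 1\ \text{hour, } 32\ \text{minutes}.$$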

[Via TG Daily]

Read – Tigris processors
Read – Pat Moorhead tests Tigris laptop
Read – Congo features
Read – Pat Moorhead tests Congo laptop

AMD Tigris and Congo mobile platforms focus on multimedia, longer battery life originally appeared on Engadget on Thu, 10 Sep 2009 05:04:00 EST. Please see our terms for use of feeds.

MSI’s AMD-powered U210 up for pre-order, still not ‘official’

Who needs press releases? You can snap up an MSI U210 pre-order right this second on Amazon, so why bother waiting for MSI to actually confirm the thing for a Stateside release? Morality. That’s why. Kids these days think they can just drop $430 on any old Athlon Neo MV-40-powered (the same chip at the heart of HP’s dv2) 12-inch WXGA ultraportable with 2GB of RAM, a 250GB HDD and 802.11n, and not have to suffer the consequences. Well, we’re not standing for it. That read link right below? Not an implied approval of these illicit activities.

[Via Mark’s Technology News]

MSI’s AMD-powered U210 up for pre-order, still not ‘official’ originally appeared on Engadget on Sat, 29 Aug 2009 13:33:00 EST. Please see our terms for use of feeds.

MSI X-Slim X610 leaked, reviewed by Russians

If the gang at 3D News are to be believed (and why not?), this familiar-looking notebook isn’t MSI’s X-Slim X600 at all, but the not-yet-announced X-Slim X610. And if a leaked ultraportable isn’t enough excitement for you, wait’ll we tell you that they actually got their hands on one of these beauts and gave it the full-on review treatment. As you’d expect from a machine that shares its chassis, specs, ATI Mobility Radeon HD 4330 graphics, 250GB hard drive, 4GB of RAM, and all but one digit of its name with the original, there is not too much to report. The major difference is that the X610 forgoes Intel’s 1.4GHz SU3500 CPU in favor of a 1.6GHz AMD Athlon MV-40, which results in some slower benchmarks, though not enough that you’d readily notice in everyday use. And then there is battery life — the new guy clocks in at slightly less than two hours, or around 20 percent less than the X600. Same machine, same specs, poorer performance — not really a step in the right direction, MSI. Perhaps you can at least give consumers a break on the price?
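
Working backwards (our rounding: call “slightly less than two hours” 120 minutes), that puts the X600 at roughly

$$\frac{120\ \text{min}}{1 - 0.20} = 150\ \text{min} = 2.5\ \text{hours}.$$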

[Via SlashGear]

MSI X-Slim X610 leaked, reviewed by Russians originally appeared on Engadget on Wed, 19 Aug 2009 09:57:00 EST. Please see our terms for use of feeds.

Windows Media Center is set to thrill at CEDIA 2009 next month

Everyone likes to try to predict the future, and with the Custom Electronic Design & Installation Association (CEDIA) show only a month away, the crew at Engadget HD threw all of their crazy ideas out there for your reading pleasure. For the most part the predictions center on Windows Media Center and how it will integrate with other products like the Zune HD, digital cable and HD satellite services, but there are some other fun things thrown in. We really believe that this is going to be the year that Redmond brings everything together, so if you’re the type who doesn’t think it’ll ever happen, then click through to find out why we think you’re wrong. Either way, you can expect we’ll be on the scene in Atlanta to check out what’s new firsthand.

Windows Media Center is set to thrill at CEDIA 2009 next month originally appeared on Engadget on Mon, 10 Aug 2009 11:17:00 EST. Please see our terms for use of feeds.

ATI Stream goes fisticuffs with NVIDIA’s CUDA in epic GPGPU tussle

It’s a given that the GPGPU (or general-purpose graphics processing unit) has a long, long way to go before it can make a dent in the mainstream market, but given that ATI was talking up Stream nearly three whole years ago, we’d say a battle royale between it and its biggest rival was definitely in order. As such, the benchmarking gurus over at PC Perspective saw fit to pit ATI’s Stream and NVIDIA’s CUDA technologies against one another in a knock-down-drag-out for the ages, essentially looking to see which system took the most strain off the CPU during video encoding and which produced more visually appealing results. We won’t bother getting into the nitty-gritty (that’s what the read link is for), but we will say this: in testing, ATI’s contraption relieved the most stress from the CPU, though NVIDIA’s alternative pumped out the higher quality output. In other words, you can’t win for losin’.

ATI Stream goes fisticuffs with NVIDIA’s CUDA in epic GPGPU tussle originally appeared on Engadget on Mon, 10 Aug 2009 08:57:00 EST. Please see our terms for use of feeds.

AMD’s integrated 785G graphics platform review roundup

It’s mildly hard to believe that AMD’s DirectX 10-compatible 780 Series motherboard GPU was introduced well over a year ago now, but the long-awaited successor has finally landed. This fine morning, a gaggle of hardware sites around the web have taken a look at a number of AMD 785G-equipped mainboards, all of which boast integrated Radeon HD 4200 GPUs, support for AMD’s AM3 processors and a price point that’s downright delectable (most boards are sub-$100). Without getting into too much detail here in this space, the general consensus seems to be that the new platform is definitely appreciated, but hardly revolutionary: it doesn’t demolish the marks set by the 780G, nor does it put NVIDIA’s GeForce 9300 to shame. What it can do, however, is provide better-than-average HD playback, making it a prime candidate for basic desktop users and even HTPC builders. For the full gamut of opinions, grab your favorite cup of joe and get to clickin’ below.

Read – HotHardware review
Read – The Tech Report review
Read – Tom’s Hardware review
Read – PC Perspective review
Read – Hardware Zone review
Read – Hexus review

AMD’s integrated 785G graphics platform review roundup originally appeared on Engadget on Tue, 04 Aug 2009 05:29:00 EST. Please see our terms for use of feeds.

Dell adds high-powered ATI FirePro M7740 graphics to the Precision M6400

We’ve always lusted after Dell’s high-zoot Precision M6400 mobile workstation, and now we’ve got yet another reason to save all these nickels and dimes in the sock drawer: the company’s adding AMD’s new ATI FirePro M7740 graphics processor to the mix. The new chip is due to be announced tomorrow at SIGGRAPH 2009, and like the rest of the FirePro line, it’ll offer 1GB of GDDR5 frame buffer memory, 30-bit DisplayPort and dual-link DVI output, and tons of CAD application certifications. We’re looking for hard specs and prices now; we’ll let you know as soon as we get ’em.

Dell adds high-powered ATI FirePro M7740 graphics to the Precision M6400 originally appeared on Engadget on Mon, 03 Aug 2009 19:31:00 EST. Please see our terms for use of feeds.

Personal Supercomputers Promise Teraflops on Your Desk

About a year ago John Stone, a senior research programmer at the University of Illinois, and his colleagues found a way to bypass the long waits for computer time at the National Center for Supercomputing Applications.

Stone’s team got “personal supercomputers,” compact machines with a stack of graphics processors that together pack quite a punch and can be used to run complex simulations.

“Now instead of taking a couple of days and waiting in a queue, we can do the calculations locally,” says Stone. “We can do more and better science.”

Personal supercomputers are available in many flavors, built as clusters of CPUs, of graphics processing units (GPUs), or of both. But it is GPU computing that is gaining in popularity for its ability to offer researchers easy and quick access to raw computing power. That’s opening up a new market for makers of GPUs, such as Nvidia and AMD, which have traditionally focused on high-end video cards for gamers and graphics pros.

True supercomputers, the rock stars of computing, are capable of quadrillions of calculations per second. But they can be extremely expensive — the fastest supercomputer of 2008, IBM’s RoadRunner, cost $120 million — and access to them is limited. That’s why smaller versions, no bigger than a typical desktop PC, are becoming a hit among researchers who want massive processing power along with the convenience of having a machine at their own desk.

“Personal supercomputers that can run off a 110-volt wall circuit allow for a significant amount of performance at a very reasonable price,” says John Fruehe, director of business development for server and workstation at AMD. Companies such as Nvidia and AMD make the graphics chips that personal supercomputer resellers assemble into personalized configurations for customers like Stone.

Demand for these personal supercomputers grew at an average of 20 percent every year between 2003 and 2008, says research firm IDC. Since Nvidia introduced its Tesla personal supercomputer less than a year ago, the company has sold more than 5,000 machines.
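
If that 20 percent figure compounds annually (our reading of the IDC stat, not IDC’s own math), it implies demand grew roughly

$$1.20^{5} \approx 2.5\times$$

over the five-year span.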

“Earlier when people talked about supercomputers, they meant giant Crays and IBMs,” says Jie Wu, research manager for technical computing at IDC. “Now it is more about having smaller clusters.”

Today, most U.S. researchers at universities who need access to a supercomputer have to submit a proposal to the National Science Foundation, which funds a number of supercomputer centers. If the proposal is approved, the researcher gets an account good for a certain number of CPU hours at one of the major supercomputing centers, such as those in San Diego, Illinois or Pittsburgh.

“It’s like waiting in line at the post office to send a message,” says Stone. “Now you would rather send a text message from your computer than wait in line at the post office to do it. That way it is much more time efficient.”

Personal supercomputers may not be as powerful as the mighty mainframes, but they are still leagues above their desktop cousins. For instance, a four-GPU Tesla personal supercomputer from Nvidia can offer 4 teraflops of parallel supercomputing performance with 960 cores and two Intel Xeon 5500 Series Nehalem processors. That’s just a fraction of the IBM RoadRunner’s 1 petaflop speed, but it’s enough for most researchers to get the job done.
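
Those system-level numbers square with the Tesla C1060’s published per-card specs of 240 cores and roughly a teraflop of single-precision throughput:

$$4 \times 240 = 960\ \text{cores}, \qquad 4 \times {\sim}1\ \text{TFLOPS} \approx 4\ \text{TFLOPS} = 0.4\%\ \text{of a petaflop}.$$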

For researchers, this means the ability to run calculations faster than they can with a traditional desktop PC. “Sometimes researchers have to wait for six to eight hours before they can have the results from their tests,” says Sumit Gupta, senior product manager at Nvidia. “Now the wait time for some has come down to about 20 minutes.”

It also means that research projects that typically would never have gotten off the ground because they were deemed too costly, too resource-intensive and too time-consuming now get the green light. “The cost of making a mistake is much lower and a lot less intimidating,” says Stone.

The shift away from large supercomputers to smaller versions has also made research more cost effective for organizations. Stone, who works in a group that develops software used by scientists to simulate and visualize biomolecular structures, says his lab has 19 personal supercomputers shared by 30 researchers. “If we had what we wanted, we would run everything locally because it is better,” says Stone. “But the science we do is more powerful than what we can afford.”

The personal supercomputing idea has also gained momentum thanks to the emergence of programming languages designed especially for GPU-based machines. Nvidia has been trying to educate programmers and build support for CUDA, the C-based programming environment created specifically for parallel programming on the company’s GPUs. Meanwhile, AMD has thrown its support behind OpenCL (Open Computing Language), an industry-standard parallel programming framework, this year. Nvidia says it also works with developers to support OpenCL.
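
To give a flavor of the programming model, here is a minimal CUDA sketch of the data-parallel style these environments encourage: one GPU thread per array element. It is our illustrative example, not code from any project mentioned here:

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// One thread per element: the data-parallel pattern GPUs excel at.
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;                  // one million elements
    const size_t bytes = n * sizeof(float);

    // Host-side input and output buffers.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers; data crosses the PCIe bus explicitly.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %.1f\n", hc[0]);         // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```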

Stone says the rise of programming environments for high-performance machines has certainly made them more popular. And while these portable powerhouses can do a lot, there is still a place for the large supercomputers. “There are still the big tasks for which we need access to the larger supercomputers,” says Stone. “But it doesn’t have to be for everything.”

Photo: John Stone sits next to a personal supercomputer, a quad-core Linux PC with 8GB of memory and three GPUs (one NVIDIA Quadro FX 5800 and two NVIDIA Tesla C1060s), each with 4GB of GPU memory. Credit: Kirby Vandivort