AMD FirePro 2460 Multi-View: four Mini DisplayPort sockets, 13W, no frills

It’s no Radeon, but AMD’s new range of FirePro GPUs might just strike a chord with a few of you multi-monitor maniacs. Just a few short weeks after the debut of the FirePro V8800, AMD has launched the ATI FirePro V7800, ATI FirePro V5800, ATI FirePro V4800, and ATI FirePro V3800, all of which are aimed at helping digital content creators, well, create content. Frankly, those pro-oriented cards don’t do a lot for us, but the FirePro 2460 Multi-View most certainly does. Boasting a low-profile (half-height) form factor, this relatively simple (read: not for hardcore gaming) card packs 512MB of video memory, hardware acceleration of DirectX 11, an average power draw of just 13 watts and not two, not three, but four video outputs. AMD tells us it was designed for day traders who need four displays to accurately watch their stock prices fluctuate, but we can think of quite a few others who’d benefit from having four Mini DisplayPort sockets on a single, low-power card. All of the devices mentioned here should begin shipping today, with the 2460 in particular demanding a reasonable $299.


AMD FirePro 2460 Multi-View: four Mini DisplayPort sockets, 13W, no frills originally appeared on Engadget on Mon, 26 Apr 2010 17:13:00 EST. Please see our terms for use of feeds.

Via: Hot Hardware | Source: AMD

NVIDIA Verde to sync up desktop and laptop GPU driver releases, generate smiles galore

Good news, mobile gamers — NVIDIA’s looking out for you and yours, and if you’re tired of lobbying Congress about the inequities between driver releases for desktop GPUs and driver releases for mobile GPUs, you can finally move on to some other just cause. NVIDIA’s Verde driver program has been a relative success over the years, but it’s about to become a lot more gnarly when the company outs its 256 Series drivers in a few months. At that time, NVIDIA plans to “completely unify its GPU drivers, so mobile and desktop users will be able to get the latest releases simultaneously.” Users won’t find the desktop and laptop drivers in the same package, but we’re sure each one will be clearly marked on the download page. It’s worth noting, however, that these unified releases will only work with laptops featuring discrete GPUs, hybrid solutions utilizing NVIDIA-branded IGPs, and Optimus-enabled machines; rigs with multi-vendor solutions (like the Alienware M11x, which uses an integrated set from Intel) won’t be allowed to join the party.

In related news, the upcoming release of the 197.16 driver for laptops will bring along support for external displays with 3D Vision, enabling 3D Vision-ready laptops to pipe 3D content to 3D Vision-ready LCDs with ease. Good news all around, but you’ll have to give those links below a visit if you’re hungry for more.

NVIDIA Verde to sync up desktop and laptop GPU driver releases, generate smiles galore originally appeared on Engadget on Mon, 26 Apr 2010 15:42:00 EST. Please see our terms for use of feeds.

Source: Hot Hardware

New Graphics Tech Promises Speed, Hyperrealism


Chipmakers have spent billions of dollars over the decades to create specialized processors that can help make computer graphics ever more realistic and detailed.

Now an Australian hobbyist says he has created a technology that can churn out high-quality, computer-generated graphics for video games and other applications without the need for graphics chips or processor-hungry machines.

“Major companies have got to a point where they improve the polygon-count in graphics-rendering by 22 percent a year,” says Bruce Dell, 32, the creator of the new technology, which he calls Unlimited Detail. “We have made it unlimited. It’s all software that requires no special hardware, so you get truly unlimited detail in your scenes.”

Dell is an unusual candidate for a computer-graphics revolutionary. He’s an autodidact who’s never been to a university and who ran a supermarket chain for about eight years.

But he claims to have found a way to search through trillions of voxels, the 3-D counterparts to pixels, to render a scene quickly. Voxels have so far been used largely in medical- and mining-graphics applications, not video games.

Bringing voxel-based rendering to the world of video games is an interesting idea, says Jon Peddie, founder of Jon Peddie Research. That’s because voxels could take a middle ground between two current rendering techniques: the fast but not graphically realistic world of polygon rendering (used by most video games today) and computationally resource-hungry and comparatively slow ray-tracing technology.

“With voxels, you create a volume of points and look at those points to see what the picture is all about,” says Peddie. “That gives a very accurate representation of the world you are trying to render, without taking up too much computational resources.”

Creating lifelike images through graphics-rendering usually requires major computing power. To recreate three-dimensional objects on a computer screen, programmers define a structure in terms of its geometry, texture, lighting and shading.

The resultant digital image is an approximation of a real-life object, but it has a computer-generated feel to it. It also requires intensive computing power, which means graphics programmers must have state-of-the-art machines with special chips from companies such as Nvidia and AMD.

In most 3-D graphics-modeling programs, the virtual depiction of almost every real-life object, such as a tree or a stone, starts as a little flat polygon. More powerful processors let the software render more of these polygons, which means increased roundness to the objects on screen. With enough computing power, billions of little polygons can be generated, each made so small that it’s almost a dot.
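The idea that more polygons mean rounder objects can be sketched numerically: approximate a unit circle with a regular N-sided polygon, and the gap between the polygon's perimeter and the true circumference shrinks as N grows. A minimal illustration, not tied to any particular renderer:

```python
import math

def polygon_perimeter(n_sides):
    """Perimeter of a regular n-gon inscribed in a unit circle."""
    # Each side subtends an angle of 2*pi/n, so each chord
    # has length 2*sin(pi/n).
    return n_sides * 2 * math.sin(math.pi / n_sides)

# As the polygon count rises, the approximation error falls toward zero.
true_circumference = 2 * math.pi
for n in (6, 60, 600, 6000):
    err = true_circumference - polygon_perimeter(n)
    print(f"{n:>5} sides -> perimeter error {err:.8f}")
```

The same principle drives GPU tessellation: throw enough tiny triangles at a curved surface and the facets stop being visible.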

Another alternative is to use ray tracing, a method in which the computer traces the path of light through space, simulating the effect on the light as it encounters different objects. That approach creates much more visually attractive scenes, but it is extremely intensive in its need for computational resources.
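The light-path simulation described above is built on intersection tests; the classic starting point is the ray-sphere test below. This is a minimal, illustrative sketch (a single geometric primitive, nowhere near a full ray tracer):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None on a miss.
    `direction` is assumed to be a unit vector."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t,
    # a quadratic in t with a == 1 for a unit direction.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None                   # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2    # nearer of the two roots
    return t if t > 0 else None       # a hit behind the origin doesn't count

# A ray cast down the z-axis hits a unit sphere centered 5 units away at t = 4.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```

A real ray tracer runs tests like this for every pixel against every object, then recurses for reflections and shadows, which is where the computational cost explodes.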

Dell says Unlimited Detail has an alternative to these systems. It uses billions of “point cloud” dots, or voxels, to accurately represent a world. To render an image, Unlimited Detail then acts as a search engine.

Dell says his algorithm can quickly figure out the dots needed to render a scene, search the data to find only those points, and pull them up quickly enough for smooth animation. He calls it “mass connected processing.”

“Instead of putting a trillion dots on screen and covering the ones you don’t use, we show only what needs to be done and how you can manipulate those dots,” says Dell.
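Dell hasn't disclosed how "mass connected processing" actually works, but the search-engine framing suggests something in the family of spatial indexing: store the point cloud keyed by quantized coordinates, so a query touches only the cells it needs rather than scanning the whole cloud. The sketch below is purely our speculation (a simple hash grid, not Dell's algorithm):

```python
from collections import defaultdict

CELL = 1.0  # cell edge length -- an arbitrary choice for this sketch

def cell_key(x, y, z):
    """Quantize a point to the integer grid cell that contains it."""
    return (int(x // CELL), int(y // CELL), int(z // CELL))

class PointCloudIndex:
    """Hash grid over a point cloud: lookups touch only the queried cell."""

    def __init__(self):
        self.cells = defaultdict(list)

    def insert(self, point):
        self.cells[cell_key(*point)].append(point)

    def query_cell(self, x, y, z):
        # Fetch just the points in one cell -- the rest of the
        # (potentially enormous) cloud is never scanned.
        return self.cells.get(cell_key(x, y, z), [])

index = PointCloudIndex()
index.insert((0.2, 0.3, 0.1))
index.insert((0.9, 0.1, 0.5))
index.insert((5.0, 5.0, 5.0))
print(len(index.query_cell(0.5, 0.5, 0.5)))  # -> 2
```

Production voxel engines typically use hierarchical structures (sparse voxel octrees) instead of a flat grid, but the payoff is the same: lookup cost scales with what the camera sees, not with the total number of points stored.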

It’s all so new that Dell, who claims to have single-handedly written the software, is still in the process of forming a company.

So how legitimate are his claims? It’s hard to evaluate. Few graphics programmers or industry analysts have actually seen his software at work. Dell says those who have are bound by tight nondisclosure agreements limiting their ability to talk about it.

And graphics chip makers such as Nvidia are not impressed.

“Voxel graphics have been around for quite some time, but they are not considered to be as precise as polygon-based graphics,” says Ken Brown, a spokesperson for Nvidia.

Graphics rendered using voxels can run on less-resource-hungry machines, but they can’t offer the same level of quality as ray tracing or rasterization, he says.

“With voxels, there are issues that come up with shading and coloring the images properly,” says Brown. “If you look at the screenshots that Unlimited Detail has posted, the images don’t look all that realistic.”

Some of those problems can be ameliorated by using better tools, but it can’t be done by a one-man band, say Brown and Peddie.

“There needs to be an infrastructure around every new rendering technique,” says Brown. “There have to be SDKs, tools and drivers, and these are things that teams of people from many different companies come together to create.”

As for claims that Unlimited Detail can do real-time graphics rendering on a machine with a single-core processor and no graphics card, Nvidia people say they’re skeptical. Searching through trillions of points of data would require large amounts of RAM (random access memory), and Dell isn’t sharing any details on how his algorithm deals with that problem.

Even if Dell can validate his claims, it could be years before graphics programmers start using the voxel-based technique that Dell is advocating, says Peddie.

“It will be evolutionary, rather than revolutionary, because there are too many entrenched systems and legacy files to be managed,” he says. “Anybody who is making graphics-creation software like Adobe, Autodesk and Maya will have to change their way of doing things. That’s a pretty big thing to change.”

Major companies such as Microsoft and HP also have patents around voxels, and if Dell wants to go professional, he’ll have to make sure he’s not infringing on the work of other researchers.

“The jury is still out on this idea,” says Peddie. “But Bruce Dell seems real, very sincere, and the idea looks solid.”

To preview Dell’s technology, check out his video:


Photo: Unlimited Detail Rendered Artwork


NVIDIA GeForce GTX 480 set up in 3-way SLI, tested against Radeon HD 5870 and 5970

Not many mortals will ever have to worry about choosing between a three-way GeForce GTX 480 SLI setup, an equally numerous Radeon HD 5870 array, or a dual-card HD 5970 monstrosity, but we know plenty of people who would care about who the winner might be. Preliminary notes here include the fun facts that a 1-kilowatt PSU provided insufficient power for NVIDIA’s hardware, while the mighty Core i7-965 test bench CPU proved to be a bottleneck in some situations. Appropriately upgraded to a six-core Core i7-980X and a 1,200W power supply, the testers proceeded to carry out the sacred act of benchmarking the snot out of these superpowered rigs. We won’t spoil the final results of the bar chart warfare here, but rest assured both camps score clear wins in particular games and circumstances. The source link shall reveal all.

NVIDIA GeForce GTX 480 set up in 3-way SLI, tested against Radeon HD 5870 and 5970 originally appeared on Engadget on Tue, 20 Apr 2010 04:16:00 EST. Please see our terms for use of feeds.

Source: Hardware.info

ATI Radeon HD 5870 Eyefinity 6 Edition review roundup: novel, but not for everyone

We’ve been fortunate enough to spend a bit of time with an Eyefinity setup before, but up until now, it’s been somewhat of a hassle to get a fully functional six-screen setup into a consumer’s home. Today, AMD is taking the legwork out of the equation with the introduction of the Radeon HD 5870 Eyefinity 6 Edition, a standalone GPU with 2GB of GDDR5 memory and innate support for pushing a half-dozen panels at once. Outside of that, it’s essentially the same card that we saw last September, and based on the cadre of reviews that we rounded up, the doubled memory bank doesn’t do much to boost frame rates. What it does do, however, is enable six-screen gaming. Unfortunately (though understandably), this type of gaming scenario is only meant for a select segment of users, and many critics found the novelty wearing off exceptionally quickly. In fact, it wasn’t long before NeoSeeker became fed up with the bezels ruining the experience, and just about everyone agreed that you needed to sit a good half-mile away to really enjoy it. Either way, we’d encourage you to hit up Hot Hardware‘s collection of videos before biting the bullet, buying up an extra five LCDs and then regretting it for the rest of your Earthly life.

Read – Hot Hardware
Read – AnandTech
Read – NeoSeeker
Read – Rage3D
Read – PC Perspective
Read – TweakTown
Read – FiringSquad
Read – Tom’s Hardware
Read – ExtremeTech
Read – Hexus

ATI Radeon HD 5870 Eyefinity 6 Edition review roundup: novel, but not for everyone originally appeared on Engadget on Wed, 31 Mar 2010 10:36:00 EST. Please see our terms for use of feeds.

Source: AMD

CyberPower, Digital Storm and Maingear add NVIDIA Fermi GPUs to flagship gaming PCs

Origin PC kicked things off on Friday by shoving NVIDIA’s latest and greatest into its Genesis desktop, and now a few more in the custom PC game have upped the ante by offering a similarly delectable taste of Fermi. NVIDIA’s new GeForce GTX 470 and 480 have been all the rage over the weekend, and if those raucous benchmarks have you convinced that the time to buy is now, a trio of system builders are here vying for your attention. Digital Storm’s Black|OPS rig can now be ordered with a GTX 480 (starts at $2,891), while CyberPower is giving prospective customers the ability to add the latest Fermi GPUs into a smattering of towers. Maingear’s formidable SHIFT supercomputer is also seeing the update, but it’s really asking for trouble with a triple GTX 480 configuration that demands a minimum investment of $6,199. In related news, ASUS, Zotac and a slew of other GPU makers are cranking out new boards based on the minty fresh core, so you shouldn’t have a difficult time finding one if the rest of your rig is a-okay for now.

CyberPower, Digital Storm and Maingear add NVIDIA Fermi GPUs to flagship gaming PCs originally appeared on Engadget on Mon, 29 Mar 2010 01:57:00 EST. Please see our terms for use of feeds.

Source: Digital Storm, Maingear, CyberPower, Zotac, ASUS

Origin PC stuffs 4.4GHz Core i7-980X, Fermi-based GTX 470 and 480 into Genesis desktop

Hope you didn’t just pull the trigger on a new Origin PC Genesis, else you’ll be forced to know that your rig was made obsolete in record time. Okay, maybe not obsolete, but there’s precisely no doubt that you’d rather be rocking a new Fermi card than whatever you’ve got now. Right on cue, NVIDIA has launched its latest pair of powerhouse graphics cards, and as of right now, prospective Origin PC buyers can opt for either the GTX 470 or GTX 480 on the Genesis desktop. Better still, you can buy ’em in single, dual or triple SLI configurations, and in case you’re down for paying the premium, a 4.4GHz overclocked Core i7-980X Extreme Edition CPU can sit alongside it (or them).

Origin PC stuffs 4.4GHz Core i7-980X, Fermi-based GTX 470 and 480 into Genesis desktop originally appeared on Engadget on Fri, 26 Mar 2010 19:00:00 EST. Please see our terms for use of feeds.

Source: Origin PC

Samsung’s Galaxy S has four times the polygon power of Snapdragon


When we got some hands-on time with the recently announced Samsung Galaxy S, it was painfully apparent that the thing has some serious power under the hood. Now we have a better idea of just how much power, with reports indicating that it has the graphics oomph (thanks to its PowerVR SGX540 GPU) to push 90 million triangles per second. Compare that to the Snapdragon platform, which manages 22 million polygons, and the iPhone 3GS’s 28 million from the earlier SGX535, and you get a feel for the muscle lurking behind that gorgeous Super AMOLED screen. Of course, polygon counts aren’t everything when it comes to graphical power these days, and 90 million triangles won’t help you if your handset gets laggy after you install every single Bejeweled clone in the Android Market, but forgive us if we’re a little excited about the rapidly brewing mobile GPU war.
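The "four times" in the headline comes straight from the quoted throughput figures; a quick back-of-the-envelope check:

```python
# Reported peak triangle throughput, triangles per second.
galaxy_s = 90_000_000    # PowerVR SGX540, as reported
snapdragon = 22_000_000  # Snapdragon platform, as reported
iphone_3gs = 28_000_000  # earlier PowerVR SGX535, as reported

print(f"vs Snapdragon: {galaxy_s / snapdragon:.1f}x")   # -> 4.1x
print(f"vs iPhone 3GS: {galaxy_s / iphone_3gs:.1f}x")   # -> 3.2x
```

Peak-rate figures like these are vendor numbers under ideal conditions, so treat the ratios as rough relative standings rather than real-world performance.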

[Thanks, Robert]

Samsung’s Galaxy S has four times the polygon power of Snapdragon originally appeared on Engadget on Fri, 26 Mar 2010 06:41:00 EST. Please see our terms for use of feeds.

Source: Android and Me

NVIDIA to get official with Fermi GPUs, will ‘more than double the performance’ of existing cards

It’s sure taken ’em long enough, but the Wall Street Journal is reporting that NVIDIA will finally allow the long-awaited Fermi design to reveal itself to the world. We’re guessing that the GeForce GTX 470 and GTX 480 that we’ve been hearing (and hearing) about will be the flagship GPUs to get launched, but whatever the case, the WSJ assures us that the new line will “more than double the performance of its current products.” As you’d expect, the Fermi cards — which will ship with 480 or 446 cores (depending on the model), three billion transistors and a whole heap of expectations — will support 3D titles along with the latest video processing software, but they’ll also be aimed at more unconventional markets like “medical research and oil-field exploration.” Sounds gnarly, NVIDIA, but we’re just interested in seeing our frame rates hit triple digits in Crysis 2 — got it?

NVIDIA to get official with Fermi GPUs, will ‘more than double the performance’ of existing cards originally appeared on Engadget on Wed, 24 Mar 2010 20:37:00 EST. Please see our terms for use of feeds.

Source: Wall Street Journal

NVIDIA GeForce GTX 480 and 470 specs and pricing emerge

We’re only a week away from their grand unveiling, but already we’ve got word of the specs for NVIDIA’s high-end GTX 480 and GTX 470 cards. Priced at $499, the 480 will offer 480 shader processors, a 384-bit interface to 1.5GB of onboard GDDR5 RAM, and clock speeds of 700MHz, 1,401MHz, and 1,848MHz for the core, shaders and memory, respectively. The 470 makes do with 446 SPs, slower clocks, and a 320-bit memory interface, but it’s also priced at a more sensible $349. The TDPs of these cards are pretty spectacular too, with 225W for the junior model and 295W for the full-fat card. Sourced from VR Zone, these numbers are still unofficial, but they do look to mesh well with what we already know of the hardware, including a purported 5-10 percent benchmarking advantage for the GTX 480 over ATI’s HD 5870. Whether the price and power premium is worth it will be up to you and the inevitable slew of reviews to decide.
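If the quoted 1,848MHz memory figure is the GDDR5 I/O clock (so an effective data rate of 3,696 MT/s, transferring on both clock edges — our assumption, since the leak doesn't say which clock it means), the 480's peak memory bandwidth works out as follows:

```python
# Back-of-the-envelope bandwidth from the leaked GTX 480 specs.
bus_width_bits = 384   # memory interface width from the leak
mem_clock_mhz = 1848   # quoted memory clock
ddr_factor = 2         # assumption: data moves on both edges of this
                       # clock, i.e. 3,696 MT/s effective

bytes_per_transfer = bus_width_bits / 8
bandwidth_gbps = bytes_per_transfer * mem_clock_mhz * ddr_factor * 1e6 / 1e9
print(f"{bandwidth_gbps:.1f} GB/s")  # -> 177.4 GB/s
```

The same arithmetic on the 470's narrower 320-bit bus (and its slower clocks) explains a good chunk of the price gap between the two cards.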

[Thanks, Sean]

NVIDIA GeForce GTX 480 and 470 specs and pricing emerge originally appeared on Engadget on Fri, 19 Mar 2010 04:31:00 EST. Please see our terms for use of feeds.

Source: VR Zone