NVIDIA ticks budget boxes with the $229 GeForce GTX 660 and $109 GeForce GTX 650

NVIDIA announces its lowest-priced Kepler cards: the $229 GeForce GTX 660 and the $109 GeForce GTX 650

NVIDIA’s had some trouble shaving its Kepler GPUs down to an entry-level price point, but it looks to have put the problem behind it with the new GeForce GTX 660 and GTX 650 graphics cards. The company’s ambition is to coax impoverished gamers clinging to DirectX 9 (and, to a lesser extent, DirectX 10) into switching up to this wallet-friendly pair of low-end units.

The GTX 660 has been designed to be the “weapon of choice” for budget gamers. It’ll play most games at reasonably high settings, thanks to its 2GB of RAM, 960 CUDA cores and GPU Boost, which automatically overclocks the silicon according to the demands of your software. While we’ll wait for real-world benchmarks, the company expects four times the performance of the GeForce 9800 GT, claiming that games like Borderlands 2 and Guild Wars 2 will run at 1,920 x 1,080 with full 3D at 51fps and 41fps, respectively.

The GTX 650 is the company’s self-proclaimed “gateway” into gaming, being the lowest-priced Kepler it’s planning to produce. Unlike the other cards in the range, it lacks GPU Boost, but the company kept a six-pin power connector on the board, giving card makers 64W of headroom to push the “good overclocker” 1GHz units all the way to 1.2GHz. It’s got 1GB of GDDR5 RAM and 384 CUDA cores, which should apparently handle even the newest games at mid-range levels of detail. The pair is available from today, with companies like Maingear and Origin already announcing discounted desktops for them to nestle inside.


Engadget Primed: The crazy science of GPU compute

Primed goes in-depth on the technobabble you hear on Engadget every day — we dig deep into each topic’s history and how it benefits our lives. You can follow the series here. Looking to suggest a piece of technology for us to break down? Drop us a line at primed *at* engadget *dawt* com.


As you’re hopefully aware, this is a gadget blog. As a result, we’re innately biased towards stuff that’s new and preferably fandangled. More cores, more pixels, more lenses; just give it here and make us happy. The risk of this type of technological greed is that we don’t make full use of what we already have, and nothing illustrates that better than the Graphics Processing Unit. Whether it sits in our desktops, laptops, tablets or phones, the GPU is cruelly limited by its history — its long-established reputation as a dumb, muscular component that takes instructions from the main processor and translates them into pixels for us to gawp at.

But what if the GPUs in our devices had some buried genius — abilities that, if only we could tap into them, would yield hyper-realistic experiences and better all-round performance from affordable hardware? Well, the thing is, this hidden potential actually exists. We’ve been covering it since at least 2008 and, even though it still hasn’t generated enough fuss to become truly famous, the semiconductor industry is making more noise about it now than ever before.

So please, join us after the break as we endeavor to explain why the trend known as “GPU compute,” aka “general-purpose computing on GPUs (GPGPU),” or simply “not patronizing your graphics processor,” is still exciting despite having let us down in the past. We’ll try to show why it’s worth learning a few related concepts and terms, which should also serve as a glossary for future coverage; and why, on the whole, your graphics chip is less Hasselhoff and more Hoffman than you may have imagined.
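To make the idea a little more concrete before we dive in, here’s a minimal CUDA sketch — our own illustrative example, not code from NVIDIA or anyone quoted in this piece — of what “general-purpose” work on a graphics chip looks like. The hypothetical squareKernel below squares a million numbers, with each GPU thread handling a single element; no pixels or triangles are involved.

```cuda
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// A deliberately tiny "GPU compute" kernel: every thread squares one
// element of the input array. No pixels, no triangles -- just math.
__global__ void squareKernel(const float *in, float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[i] * in[i];
}

int main()
{
    const int n = 1 << 20;                      // ~1 million floats
    const size_t bytes = n * sizeof(float);

    // Host-side data
    std::vector<float> h_in(n), h_out(n);
    for (int i = 0; i < n; ++i)
        h_in[i] = static_cast<float>(i);

    // Device-side buffers
    float *d_in = nullptr, *d_out = nullptr;
    cudaMalloc((void **)&d_in, bytes);
    cudaMalloc((void **)&d_out, bytes);
    cudaMemcpy(d_in, h_in.data(), bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover the whole array
    const int threadsPerBlock = 256;
    const int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    squareKernel<<<blocks, threadsPerBlock>>>(d_in, d_out, n);
    cudaDeviceSynchronize();

    // Copy the results back and spot-check one value
    cudaMemcpy(h_out.data(), d_out, bytes, cudaMemcpyDeviceToHost);
    printf("out[3] = %.1f (expected 9.0)\n", h_out[3]);

    cudaFree(d_in);
    cudaFree(d_out);
    return 0;
}
```

Compile it with nvcc and it should run on any CUDA-capable card, Kepler included. The broader pattern — copy data to the GPU, launch thousands of lightweight threads, copy the results back — is the same one you’ll find behind frameworks like OpenCL and DirectCompute.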
