NVIDIA ticks budget boxes with the $229 GeForce GTX 660 and $109 GeForce GTX 650


NVIDIA’s had some trouble shaving its Kepler GPUs down to an entry-level price point, but it looks to have put the problem behind it with the new GeForce GTX 660 and GTX 650 graphics cards. The company’s ambition is to coax impoverished gamers still clinging to DirectX 9 (and, to a lesser extent, 10) hardware into switching up to this wallet-friendly pair of low-end units.

The 660 has been designed to be the “weapon of choice” for budget gamers. It’ll play most games at reasonably high settings, thanks to its 2GB of RAM, 960 CUDA cores and GPU Boost, which automatically overclocks the silicon according to the demands of your software. While we’ll wait for real-world benchmarks, the company expects four times the performance of the GeForce 9800 GT, claiming that games like Borderlands 2 and Guild Wars 2, at a resolution of 1,920 x 1,080, will play at frame rates of 51 fps and 41 fps with full 3D, respectively.

The 650 is the company’s self-proclaimed “gateway” into gaming, being the lowest-priced Kepler it’s planning to produce. Unlike the other cards in the range, it lacks GPU Boost, but the company left six-pin power on the card, giving card makers 64W of headroom to push the “good overclocker” 1GHz units all the way to 1.2GHz. It’s got 1GB of GDDR5 RAM and 384 CUDA cores, which will apparently handle even the newest games at mid-range levels of detail. The pair are available from today, with companies like Maingear and Origin already announcing discounted desktops for them to nestle inside.
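For a back-of-the-envelope sense of why that 64W of headroom matters: dynamic power in CMOS silicon scales roughly linearly with clock speed and with the square of voltage. Here's a minimal sketch with purely illustrative figures (the baseline wattage and voltages are our assumptions, not NVIDIA's specs):

```python
def scaled_power(p0, f0, f1, v0, v1):
    """Rule-of-thumb CMOS dynamic power scaling: P ~ C * V^2 * f."""
    return p0 * (f1 / f0) * (v1 / v0) ** 2

# Hypothetical baseline: a 64W card at 1.0GHz, pushed to 1.2GHz
# with a 5 percent voltage bump (illustrative numbers only).
print(round(scaled_power(64.0, f0=1.0, f1=1.2, v0=1.00, v1=1.05), 1))
# -> 84.7, i.e. a 20% overclock can cost roughly 30% more power,
# which is why leaving the six-pin connector on the board makes sense.
```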


NVIDIA ticks budget boxes with the $229 GeForce GTX 660 and $109 GeForce GTX 650 originally appeared on Engadget on Thu, 13 Sep 2012 09:00:00 EDT.


Apple: A6 chip in iPhone 5 has 2x CPU power, 2x graphics performance, yet consumes less energy


Every new iPhone needs a new engine, and Tim Cook has just made some bold claims about Apple’s latest silicon creation: the A6 processor. He hinted at a significant shrinkage in transistor size, allowing the chip to be 22 percent smaller than the A5 and hence more energy-efficient, while at the same time — he says — doubling all-round CPU and graphics capabilities. By way of practical benefits, the Apple CEO promises the Pages app will load up 2.1x faster than before, while Keynote attachments will hit the screen 1.7x faster. At this point we’re lacking any further detail about cores or clock speeds or indeed who actually fabricated the A6 (still Samsung, after all that bitterness?), but Apple does tend to be close-lipped on such things. In the meantime, bring on the benchmarks!


Apple: A6 chip in iPhone 5 has 2x CPU power, 2x graphics performance, yet consumes less energy originally appeared on Engadget on Wed, 12 Sep 2012 13:34:00 EDT.


Lucid Thunderbolt External GPU demoed for undeniable Ultrabook gaming boost

This week during the Intel Developer Forum, Lucid showed off a Thunderbolt external graphics solution that’ll have Ultrabook owners flipping over the possibilities. The hardware isn’t finalized, but the pitch is simple: plug in Lucid’s external unit with very little effort and get a level of graphics power that was never before possible from a package as thin as a current-generation Ultrabook. What Lucid demonstrated here is the ability, in the near future, to turn your Ultrabook into a real hardcore gaming machine.

When the setup was shown to Laptop Mag, Lucid’s prototype Thunderbolt graphics card produced impressive results. Running on Intel’s integrated HD Graphics 4000 on a standard Ivy Bridge motherboard, the test system managed 28 frames per second in the 3DMark06 benchmark; that’s without the new solution. With the Lucid external graphics card plugged in through the test system’s Thunderbolt port, great things happened.

With the Lucid-made system connected over Thunderbolt, in this case carrying an AMD Radeon HD 6700-series chip, 3DMark06 brought up a far healthier 89 fps. The system works extremely simply: plugging it in produces a moment of black screen, then the new GPU appears in Windows Device Manager under Display adapters. If you’re in the mood to disconnect it again, you’re simply force-quit out of whatever application you’re in, and the original Windows desktop is up and ready to continue on without it.
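That Device Manager behavior is easy to check programmatically, too. As a quick sketch, assuming a Windows box with the third-party wmi package installed (pip install WMI), this lists the same entries Device Manager shows under Display adapters:

```python
import wmi  # third-party package: pip install WMI (Windows only)

conn = wmi.WMI()
for gpu in conn.Win32_VideoController():
    # Each result maps to an entry under "Display adapters"; a
    # hot-plugged Thunderbolt GPU would show up as an extra row here.
    print(gpu.Name, gpu.DriverVersion)
```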

This external video card solution means the future of laptop-based gaming (or desktop-based gaming, for that matter) could bring upgrades as simple as plugging in a cord. No more screwdriver action for you if you don’t want it! Prices and release dates have not yet been revealed; stay tuned for more Lucid action!

[via Laptop Mag]


Lucid Thunderbolt External GPU demoed for undeniable Ultrabook gaming boost is written by Chris Burns & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.


NVIDIA Quadro K5000 GPU for Mac offers significant Premiere Pro performance boost, we go hands-on


NVIDIA just announced that its new Quadro K5000 GPU will be available on Mac Pros, offering 4K display compatibility and support for up to four displays, not to mention 4GB of graphics memory and about 2x faster performance than the Fermi-based Quadro 4000. While the Kepler-powered chip won’t actually hit Apple systems till later this year, we got a first look at the K5000 on a Mac here at IBC. NVIDIA demoed Adobe After Effects and Premiere Pro CS6 on a Mac Pro with dual K5000 GPUs.

As you’ll see in the video below, with 11 streams of 1080p video at 30 fps in Premiere Pro (and one overlay of the NVIDIA logo), GPU acceleration handles the workload seamlessly, letting us add effects in real time without any processing delay. Switching to software rendering mode in the editing program shows a night-and-day difference: video playback is extremely choppy, and processing moves at a crawl. Even with two K5000 chips in this desktop, Premiere Pro utilizes just one, but After Effects takes advantage of both GPUs. In this program, NVIDIA showed us ray-tracing, a computationally intensive 3D imaging feature, which only became available in After Effects with the release of CS6. Like in Premiere Pro, the program runs smoothly enough to let us edit images in real time. Take a look for yourself by heading past the break.
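The reason per-pixel effects scale so well on a GPU is that every pixel is independent, so each one can be handled by its own thread. This toy sketch (not Adobe's actual Mercury Playback Engine, just an illustration using the numba package's CUDA support) applies a brightness gain to a 1080p frame, one thread per pixel:

```python
import numpy as np
from numba import cuda  # pip install numba; needs an NVIDIA GPU and driver

@cuda.jit
def brightness(frame, gain, out):
    # One GPU thread per pixel; thousands run concurrently.
    x, y = cuda.grid(2)
    if x < frame.shape[0] and y < frame.shape[1]:
        out[x, y] = min(frame[x, y] * gain, 255.0)

frame = np.random.randint(0, 256, (1080, 1920)).astype(np.float32)
out = np.empty_like(frame)
threads = (16, 16)
blocks = ((frame.shape[0] + 15) // 16, (frame.shape[1] + 15) // 16)
brightness[blocks, threads](frame, 1.2, out)  # numba handles the copies
```

Real effects pipelines chain many such kernels per frame, which is why GPU acceleration versus software rendering looks like the night-and-day difference described above.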


NVIDIA Quadro K5000 GPU for Mac offers significant Premiere Pro performance boost, we go hands-on originally appeared on Engadget on Fri, 07 Sep 2012 06:39:00 EDT.


NVIDIA announces Quadro K5000 for Mac Pro, brings 4K support, 2x performance over Quadro 4000


NVIDIA’s Kepler-powered Quadro K5000 GPU will be making its way to Apple’s Mac Pro systems, the company announced today at IBC. The professional graphics card made its debut earlier this summer and is slated to ship beginning in October for $2,249. Timing for the Quadro K5000 for Mac isn’t quite so firm, with NVIDIA simply stating that it’ll ship “later this year,” though pricing is expected to be in line with the previously announced flavor. The next-gen GPU is said to offer up to 2x faster performance over the Quadro 4000, while also delivering 4K display support, quad display capability through two DVI-DL and two DisplayPort 1.2 connectors, and 4 gigs of graphics memory. Furthermore, each Mac Pro will be able to support up to two separate cards, should you need the extra power. You’ll find full details in the press release after the break.


NVIDIA announces Quadro K5000 for Mac Pro, brings 4K support, 2x performance over Quadro 4000 originally appeared on Engadget on Fri, 07 Sep 2012 04:00:00 EDT.


Adobe updates Photoshop Touch with support for iPad Retina display, bigger images

Adobe has updated its tablet-friendly version of Photoshop to v1.3, bringing along one change that’ll make new iPad owners very happy indeed. The interface and text have been up-rezzed to support the Retina display and 12-megapixel images, while everyone else can edit pictures of up to 10 megapixels in size. The company’s also throwing in two new effects — shred and colorize — plus new three-finger gestures to ease navigation and a raft of minor bug fixes. So, come on, let’s see what masterpieces your jam-smeared digits can create.


Adobe updates Photoshop Touch with support for iPad Retina display, bigger images originally appeared on Engadget on Thu, 06 Sep 2012 07:10:00 EDT.


NVIDIA working on Linux support for Optimus automatic graphics switching


Linux godfather Linus Torvalds may have a frosty relationship with NVIDIA, but that hasn’t stopped the company from improving its hardware’s support for the open-source operating system. In fact, the chipset-maker is working on the OS’ compatibility with its Optimus graphics switching tech, which would enable laptops to conserve power by swapping between discrete and integrated graphics on the fly. In an email sent to a developer listserv, NVIDIA software engineer Aaron Plattner revealed that he’s created a working proof of concept with a driver. There’s no word on when the Tux-loving masses may see Optimus support, but we imagine that day can’t come soon enough for those who want better battery life while gaming on their mobile machines.


NVIDIA working on Linux support for Optimus automatic graphics switching originally appeared on Engadget on Wed, 05 Sep 2012 06:29:00 EDT.


Engadget Primed: The crazy science of GPU compute

Primed goes in-depth on the technobabble you hear on Engadget every day — we dig deep into each topic’s history and how it benefits our lives. You can follow the series here. Looking to suggest a piece of technology for us to break down? Drop us a line at primed *at* engadget *dawt* com.


As you’re hopefully aware, this is a gadget blog. As a result, we’re innately biased towards stuff that’s new and preferably fandangled. More cores, more pixels, more lenses; just give it here and make us happy. The risk of this type of technological greed is that we don’t make full use of what we already have, and nothing illustrates that better than the Graphics Processing Unit. Whether it sits in our desktops, laptops, tablets or phones, the GPU is cruelly limited by its history — its long-established reputation as a dumb, muscular component that takes instructions from the main processor and translates them into pixels for us to gawp at.

But what if the GPUs in our devices had some buried genius — abilities that, if only we could tap into them, would yield hyper-realistic experiences and better all-round performance from affordable hardware? Well, the thing is, this hidden potential actually exists. We’ve been covering it since at least 2008 and, even though it still hasn’t generated enough fuss to become truly famous, the semiconductor industry is making more noise about it now than ever before.

So please, join us after the break as we endeavor to explain why the trend known as “GPU compute,” aka “general-purpose GPU (GPGPU),” or simply “not patronizing your graphics processor,” is still exciting despite having let us down in the past. We’ll try to show why it’s worth learning a few related concepts and terms to help provide a glossary for future coverage; and why, on the whole, your graphics chip is less Hasselhoff and more Hoffman than you may have imagined.
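To make that concrete before diving in: the “hello world” of GPU compute is vector addition, where each of thousands of GPU threads handles one element of an array. Here's a minimal sketch using Python's numba package (one of several routes in; CUDA C and OpenCL look structurally similar):

```python
import numpy as np
from numba import cuda  # pip install numba; needs an NVIDIA GPU and driver

@cuda.jit
def vadd(a, b, out):
    i = cuda.grid(1)   # this thread's global index
    if i < out.size:   # guard the ragged final block
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
out = np.zeros_like(a)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
vadd[blocks, threads_per_block](a, b, out)
assert np.allclose(out, a + b)  # same result as the CPU, just in parallel
```

The point isn't this trivial sum; it's that the same one-thread-per-element pattern scales to physics, image processing and the other workloads the rest of this primer covers.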


Engadget Primed: The crazy science of GPU compute originally appeared on Engadget on Mon, 20 Aug 2012 16:00:00 EDT.


NVIDIA GeForce GTX 660 Ti review roundup: impressive performance for around $300


No one’s saying that $300 is “cheap,” but compared to the GTX 670 and GTX 680 before it, the newly announced GeForce GTX 660 Ti is definitely in a more attainable category. The usual suspects have hashed out their reviews today, with the general consensus being one of satisfaction. A game-changer in the space it’s not, but this Kepler-based GPU managed to go toe-to-toe with similarly priced Radeon GPUs while being relatively power-efficient in the process. That said, AnandTech was quick to point out that unlike past Kepler reviews, the 660 Ti wasn’t able to simply blow away the competition; it found the card to perform around 10 to 15 percent faster than AMD’s 7870, while the 7950 put out roughly the same performance as the card on today’s test bench. HotHardware mentioned that NVIDIA does indeed have another winner on its hands, noting that it’d be tough to do better right now for three Benjamins. Per usual, there’s plenty of further reading from AnandTech, HotHardware and Tech Report for those seriously considering the upgrade.


NVIDIA GeForce GTX 660 Ti review roundup: impressive performance for around $300 originally appeared on Engadget on Thu, 16 Aug 2012 13:44:00 EDT.


NVIDIA GeForce GTX 660 Ti brings Kepler power with a cut on cost

It’s time for some 28nm Kepler action from NVIDIA with the GeForce GTX 660 Ti, bringing the next-generation graphics of the Kepler GPU architecture in at a surprisingly affordable price. With the $299 NVIDIA GeForce GTX 660 Ti you’ll be rolling out with a base clock speed of 915MHz and no fewer than two dual-link DVI outputs, plus HDMI and a DisplayPort 1.2 to boot. With 1,344 CUDA cores in the mix, you’re getting essentially the same package as the GTX 670 with a major chop in the cash cost.

With the GTX 660 Ti you’ve got what NVIDIA tells us is a 150W TDP, or down to 134W under “typical use”. NVIDIA also notes that the card’s 2GB video buffer runs on 192-bit GDDR5 RAM to keep it all humming nicely; that’s just below the GTX 670’s configuration, which is 256-bit GDDR5. Configurations will vary between brand releases, of course, but most will be popping up with a bit of a bonus: Borderlands 2!
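That bus-width gap translates directly into peak memory bandwidth: bytes per transfer times transfers per second. A quick sketch, assuming the roughly 6 GT/s effective GDDR5 data rate cards in this class shipped with (our assumption; check each board's actual memory clock):

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    """Peak memory bandwidth in GB/s: (bus width in bytes) * transfers/sec."""
    return bus_width_bits / 8 * data_rate_gtps

# Assumed ~6.0 GT/s effective GDDR5 rate for both cards (illustrative).
print(peak_bandwidth_gbs(192, 6.0))  # GTX 660 Ti: ~144 GB/s
print(peak_bandwidth_gbs(256, 6.0))  # GTX 670:    ~192 GB/s
```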

Have a peek at MAINGEAR’s release and you’ll see a couple of custom-built limited-edition PCs made with the 660 Ti, with Borderlands 2 included as well. You’ll find that NVIDIA is aiming to go up directly against the AMD Radeon HD 7870, also on the market now, with both of them coming in at $299 standard.

NVIDIA claims the GeForce GTX 660 Ti delivers anywhere between a 10 and 30 percent performance gain over the AMD Radeon HD 7870, and it’s taking on the slightly more expensive $350 AMD Radeon HD 7950 as well. NVIDIA let it be known that in tests against the 7950, its GTX 660 Ti came up with higher average frame rates, up to 20 percent higher in some cases, in games such as Max Payne 3.

And of course, each new generation of NVIDIA’s GPU line keeps getting better and better! Stay tuned for more GTX 660 Ti action throughout the day, and stick around as we test our own build on a custom-built MAINGEAR review unit soon as well. The NVIDIA GeForce GTX 660 Ti begins shipping today from all your favorite outlets, so get pumped!


NVIDIA GeForce GTX 660 Ti brings Kepler power with a cut on cost is written by Chris Burns & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All right reserved.