NVIDIA GeForce GTX 650 Ti arrives to fill $149 GPU gap

NVIDIA has revealed its latest graphics option, the GeForce GTX 650 Ti, a $149 video card targeting upgraders looking for the company’s Kepler cleverness. Promising five times the performance of the GeForce 9600 GT, the GTX 650 Ti supports DirectX 11 and Full HD 1080p, and NVIDIA is even throwing in a free game for those who go shopping soon.

Buy a GeForce GTX 650 Ti-based video card from one of NVIDIA’s participating retailers, and you’ll get a copy of Assassin’s Creed III free. The card itself packs 768 CUDA cores, along with a 925MHz GPU clock and 64 texture units, though there’s no access to NVIDIA’s GPU Boost system.

There’s also 1GB of GDDR5 memory, a 128-bit memory bus and a 105W TDP. It’s actually based on the same Kepler GK106 that powers NVIDIA’s GTX 660, though pared back somewhat on the specs so as to bring the price down to sit between that card and the GTX 550 Ti.

Connectivity includes a single 6-pin power connector, along with two dual-link DVI ports and a mini HDMI on NVIDIA’s reference design. However, the GPU itself supports up to four displays, though it’s up to manufacturers themselves to equip their versions with the right connectivity.

ASUS, EVGA, Gainward, KFA2 (Galaxy), Gigabyte, Inno3D, MSI, Palit, PNY, Point of View, and Zotac will all be pushing out video cards based on the GeForce GTX 650 Ti GPU, with availability from today.


NVIDIA GeForce GTX 650 Ti arrives to fill $149 GPU gap is written by Chris Davies & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.


AMD enters Windows 8 tablet fray with Z-60 chip: ‘all-day’ battery life, graphics ‘you would never expect’


If you know AMD mainly for its laptop and desktop processors, then some readjustment may be in order: as of now, the company is rushing headlong into the market created by Surface fever and the need for ultra-lean tablet chips that can handle Windows 8. Specifically, we’re looking at the official launch of the Z-60, formerly known as Hondo, which AMD says will arrive in tablets “later this year” and satisfy even our most unreasonable demands for Windows 8 hybrids that last 10+ hours in tablet mode and which turn into full-scale PCs when docked. And if you’re thinking that Intel made much the same pledge during its recent Clover Trail announcement, then you’re dead right — we actually have all the makings here of a proper old-fashioned chip fight. Read on for a spot of pre-match banter.



AMD enters Windows 8 tablet fray with Z-60 chip: ‘all-day’ battery life, graphics ‘you would never expect’ originally appeared on Engadget on Tue, 09 Oct 2012 00:01:00 EDT. Please see our terms for use of feeds.


AMD Trinity APU overclocked at 7.3 GHz, kept cool with liquid nitrogen


AMD’s Trinity APUs have only been in the wild for a few days, but some have already taken on the challenge of pushing the new desktop silicon to its limits. By giving the A10-5800K model 1.956 volts, disabling two of its cores and cooling it with liquid nitrogen, overclockers were able to push the chip to 7.3GHz. Air-cooling and 1.616 volts squeezed out 5.1GHz without sacrificing any cores. If you’re a mere mortal who’s fresh out of liquid nitrogen (or never had any to begin with), you should be able to comfortably bump CPU performance by roughly 10 percent and GPU speeds by 15 to 17 percent. For the full specs on this particular overclock, hit the source links below.


AMD Trinity APU overclocked at 7.3 GHz, kept cool with liquid nitrogen originally appeared on Engadget on Sat, 06 Oct 2012 09:04:00 EDT.

Via: TG Daily, PC Games Hardware (translated)  |  Source: CPU-Z (7.3GHz), (5.1GHz)

BlueStacks teams with AMD to optimize Android App Player for Fusion, Radeon chips (video)


AMD has a disproportionately large $6.4 million investment in BlueStacks, and now we’re seeing one clear reason why. The two companies have teamed up to create a special version of the BlueStacks App Player that’s tuned for AMD’s Fusion-based processors and Radeon graphics cards, running Android apps with the full help of the chip designer’s hardware in Windows 7 and 8 PCs. Accordingly, over 500,000 Android apps are invading AMD’s new AppZone portal without any needed tweaks of their own, giving the service a much larger catalog than if it had gone with Windows alone. Both companies have a clear incentive in this melding of desktop and mobile: BlueStacks suddenly gets exposure to as many as 100 million AMD-running users, while AMD can tout a giant app catalog that may be preloaded on future PCs using its components. We don’t know if the world needs yet another avenue for playing Angry Birds, especially when many AMD-based PCs won’t have touchscreens, but the BlueStacks partnership could be a strong lure for new PC buyers who’d like an instant software library.



BlueStacks teams with AMD to optimize Android App Player for Fusion, Radeon chips (video) originally appeared on Engadget on Thu, 27 Sep 2012 14:31:00 EDT.

Via: TechCrunch  |  Source: AMD AppZone

These Brand New AMD A-Series Processors Could Power Your Next Desktop [Guts]

AMD just announced the processor that could power your next desktop: the second-generation A-Series processor. It has more cores for more power, integrated AMD Radeon HD 7000-series graphics, and puts a high priority on power efficiency.

NVIDIA to offer up documentation for Tegra graphics core to prove its commitment to open-source (video)


There’s nothing like a little smack talk to light the fire under certain derrieres. It’s been a few months since Linus Torvalds got verbal about NVIDIA’s support for the semi-eponymous OS, prompting the chip-maker to say “supporting Linux is important to us.” Proving that its word is good, NVIDIA will be releasing programming documentation for its Tegra architecture graphics core. The news comes from a talk given by Lucas Stach of the Nouveau project (which develops free drivers for the NVIDIA platform) at the XDC2012 conference. The focus will initially be on Tegra’s 2D rendering engine, but it’s hoped that 3D support will soon follow. So, while Torvalds’ approach might have been a little bit brusque, you can’t fault its effectiveness. Video of the XDC talk after the break.



NVIDIA to offer up documentation for Tegra graphics core to prove its commitment to open-source (video) originally appeared on Engadget on Sun, 23 Sep 2012 13:20:00 EDT.

Via: Hot Hardware  |  Source: Phoronix

Wii U’s slow CPU “a challenge” for one launch developer

These days, we have a better idea of what the Wii U is packing under the hood. While there are some aspects of the Wii U that are clearly better than the Xbox 360 or PS3, the CPU isn’t one of them. We don’t know everything about the Wii U’s CPU just yet (clock speed, for instance, is still a mystery), but we do know that it comes from IBM and features three PowerPC cores.


That underwhelming CPU is giving one Wii U launch developer some trouble. During the Tokyo Game Show, Eurogamer sat down with Warriors Orochi 3 Hyper producer Akihiro Suzuki, who says that the Wii U’s CPU tends to have some issues when there are multiple characters on screen, which is pretty much always the case when playing a Dynasty Warriors game. “One of the weaknesses of the Wii U compared to PS3 and Xbox 360 is the CPU power is a little bit less,” Suzuki said. “So for games in the Warriors series, including Dynasty Warriors and Warriors Orochi, when you have a lot of enemies coming at you at once, the performance tends to be affected because of the CPU.”

Suzuki followed up by saying that dealing with those performance issues can be “a challenge,” but did also point out that as far as sheer graphics power is concerned, the Wii U has the 360 and PS3 beat. Not only does the Wii U feature what is believed to be a custom AMD 7 series GPU, but it’s been confirmed to house 1GB of RAM that is dedicated to games, which is twice the amount the 360 and PS3 can boast. This means games which are more GPU-intensive will shine on Wii U, while those that require some significant CPU power risk falling flat.

It’s important to keep in mind that as time goes on, developers will figure out how to squeeze the most power out of the Wii U’s CPU. All you need to do is look at this generation to see that much is true – compare titles like The Last of Us or Uncharted 3 to games that launched at the beginning of the generation, and you’ll surely notice a sizable boost in overall quality. It seems safe to assume that we can expect a similar progression with games on Wii U, so this is probably just one of those launch hurdles that most developers have to deal with.


Wii U’s slow CPU “a challenge” for one launch developer is written by Eric Abent & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.


NVIDIA GeForce GTX 650 and 660 review roundup: hitting the sweet spot, sometimes


If you’re building or upgrading a budget gaming rig, it’ll be hard to ignore the GeForce GTX 650 and 660. Whether or not NVIDIA’s new chipsets are worth the glance is another matter, and early reviews suggest that a sale depends on just which market you’re in. The GTX 660, by far the darling of the review crowd, competes solidly against the Radeon HD 7850 by outrunning AMD’s hardware in most situations while undercutting on the official price. Only a few have taken a look at the lower-end GTX 650, but it’s not as much of a clear-cut purchasing decision — the entry-level card often slots in between the performance of the Radeon HD 7750 and 7770 without the price edge of its bigger brother. Either card is much better value for the money than the GT 640, however, and looks to be a meaningful upgrade if you’re trading up from equivalent prior-generation gear.

Read – AnandTech (GTX 660)
Read – Benchmark Reviews (GTX 660)
Read – Bit-Tech (GTX 660)
Read – Guru 3D (GTX 650)
Read – HardOCP (GTX 660)
Read – Hot Hardware (GTX 660)
Read – PC Mag (GTX 660)
Read – PC Perspective (GTX 660)
Read – Tom’s Hardware (GTX 650 and 660)


NVIDIA GeForce GTX 650 and 660 review roundup: hitting the sweet spot, sometimes originally appeared on Engadget on Fri, 14 Sep 2012 14:11:00 EDT.


NVIDIA GeForce GTX 650 and GTX 660 push Kepler to sub-$110

NVIDIA has taken the wraps off of its latest Kepler graphics cards, the GeForce GTX 650 and GTX 660, bringing the CUDA-based GPUs to the lowest price so far. Prices are promised at around $109 for the GeForce GTX 650, which offers a 1GHz clock speed and 1GB of GDDR5 memory, and around $229 for the GeForce GTX 660, which doubles the RAM and is the cheapest way to get NVIDIA’s GPU Boost for automatic overclocking.

That’s not to say that the GTX 650 can’t be overclocked, or indeed that it needs to be. Out of the box it can simultaneously drive four monitors for a total resolution of 5760 x 1080 with its 384 CUDA cores, but there’s a 6-pin power connector for those wanting to coax up to around 1.2GHz from the GPU.

Gamers, though, might want to step straight to the GTX 660 for the native GPU Boost. That works with the card’s 960 CUDA cores and 192-bit memory bus (versus the 128-bit of the GTX 650) to drive Full HD monitors at some impressive frame rates compared to its predecessors; check out the benchmarks in the table below:

Connectivity includes a Dual Link DVI-I, Dual Link DVI-D, HDMI, and a DisplayPort on the double-width GTX 660, and a Dual Link DVI-I, a Dual Link DVI-D, and a Mini HDMI on the double-width GTX 650. Expect cards from the usual suspects from today.


NVIDIA GeForce GTX 650 and GTX 660 push Kepler to sub-$110 is written by Chris Davies & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.


NVIDIA Quadro K5000 GPU for Mac offers significant Premiere Pro performance boost, we go hands-on


NVIDIA just announced that its new Quadro K5000 GPU will be available on Mac Pros, offering 4K display compatibility and support for up to four displays, not to mention 4GB of graphics memory and about 2x faster performance than the Fermi-based Quadro 4000. While the Kepler-powered chip won’t actually hit Apple systems till later this year, we got a first look at the K5000 on a Mac here at IBC. NVIDIA demoed Adobe After Effects and Premiere Pro CS6 on a Mac Pro with dual K5000 GPUs.

As you’ll see in the video below, with 11 streams of 1080p video at 30 fps in Premiere Pro (and one overlay of the NVIDIA logo), GPU acceleration handles the workload seamlessly, letting us add effects in real time without any processing delay. Switching to software rendering mode in the editing program shows a night-and-day difference: video playback is extremely choppy, and processing moves at a crawl. Even with two K5000 chips in this desktop, Premiere Pro utilizes just one, but After Effects takes advantage of both GPUs. In this program, NVIDIA showed us ray-tracing, a computationally intensive 3D imaging feature, which only became available in After Effects with the release of CS6. Like in Premiere Pro, the program runs smoothly enough to let us edit images in real time. Take a look for yourself by heading past the break.



NVIDIA Quadro K5000 GPU for Mac offers significant Premiere Pro performance boost, we go hands-on originally appeared on Engadget on Fri, 07 Sep 2012 06:39:00 EDT.
