Unreal Engine 3 adds extra dimension with NVIDIA 3D Vision

Epic Games has announced that its wildly popular Unreal Engine 3 has now added NVIDIA’s 3D Vision to its list of supported technologies. We’ve already come across Batman: Arkham Asylum being played with NVIDIA’s signature shutter glasses, so this isn’t a huge surprise per se, but it does put a stamp of compatibility on the vast catalog of games — both current and future — built upon Epic’s graphics engine. Those include Borderlands, Mass Effect 1 and 2, BioShock 1 and 2, and that all-time classic 50 Cent: Blood on the Sand. The Unreal Development Kit — a freeware version of the engine for non-commercial use — is also being upgraded to make the addition of stereoscopic 3D effects “easier than ever,” while other small improvements (covered by GameSpot) show that the Epic crew isn’t standing still on its core product. Good news for all you mobile mavens wanting a taste of Unreality on your iPhones or Pres.

Unreal Engine 3 adds extra dimension with NVIDIA 3D Vision originally appeared on Engadget on Fri, 12 Mar 2010 08:17:00 EST. Please see our terms for use of feeds.

Permalink  |  Source: NVIDIA

OpenGL 4.0 arrives, brings more opportunities for general purpose GPU action

What’s a Game Developers Conference without some sweet new tools for developers to sink their teeth into? Khronos Group, the association behind OpenGL, has today announced the fourth generation of its cross-platform API spec, which takes up the mantle of offering a viable competitor to Microsoft’s DirectX 11. The latest release includes two new shader stages for offloading geometry tessellation from the CPU to the GPU, as well as tighter integration with OpenCL to let the graphics card take yet more duties off the typically overworked processor — both useful additions in light of NVIDIA’s newfound love affair with tessellation and supposed leaning toward general purpose GPU design in the Fermi chips coming this month. Even if you don’t care that much about desktop gaming, OpenGL ES (Embedded Systems, a mobile offshoot of OpenGL) is the graphics standard on “virtually every shipping smart phone,” meaning that whatever ripples start on the desktop front will be landing as waves on your next superphone. If that holds true, we can look forward to more involvement from our graphics chips beyond their usual 3D duties and into spheres we tend to care about — such as video acceleration. Now you care, don’t ya?
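For the curious, “tessellation” here means subdividing coarse geometry patches into finer triangles on the fly, so the CPU can ship a low-polygon mesh and let the GPU amplify it. The new shader stages do this in hardware; as a purely illustrative sketch (real tessellation control and evaluation shaders are written in GLSL, not Python), one level of midpoint subdivision looks like this:

```python
# One level of the kind of subdivision a tessellation stage performs:
# split a triangle into four by its edge midpoints. On OpenGL 4.0 hardware
# this runs on the GPU; this Python version just illustrates the geometry.
def midpoint(a, b):
    return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

def tessellate(tri):
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

# The CPU only ever sees this single coarse triangle...
coarse = [((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))]
# ...while the "GPU" amplifies it into four finer ones.
fine = [t for tri in coarse for t in tessellate(tri)]
assert len(fine) == 4
```

Apply it repeatedly and triangle counts grow fourfold per pass, which is exactly the kind of work you want off the CPU and its bus.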

OpenGL 4.0 arrives, brings more opportunities for general purpose GPU action originally appeared on Engadget on Thu, 11 Mar 2010 11:06:00 EST.

Permalink (via Tech Crunch)  |  Source: Virtual Press Office

You Will Have the Power of a PS3 In Your Pocket In 3 Years [PowerVR]

I spoke to Imagination Technologies—maker of the PowerVR chips that power smartphones like the iPhone, Droid and many others—and they said, definitively, that you’ll have graphics comparable to the PlayStation 3 in three years.

They know this because these are the chips they’re designing right now. The way the development process works for phones is that Imagination designs a chip, licenses it, and it works its way through development cycles at companies like Apple or HTC, which then incorporate it into their phones, which they in turn have to productize and bring to market. The whole thing takes three years. But in three years, says Imagination, you’re going to have a PS3 in your pocket. And that’s not just running at the 480×320 resolution that most phones have now; that’s PS3-esque graphics at 720p, output via HDMI to a TV. Hell, some phones in three years will have a native 720p display.

But there are going to be some interesting things between now and then. Imagination is still working on support for the products out now—the chips in the iPhones and the Droids and the Nokias that use PowerVR. The two most interesting things are Flash acceleration in hardware and OpenCL support, which enables GPGPU computing.

The first is obvious. Via a software update, phones already on the market can gain hardware Flash acceleration. Imagination has been working with Adobe for about three years now, and they’ve gotten playback up to about 300% of the speed of software-only rendering. They think they can do even better. Even so, a 3x boost is pretty damn good for just pushing what you can do with your current phone.

Secondly, there’s OpenCL support, which allows devices to utilize the GPU—the graphics chip—to help out with general-purpose computing. For a more in-depth look at what this means, check out our feature on GPGPUs, but in essence it’s going to allow data-parallel tasks to be executed faster than they would be otherwise.
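OpenCL itself is a C-like API in which you write small “kernels” that run once per data element across the GPU’s many cores. As a rough CPU-side analogy (this is not OpenCL, just the same data-parallel pattern expressed with Python’s standard library):

```python
from concurrent.futures import ThreadPoolExecutor

def kernel(pixel):
    # Per-element "kernel": brighten a pixel value and clamp to 8-bit range.
    return min(int(pixel * 1.5), 255)

pixels = [10, 100, 200, 250]

# Serial version: one element at a time, like a plain CPU loop.
serial = [kernel(p) for p in pixels]

# Parallel version: the runtime farms elements out to workers, the way
# OpenCL farms work-items out to GPU cores.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(kernel, pixels))

assert parallel == serial  # same result, computed concurrently
```

On a GPU the win comes from thousands of such work-items running at once; Python threads won’t show that speedup, which is why this is only a sketch of the pattern rather than of the performance.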

I also asked Imagination what’s going to be different about the chips that will hit the market one, two and three years from now, and they say one of the big pushes is going to be multiprocessing: multiple GPU cores on a single chip. Theoretically you can fit about three or four into a phone without going too crazy on power demands, which will help pull off that PS3 equivalency we talked about earlier.

Keep in mind that this stuff is what’s “possible” in three years, based on what hardware is going to be available in the phones released then. A lot of this still depends on phone makers like Apple, HTC, Palm and Motorola actually making these features available. But since most of the major phone manufacturers are going to be using essentially the same chip, it’s in everyone’s self-interest to push as much power out of their phones as possible.

But if you’re looking forward to what’s coming one year from now, check out the screenshots in the post, taken from the demos they had running on sample hardware.

NVIDIA pulls 196.75 driver amid reports it’s frying graphics cards

One of the discussions that arises anytime we bring up a new graphics card from ATI or NVIDIA is which company has the better drivers. Well, this should help sway the argument a little bit. It would seem StarCraft II beta players were among the first to notice low frame rates while using the latest drivers from NVIDIA, and further digging has uncovered that the automated fan-control portion of said software was failing to act as intended. The result? Overheated chips, diminished performance, and in some extreme cases, death (of the GPU; we think the users will be okay). The upshot is that you should avoid the 196.75 release like the plague, and NVIDIA has temporarily yanked the update while it investigates the reported issues. Shame that the company hasn’t put any warnings up on its site telling those who’ve installed the update but haven’t yet nuked their graphics card to roll back their drivers, but that’s what you’ve got us for, right?

[Thanks, Shockie]

NVIDIA pulls 196.75 driver amid reports it’s frying graphics cards originally appeared on Engadget on Fri, 05 Mar 2010 04:34:00 EST.

Permalink  |  Source: IncGamers

HP EliteBook 8740w specs begin to take shape, ATI FirePro M7820 revealed

You’ve been wondering, we know, why the newly unveiled 2540p and 2740p EliteBooks from HP weren’t accompanied by their heavyweight compadre, the 8740w, but as it turns out the latter might be taking a bit longer to launch due to its inclusion of ATI’s as-yet-unannounced FirePro M7820 GPU. Joining up with the earlier leaked M5800, this is likely to form the backbone of ATI’s pro graphics refresh, with its innards based on the successful HD 5870, meaning it offers DirectX 11 functionality, 1GB of GDDR5 memory, and probably the most graphical horsepower your lap has ever seen. This is aided by the low-voltage but highly potent Core i7-720QM CPU and four DDR3 slots for up to 16GB of RAM on the 8740w. You have until the end of the month to figure out what to do with all that power, which is when the rumormongers expect this machine to be announced.

[Thanks, Reznov]

HP EliteBook 8740w specs begin to take shape, ATI FirePro M7820 revealed originally appeared on Engadget on Thu, 04 Mar 2010 08:26:00 EST.

Permalink  |  Source: HP Fansite (1), (2)

NVIDIA’s Optimus technology shows its graphics switching adroitness on video

Explaining automatic graphics switching and the benefits thereof can be a somewhat dry affair. You have to tell people about usability improvements and battery life savings and whatnot… it’s much more fun if you just take a nice big engineering board, strap the discrete GPU on its own card and insert an LED light for the viewer to follow. NVIDIA has done just that with its Optimus technology — coming to a laptop or Ion 2-equipped netbook near you — and topped it off by actually pulling out the GPU card when it wasn’t active, then reinserting it and carrying on with its use as if nothing had happened. This was done to illustrate the fact that Optimus shuts down the GPU electrically, which is that little bit more energy efficient than dropping it into an idle state. Shimmy past the break to see the video.

NVIDIA’s Optimus technology shows its graphics switching adroitness on video originally appeared on Engadget on Wed, 03 Mar 2010 07:24:00 EST.

Permalink  |  Source: NVIDIA

ATI’s six-screen Eyefinity madness reviewed, fatal flaw found

Along with its introduction of the HD 5830, ATI announced the HD 5870 Eyefinity 6 card yesterday, which predictably comes with six DisplayPort outputs and enables that hallowed six-screen gaming overload that the Eyefinity branding has been about since the beginning. Some lucky scribes over at PC Pro have been treated to a live demonstration of what gaming at 5,760 x 2,160 feels like, and their understated response was to describe it as “far more immersive.” No kidding. They did raise the specter of those monitor bezels, however, pointing out that bezel correction — where the image “behind the bezel” is rendered but hidden, making the overall display look like a window unto the game world — habitually obscured text and game HUD elements. In their view, the sweet spot remains a triple-screen setup, and we’re inclined to agree (particularly if they look like this). For those interested in getting their multi-monitor gaming up and running, we’ve linked an invaluable guide from HardOCP below, which breaks down how much you can expect from ATI’s current HD 5000 series of cards, and also provides a video guide to setting your rig up.
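To put numbers on that 5,760 x 2,160 figure and on what bezel correction actually costs: six 1920 x 1080 panels in a 3 x 2 grid, with a hypothetical bezel thickness thrown in (the per-edge pixel value below is an assumption for illustration, not a figure from the review):

```python
# Six 1920x1080 panels in a 3x2 grid give the quoted 5,760 x 2,160 surface.
cols, rows = 3, 2
panel_w, panel_h = 1920, 1080
total_w, total_h = cols * panel_w, rows * panel_h

# Bezel correction: render extra columns/rows "behind" each inner seam so
# straight lines stay straight across screens, then hide those pixels.
# Each seam is two adjacent bezels thick.
bezel_px = 60                       # assumed per-edge bezel thickness, in pixels
vertical_seams = cols - 1           # 2 seams running top to bottom
horizontal_seams = rows - 1         # 1 seam running left to right
corrected_w = total_w + vertical_seams * 2 * bezel_px
corrected_h = total_h + horizontal_seams * 2 * bezel_px

# Pixels rendered every frame but never shown to the player.
hidden = corrected_w * corrected_h - total_w * total_h
```

Under that assumed bezel width the card renders a 6,000 x 2,280 frame, throwing away over a million pixels per frame, and anything drawn in those hidden strips (like HUD text pinned near a seam) is exactly what PC Pro saw disappear.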

ATI’s six-screen Eyefinity madness reviewed, fatal flaw found originally appeared on Engadget on Fri, 26 Feb 2010 04:17:00 EST.

Permalink  |  Source: PC Pro

NVIDIA GeForce GT 340 highlights introduction of 300-series cards, none are powerful enough to matter

Is there a tribunal where you can bring up marketing teams for crimes against common sense? NVIDIA’s epic rebranding exercise knows no bounds, as the company has now snuck out its very first desktop 300-series cards, but instead of the world-altering performance parts we’ve always associated with the jump into the 300s, we’re getting what are essentially GT 2xx cards in new garb. The GT 340 sports the same 96 CUDA cores, 550MHz graphics and 1,340MHz processor clock speeds as the GT 240 — its spec sheet is literally identical to that of the GT 240 variant with 1,700MHz memory clocks. To be fair to the company, these DirectX 10.1 parts are exclusively for OEMs, so (hopefully) nobody there will be confused into thinking a GT 320 is better than a GTX 295, but we’d still prefer a more lucid nomenclature… and Fermi graphics cards, we’d totally like some of those too.

NVIDIA GeForce GT 340 highlights introduction of 300-series cards, none are powerful enough to matter originally appeared on Engadget on Wed, 24 Feb 2010 05:05:00 EST.

Permalink (via Electronista, Fudzilla)  |  Source: NVIDIA

AnandTech goes behind the scenes of ATI’s RV870 / Evergreen GPU development

Anyone familiar with the constantly shifting release dates and delays that characterize GPU refresh cycles will have been impressed by ATI’s execution of the Evergreen series release. Starting out at the top with its uber-performance parts, the company kept to an aggressive schedule over the winter and can now boast a fully fleshed out family of DirectX 11 graphics processors built on a 40nm process. The fact that NVIDIA has yet to give us even one DX11 product is testament to the magnitude of this feat. But as dedicated geeks we want more than just the achievements, we want to know the ins and outs of ATI’s resurgence and the decisions that led to its present position of being the market leader in features and mindshare, if not sales. To sate that curiosity, we have our good friend Anand Shimpi with a frankly unmissable retrospective on the development of the RV870 GPU that was to become the Evergreen chips we know today. He delves into the internal planning changes that took place after the delay of the R5xx series, the balancing of marketing and engineering ambitions, and even a bit of info on features that didn’t quite make it into the HD 5xxx range. Hit the source link for all that precious knowledge.

AnandTech goes behind the scenes of ATI’s RV870 / Evergreen GPU development originally appeared on Engadget on Mon, 22 Feb 2010 05:18:00 EST.

Permalink  |  Source: AnandTech

ATI Radeon HD 5450 focuses on multimedia features, neglects gaming

It’s rare to see a rumor — hell, even a roadmap — pinpoint the timing of new releases quite so accurately, but our earlier report of ATI refreshing the middle and lower parts of its lineup turned out to be bang on. Following in the footsteps of the HD 5670, we have the Radeon HD 5450, which drags the entry price for DirectX 11 and Eyefinity multi-monitor support all the way down to $50. Course, the processing power inside isn’t going to be on par with its elder siblings, but that also means the 5450 will run cool enough to be offered with half-height, passive cooling solutions as seen above. ATI’s focus here is on media PCs, with a DisplayPort, um… port, alongside HDMI 1.3a, Dolby TrueHD and DTS-HD Master Audio bitstreaming support. For the money, you really can’t argue with all this extra multimedia juice, but if you must have benchmarks to sate your soul, check out the early reviews below — they’re full of bar charts and performance comparisons, don’t you know.

ATI Radeon HD 5450 focuses on multimedia features, neglects gaming originally appeared on Engadget on Thu, 04 Feb 2010 02:56:00 EST.

Permalink  |  Source: Business Wire