OpenGL 4.0 arrives, brings more opportunities for general purpose GPU action

What’s a Game Developers Conference without some sweet new tools for developers to sink their teeth into? Khronos Group, the consortium behind OpenGL, has today announced the fourth generation of its cross-platform API spec, which takes up the mantle of offering a viable competitor to Microsoft’s DirectX 11. The latest release includes two new shader stages for offloading geometry tessellation from the CPU to the GPU, as well as tighter integration with OpenCL to let the graphics card take yet more duties off the typically overworked processor — both useful additions in light of NVIDIA’s newfound love affair with tessellation and its apparent leaning toward general purpose GPU design in the Fermi chips coming this month. And if you don’t care that much about desktop gaming, consider that OpenGL ES (for Embedded Systems, a mobile offshoot of OpenGL) is the graphics standard on “virtually every shipping smart phone,” meaning that whatever ripples start on the desktop front will be landing as waves on your next superphone. If that holds true, we can look forward to more involvement from our graphics chips beyond their usual 3D duties and into spheres we tend to care about — such as video acceleration. Now you care, don’t ya?

OpenGL 4.0 arrives, brings more opportunities for general purpose GPU action originally appeared on Engadget on Thu, 11 Mar 2010 11:06:00 EST. Please see our terms for use of feeds.

Via: Tech Crunch | Source: Virtual Press Office

HTML5 vs. Flash comparison finds a few surprises, settles few debates

Think we’d all be better off if HTML5 could somehow instantly replace Flash overnight? Not necessarily, according to a set of comparisons from Jan Ozer of the Streaming Learning Center website, which found that while HTML5 did come out ahead in many respects, it wasn’t exactly a clear winner. The tests weren’t completely scientific, but they did find that HTML5 clearly performed better than Flash 10 or 10.1 in Safari on a Mac, although the differences were less clear-cut in Google Chrome or Firefox. On the other hand, Flash more than held its own on Windows, and Flash Player 10.1 was actually 58% more efficient than HTML5 in Google Chrome on the Windows system tested. As you may have deduced, one of the big factors accounting for that discrepancy is that Flash is able to take advantage of GPU hardware acceleration on Windows, while Adobe is effectively cut out of the loop on the Mac — something it has complained about quite publicly. According to Ozer, the differences between HTML5 and Flash playback on a Mac could be virtually eliminated if Flash could make use of GPU acceleration. Hit up the link below for all the numbers.

Update: Mike Chambers has performed some additional tests that he says show that Flash “does not perform consistently worse on Mac than on Windows.” Check out the complete results here.

HTML5 vs. Flash comparison finds a few surprises, settles few debates originally appeared on Engadget on Wed, 10 Mar 2010 23:36:00 EST.

Via: ReadWriteWeb | Source: Streaming Learning Center

NVIDIA GTX 480 makes benchmarking debut, matches ATI HD 5870 performance (video)

We’re still not happy with NVIDIA’s failure to publish anything on its site alerting users about the doom that may befall them if they switched to the 196.75 drivers, but the company’s making an effort to get back into our good books with the first official video of its forthcoming GeForce GTX 480 and even a benchmark run against ATI’s flagship single-GPU card, the HD 5870. It looks like you’ll need to jack in a pair of auxiliary power connectors — one 8-pin and one 6-pin — to power the first Fermi card, as well as leave plenty of clearance in your case to accommodate its full length (stop giggling!). NVIDIA’s benchmarking stressed the GTX 480’s superior tessellation performance over the HD 5870, but it was level pegging between the two cards during the more conventional moments. It’s all well and good being able to handle extreme amounts of tessellation, but it’ll only matter to the end user if game designers use it as extensively as this benchmark did. As ever, wait for the real benchmarks (i.e. games) before deciding who wins, but we’re slightly disappointed that NVIDIA’s latest and greatest didn’t just blow ATI’s six-month-old card right out of the water. The benchmark results await after the break, along with video of the new graphics card and a quick look at NVIDIA’s 3D Vision Surround setup. Go fill your eyes.


NVIDIA GTX 480 makes benchmarking debut, matches ATI HD 5870 performance (video) originally appeared on Engadget on Sat, 06 Mar 2010 07:34:00 EST.

Source: YouTube

NVIDIA pulls 196.75 driver amid reports it’s frying graphics cards

One of the discussions that arise anytime we bring up a new graphics card from ATI or NVIDIA is about which company has the better drivers. Well, this should help sway the argument a little bit. It would seem StarCraft II Beta players were among the first to notice low frame rates while using the latest drivers from NVIDIA, and further digging has uncovered that the automated fan-control portion of said software was failing to act as intended. The result? Overheated chips, diminished performance, and in some extreme cases, death (of the GPU — we think the users will be okay). The upshot is that you should avoid the 196.75 iteration like the plague, and NVIDIA has temporarily yanked the update while investigating the reported issues. Shame that the company hasn’t got any warnings up on its site telling those who’ve installed the update but haven’t yet nuked their graphics card to roll back their drivers, but that’s what you’ve got us for, right?

[Thanks, Shockie]

NVIDIA pulls 196.75 driver amid reports it’s frying graphics cards originally appeared on Engadget on Fri, 05 Mar 2010 04:34:00 EST.

Source: IncGamers

HP EliteBook 8740w specs begin to take shape, ATI FirePro M7820 revealed

You’ve been wondering, we know, why the newly unveiled 2540p and 2740p EliteBooks from HP weren’t accompanied by their heavyweight compadre, the 8740w, but as it turns out, the latter might be taking a bit longer to launch due to its inclusion of ATI’s as-yet-unannounced FirePro M7820 GPU. Joining the earlier leaked M5800, this part is likely to form the backbone of ATI’s pro graphics refresh; with innards based on the successful HD 5870, it offers DirectX 11 functionality, 1GB of GDDR5 memory, and probably the most graphical horsepower your lap has ever seen. That’s aided by the low-voltage but highly potent Core i7-720QM CPU and four DDR3 slots for up to 16GB of RAM in the 8740w. You have until the end of the month to figure out what to do with all that power, which is when the rumormongers expect this machine to be announced.

[Thanks, Reznov]

HP EliteBook 8740w specs begin to take shape, ATI FirePro M7820 revealed originally appeared on Engadget on Thu, 04 Mar 2010 08:26:00 EST.

Source: HP Fansite (1), (2)

NVIDIA’s Optimus technology shows its graphics switching adroitness on video

Explaining automatic graphics switching and the benefits thereof can be a somewhat dry affair. You have to tell people about usability improvements and battery life savings and whatnot… it’s much more fun if you just take a nice big engineering board, strap the discrete GPU onto its own card, and insert an LED light for the viewer to follow. NVIDIA has done just that with its Optimus technology — coming to a laptop or Ion 2-equipped netbook near you — and topped it off by actually pulling out the GPU card while it wasn’t active, then reinserting it and carrying on as if nothing had happened. The demonstration illustrates that Optimus shuts the GPU down electrically, which is that little bit more energy-efficient than dropping it into an idle state. Shimmy past the break to see the video.


NVIDIA’s Optimus technology shows its graphics switching adroitness on video originally appeared on Engadget on Wed, 03 Mar 2010 07:24:00 EST.

Source: NVIDIA

ATI’s six-screen Eyefinity madness reviewed, fatal flaw found

Along with its introduction of the HD 5830, ATI announced the HD 5870 Eyefinity 6 card yesterday, which predictably comes with six DisplayPort outputs and enables that hallowed six-screen gaming overload that the Eyefinity branding has been about since the beginning. Some lucky scribes over at PC Pro have been treated to a live demonstration of what gaming at 5,760 x 2,160 feels like, and their understated response was to describe it as “far more immersive.” No kidding. They did raise the specter of those monitor bezels, however, pointing out that bezel correction — where the image “behind the bezel” is rendered but hidden, making the overall display look like a window unto the game world — habitually obscured text and game HUD elements. In their view, the sweet spot remains a triple-screen setup, and we’re inclined to agree (particularly if they look like this). For those interested in getting multi-monitor gaming up and running, we’ve linked an invaluable guide from HardOCP below, which breaks down what you can expect from ATI’s current HD 5000 series of cards and also provides a video guide to setting your rig up.
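To make the bezel-correction tradeoff concrete, here’s a rough back-of-the-envelope sketch (the helper function and the 80-pixel seam width are our own illustrative assumptions, not figures from PC Pro’s test rig) of how many rendered pixels end up invisible behind the bezels of a 3 x 2 Eyefinity group:

```python
# Bezel correction sketch: the GPU renders one large virtual surface, and
# the rows/columns "behind" each bezel are drawn but never displayed.
# Panel resolution and per-seam width below are illustrative assumptions.

def bezel_corrected_surface(cols, rows, panel_w, panel_h, bezel_px):
    """Return (total_w, total_h, hidden_pixels) for a bezel-corrected
    multi-monitor surface, where bezel_px is the width in rendered
    pixels of each seam between adjacent panels."""
    total_w = cols * panel_w + (cols - 1) * bezel_px
    total_h = rows * panel_h + (rows - 1) * bezel_px
    visible = cols * rows * panel_w * panel_h
    hidden = total_w * total_h - visible
    return total_w, total_h, hidden

# With no correction (bezel_px = 0), six 1,920 x 1,080 panels in a
# 3 x 2 grid give exactly the 5,760 x 2,160 surface PC Pro played at:
print(bezel_corrected_surface(3, 2, 1920, 1080, 0))

# With a hypothetical 80-pixel seam, the rendered surface grows and
# anything landing in the hidden pixels (HUD text near a seam, say)
# simply disappears from view:
w, h, hidden = bezel_corrected_surface(3, 2, 1920, 1080, 80)
print(w, h, hidden)
```

The point of the sketch is that bezel correction trades geometric continuity for coverage: the game world lines up across the seams, but any UI element drawn into the hidden band is lost, which is exactly the complaint PC Pro raised.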

ATI’s six-screen Eyefinity madness reviewed, fatal flaw found originally appeared on Engadget on Fri, 26 Feb 2010 04:17:00 EST.

Source: PC Pro

NVIDIA GeForce GT 340 highlights introduction of 300-series cards, none are powerful enough to matter

Is there a tribunal where you can bring up marketing teams for crimes against common sense? NVIDIA’s epic rebranding exercise knows no bounds: the company has now snuck out its very first desktop 300-series cards, but instead of the world-altering performance parts we’ve always associated with the jump into the 300s, we’re getting what are essentially GT 2xx cards in new garb. The GT 340 sports the same 96 CUDA cores, 550MHz graphics clock, and 1,340MHz processor clock as the GT 240 — its spec sheet is literally identical to that of the GT 240 variant with 1,700MHz memory. To be fair to the company, these DirectX 10.1 parts are exclusively for OEMs, so (hopefully) nobody there will be confused into thinking a GT 320 is better than a GTX 295, but we’d still prefer a more lucid nomenclature… and Fermi graphics cards, we’d totally like some of those too.

NVIDIA GeForce GT 340 highlights introduction of 300-series cards, none are powerful enough to matter originally appeared on Engadget on Wed, 24 Feb 2010 05:05:00 EST.

Via: Electronista, Fudzilla | Source: NVIDIA

AnandTech goes behind the scenes of ATI’s RV870 / Evergreen GPU development

Anyone familiar with the constantly shifting release dates and delays that characterize GPU refresh cycles will have been impressed by ATI’s execution of the Evergreen series release. Starting out at the top with its uber-performance parts, the company kept to an aggressive schedule over the winter and can now boast a fully fleshed-out family of DirectX 11 graphics processors built on a 40nm process. The fact that NVIDIA has yet to give us even one DX11 product is testament to the magnitude of this feat. But as dedicated geeks we want more than just the achievements; we want to know the ins and outs of ATI’s resurgence and the decisions that led to its present position as the market leader in features and mindshare, if not sales. To sate that curiosity, we have our good friend Anand Shimpi with a frankly unmissable retrospective on the development of the RV870 GPU that was to become the Evergreen chips we know today. He delves into the internal planning changes that took place after the delay of the R5xx series, the balancing of marketing and engineering ambitions, and even a bit of info on features that didn’t quite make it into the HD 5xxx range. Hit the source link for all that precious knowledge.

AnandTech goes behind the scenes of ATI’s RV870 / Evergreen GPU development originally appeared on Engadget on Mon, 22 Feb 2010 05:18:00 EST.

Source: AnandTech

ATI Radeon HD 5450 focuses on multimedia features, neglects gaming

It’s rare to see a rumor — hell, even a roadmap — pinpoint the timing of new releases quite so accurately, but our earlier report of ATI refreshing the middle and lower parts of its lineup turned out to be bang on. Following in the footsteps of the HD 5670, we have the Radeon HD 5450, which drags the entry price for DirectX 11 and Eyefinity multi-monitor support all the way down to $50. ’Course, the processing power inside isn’t going to be on par with that of its elder siblings, but that also means the 5450 runs cool enough to be offered with half-height, passive cooling solutions as seen above. ATI’s focus here is on media PCs, with a DisplayPort, um… port, alongside HDMI 1.3a, Dolby TrueHD, and DTS-HD Master Audio bitstreaming support. For the money, you really can’t argue with all this extra multimedia juice, but if you must have benchmarks to sate your soul, check out the early reviews below — they’re full of bar charts and performance comparisons, don’t you know.


ATI Radeon HD 5450 focuses on multimedia features, neglects gaming originally appeared on Engadget on Thu, 04 Feb 2010 02:56:00 EST.

Source: Business Wire