BAPCo calls ‘liar, liar’ on AMD, Intel still its golden prince

Benchmarks can be a bit of a back and forth schoolyard screaming match — there’s plenty of yelling, but not always much brute force to back it up — so let’s take this case of ‘he said / she said’ with an even coarser grain of salt. BAPCo, a non-profit whose members include major tech industry heavyweights, slapped back at AMD today for publicly dissing the SYSmark 2012 benchmark it had an 80 percent hand in creating and for claiming the group forced it out of the club. The chip maker had similar beef back in 2007 over Intel’s benchmark-friendlier chips, and this appears to be the final straw that broke its GPU’s back. On Monday, VIA and NVIDIA also joined the ranks of the recently defected, but refrained from any superfluous PR finger-wagging. Wherever the truth may lie, someone’s surely got a case of the green-eyed monster, and it’s definitely not us. We’re looking at you, AMD.

[Thanks, Muhammad; image courtesy BAPCo]

BAPCo calls ‘liar, liar’ on AMD, Intel still its golden prince originally appeared on Engadget on Wed, 22 Jun 2011 15:29:00 EDT. Please see our terms for use of feeds.

Via: Maximum PC | Source: Semi Accurate

Microsoft decides to pass on WebGL over security concerns (Update: iOS 5 supports WebGL, sort of)

[Image: WebGL Attack]

Well, it looks like Microsoft is taking those warnings about WebGL pretty seriously. The company has decided not to support the web-based 3D standard because it wouldn’t be able to pass security muster. Highest on the list of concerns is that WebGL opens up a direct line from the internet to a system’s GPU. To make matters worse, holes and bugs may crop up that are platform or video card specific, turning attempts to plug holes in its defense into a game of whack-a-mole — with many players of varying reliability. Lastly, Microsoft, like security firm Context, has found current solutions for protecting against DoS attacks rather unsatisfying. Lack of support in Internet Explorer won’t necessarily kill WebGL and, as it matures, Microsoft may change its tune — but it’s still a pretty big blow for all of us hoping the next edition of Crysis would be browser-based.

Update: As is usually the case, Apple and the Windows folks are on opposite sides of this one. In fact, the Cupertino crew plans to bring WebGL to iOS 5 with one very strange restriction — it will only be available to iAd developers. Now, chances are it will eventually be opened up in mobile Safari for everyone, but for the moment it seems browser-based 3D graphics will be limited to advertisements on the iPhone. Still, that’s another big name throwing its support behind the burgeoning standard.

[Thanks, Greg]

Microsoft decides to pass on WebGL over security concerns (Update: iOS 5 supports WebGL, sort of) originally appeared on Engadget on Fri, 17 Jun 2011 01:58:00 EDT.

Via: WinRumors, The Register | Source: Microsoft, WebGL Mailing List

MSI’s Afterburner Android app makes GPU overclocking as easy as Facebooking

Back in our day, overclocking one’s PC was akin to a fine art. It took skill. Precision. Effort. Cojones. These days, it’s just about as simple as blinking. Or winking. Or winking while blinking. MSI’s made the simplification of PC overclocking quite the priority over the past few years, with OC Genie and an updated Wind BIOS from last decade putting all sorts of power into the hands of mere mortals. At Computex this week, the outfit took things one step further with the Afterburner Android app. Purportedly, the GPU tool enables users to monitor the temperature, voltage and fan speed of their graphics card via a WiFi connection, and if you’re feeling froggy, you can overclock and overvolt to your heart’s content. Details beyond that are few and far betwixt, but we’re hearing that it’ll soon work with GPUs from other vendors, and that an iOS variant is en route.


MSI’s Afterburner Android app makes GPU overclocking as easy as Facebooking originally appeared on Engadget on Mon, 06 Jun 2011 06:37:00 EDT.

Via: Far East Gizmos | Source: MSI

LucidLogix brings GPU virtualization to AMD notebooks, all-in-ones, keeps sharing the graphics love


Late last year, LucidLogix introduced us to Virtu, the GPU virtualization software that makes disparate GPUs play nice on Sandy Bridge PCs, and now it’s extending the love to AMD Bulldozer and Brazos machines. The latest version of the software, dubbed Virtu Universal, also extends GPU virtualization to all-in-ones and notebooks (on both AMD and Intel), enabling simple switching between discrete graphics and the integrated ilk. What’s more, the program ushers in the debut of Virtual Vsync, which claims to bring “maximum gaming frame rates and responsiveness, while eliminating distracting and image-distorting visual tearing.” Of course, we’ll believe it when we see it, which, if LucidLogix has its way, should be before the ball drops in Times Square. Full PR after the break.


LucidLogix brings GPU virtualization to AMD notebooks, all-in-ones, keeps sharing the graphics love originally appeared on Engadget on Wed, 01 Jun 2011 21:38:00 EDT.

Source: LucidLogix

ASUS Mars II and Matrix GTX580 Platinum eyes-on

If you thought the original Mars graphics card from ASUS was a little bit ridiculous, get ready to see what a lot of ridiculous looks like. The company’s Mars II, recently teased alongside a fresh new Matrix GTX580 Platinum card, squeezes two GeForce GTX 580 chips onto the same board and overclocks them for good measure. In order to achieve such great feats, the card requires no less than three 8-pin auxiliary power connectors and takes up the space of three (2.6, to be precise) PCI slots with its ginormous dual-fan cooler. Heatpipes are also employed to keep the raging fires within in check, and — for situations where all else fails — ASUS has installed a special red button that sends the fan into full speed when pressed. ASUS hasn’t yet finalized how far above the default engine clock speeds the Mars II will reach, but it has a bit of time to figure that out as this extremely limited edition card is coming sometime in July. Buyers in the US, Europe and Asia-Pacific region will have to be quick on their credit card trigger, as only 1,000 Mars IIs will ever be produced. Oh, and if you’re wondering how much power a dual-GTX 580 graphics card might consume, the answer is 600W. All by itself.

Also making its debut at Computex this week is ASUS’ latest offering for the truly overclock-mad PC gamer: the Matrix GTX580 Platinum. Frankly, it feels barren by comparison to its Martian sibling, coming with just one GTX 580 graphics processor, albeit an overclocked one, and the requirement for only two 8-pin connectors for added power. ASUS has thrown in a pair of physical “plus” and “minus” buttons, which permit voltage alterations on the fly, added the same fan override key as on the Mars II, and included a Safe Mode switch at the back in the event that you get carried away with your tweaking. Mashing that last button will reset all clock speeds, voltages and other settings to their default values, which should hopefully let you boot back up and try again. A final note of merit goes to the LED-infused Matrix logo atop the GTX580 Platinum. It’s not there just for decorative purposes; its color changes in response to the load the GPU is under, so that blue and green will tell you there are no worries and orange and red will indicate you’re cranking it close to its limits. The GTX580 Platinum should start selling worldwide next week, though pricing has yet to be announced. Check it out in closer detail in the gallery below.
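Conceptually, that load-reactive logo is just a threshold map from GPU utilization to a color. Here’s a hypothetical sketch in Python — the cut-off values are our own guesses for illustration, not ASUS’s actual bands:

```python
def logo_color(gpu_load_pct):
    """Map GPU load (0-100) to an indicator color, Matrix-logo style.

    The thresholds below are illustrative guesses, not ASUS's values.
    """
    if gpu_load_pct < 25:
        return "blue"    # idling, no worries
    elif gpu_load_pct < 50:
        return "green"   # light load, still no worries
    elif gpu_load_pct < 80:
        return "orange"  # working hard
    else:
        return "red"     # cranking close to its limits

# Sweep a few load levels through the mapping
colors = [logo_color(load) for load in (10, 40, 70, 95)]
```

The real card presumably does this in firmware by sampling utilization and driving RGB LEDs, but the idea is the same: a piecewise mapping from one telemetry value to a visible state.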

ASUS Mars II and Matrix GTX580 Platinum eyes-on originally appeared on Engadget on Wed, 01 Jun 2011 13:35:00 EDT.


NVIDIA’s quad-core Kal-El used to demo next-gen mobile graphics, blow minds (video)

You might think yourself too grown-up to be wowed by shiny, glittery things, but we doubt many will be able to watch NVIDIA’s new Glow Ball tech demo without a smidgen of childlike glee. Built to run on the company’s quad-core Kal-El processor, it shows us the first example of true dynamic lighting on mobile devices and also throws in some impressive physics calculations like fully modeled cloth motion. Instead of the pre-canned, static lights that we see on mobile games today, NVIDIA’s new hardware will make it possible to create lighting that moves, fluctuates in intensity, and responds realistically to its environment — all rendered in real time. The titular glow ball can be skinned with different textures, each one allowing a different amount and hue of illumination to escape to surrounding objects, and is directed around the screen using the accelerometer in your tablet or smartphone.
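The “dynamic” part boils down to re-evaluating the shading math every frame as the light moves, rather than baking the result into static textures. A toy Python sketch of the Lambertian diffuse term at the heart of this kind of lighting (our illustration, not NVIDIA’s code):

```python
import math

def lambert(normal, surface_pos, light_pos, intensity=1.0):
    """Diffuse shading: brightness falls off with the angle between
    the surface normal and the direction to the light."""
    # Vector from the surface point to the light, then normalize it
    lx, ly, lz = (l - s for l, s in zip(light_pos, surface_pos))
    dist = math.sqrt(lx * lx + ly * ly + lz * lz)
    lx, ly, lz = lx / dist, ly / dist, lz / dist
    # Dot product with the (unit) surface normal, clamped at zero
    ndotl = max(0.0, normal[0] * lx + normal[1] * ly + normal[2] * lz)
    return intensity * ndotl

# A flat floor (normal pointing up); move the light and the shading changes
floor_normal = (0.0, 1.0, 0.0)
overhead = lambert(floor_normal, (0, 0, 0), (0, 5, 0))  # light directly above
grazing = lambert(floor_normal, (0, 0, 0), (5, 1, 0))   # light near the horizon
```

Recompute that for every pixel, every frame, as `light_pos` follows the glow ball — that per-frame work, plus the cloth physics, is what the extra CPU cores and GPU muscle are buying.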

NVIDIA demoed the new goodness on a Honeycomb slate with 1280 x 800 resolution and the frame rates remained smooth throughout. In order to emphasize the generational leap that we can expect with Kal-El, the company switched off two of the four cores momentarily, which plunged performance down to less than 10fps. That means the simulations we’re watching require a full quartet of processing cores on top of the 12-core GPU NVIDIA has in Kal-El. Mind-boggling stuff. Glow Ball will be available as a game on Android tablets once this crazy new chip makes its way into retail devices — which are still expected in the latter half of this year, August if everything goes perfectly to plan. One final note if you’re still feeling jaded: NVIDIA promises the production chip will be 25 to 30 percent faster than the one on display today. Full video demo follows after the break.


NVIDIA’s quad-core Kal-El used to demo next-gen mobile graphics, blow minds (video) originally appeared on Engadget on Sun, 29 May 2011 23:00:00 EDT.


NVIDIA refreshes notebook graphics with GeForce GTX 560M, attracts ASUS, MSI, Toshiba and Alienware

If you’ve enjoyed NVIDIA’s fine tradition of merely bumping along its GPUs time and again and affixing a new badge, you’ll like the GeForce GTX 560M — it’s much like last year’s GTX 460M, but with more bang for the buck than ever. ASUS, MSI, Alienware, Toshiba and Clevo have all committed to new notebooks bearing the graphics processor in light of the potent performance NVIDIA claims it will bring: Namely, those same 192 CUDA cores (now clocked at 1550MHz) and up to 3GB of GDDR5 memory (now clocked at 1250MHz, with a 192-bit bus) should enable the latest games to run at playable framerates on a 1080p screen with maximum detail — save antialiasing. Of course, that assumes you’ve also got a recent quad-core Sandy Bridge processor and gobs upon gobs of RAM, but NVIDIA also says that with the built-in Optimus switchable graphics, those same potent laptops should be able to manage five hours of battery life while idling.
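For perspective, those memory figures translate into peak bandwidth with simple arithmetic. A back-of-envelope sketch in Python, assuming (our assumption, not NVIDIA’s stated math) that the quoted 1250MHz is the clock that gets doubled for GDDR5’s effective data rate:

```python
def memory_bandwidth_gbps(mem_clock_mhz, bus_width_bits, transfers_per_clock=2):
    """Peak theoretical memory bandwidth in GB/s.

    transfers_per_clock=2 assumes the quoted figure is the clock that
    gets doubled for the effective GDDR5 data rate.
    """
    bytes_per_transfer = bus_width_bits / 8
    return mem_clock_mhz * 1e6 * transfers_per_clock * bytes_per_transfer / 1e9

# GTX 560M: 1250MHz GDDR5 on a 192-bit bus
bw = memory_bandwidth_gbps(1250, 192)  # 60.0 GB/s
```

Under those assumptions the 560M lands at 60GB/s of raw memory bandwidth — the kind of figure that matters more than core counts when you’re pushing 1080p with maximum detail.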

If you’re looking for some inexpensive discrete graphics, however, NVIDIA’s also got a refresh there, as the new GeForce GT 520MX bumps up all the clock speeds of the GT 520M. When can you expect a mobile GPU to knock the GTX 485M off its silicon throne, though? Glad you asked: a chart shows a “Next-gen GTX” coming late this year. Meanwhile, see what NVIDIA says the GTX 560M’s capable of in the gallery below and a video after the break.


NVIDIA refreshes notebook graphics with GeForce GTX 560M, attracts ASUS, MSI, Toshiba and Alienware originally appeared on Engadget on Sun, 29 May 2011 20:00:00 EDT.


Droid Bionic benchmark reports PowerVR GPU, new SoC inside?

[Image: Droid Bionic benchmark]

A very strange thing popped up on mobile graphics benchmarking site NenaMark the other day — an entry for the Droid Bionic. Now, it would be very easy to fake this test, and you’d be right to be skeptical given the incomplete score and the fact that it’s reporting PowerVR’s SGX 540 GPU instead of the Tegra 2 we saw at CES. But let’s not be too hasty — we heard back in April that NVIDIA’s mobile chip wasn’t playing nice with Verizon’s LTE. Perhaps when Motorola said it was delaying the Bionic to incorporate “several enhancements” it really meant “rebuilding the phone with a more LTE-friendly CPU.” Both Samsung and Texas Instruments use the SGX 540, and Moto has previously turned to TI’s OMAP for the Droid, Droid 2 and Droid X. Then again, a single, suspiciously low benchmark score isn’t the most convincing basis for a rumor.

Droid Bionic benchmark reports PowerVR GPU, new SoC inside? originally appeared on Engadget on Sun, 29 May 2011 05:05:00 EDT.

Via: Droid Life | Source: NenaMark

AMD ships five million Fusion chips, says it’s sold out

Sounds like Notbooks are making a dent: AMD says it’s shipped five million Fusion processors since the architecture’s debut, according to a report at CNET. In January, the company said the hybrid CPU / GPU chips had momentum, and as of last month it was quoting 3.9 million APUs out in the wild, but this week AMD says that demand has overtaken supply and it’s completely sold out of the Atom alternative. It seems Intel’s more than justified in seeking out hybrid solutions of its own, no matter where it might have to look to get a leg up in the integrated graphics market. Here’s hoping AMD’s other Fusion chips show just as much pep per penny (and milliampere-hour) as the original processor.

AMD ships five million Fusion chips, says it’s sold out originally appeared on Engadget on Sat, 28 May 2011 20:21:00 EDT.

Source: CNET

ASUS Matrix GTX 580 and MARS II desktop graphics cards revealed, devour PCI slots

[Image: ASUS Matrix GTX 580]

With all the talk of ASUS’s tablets recently, it’s easy to forget the company also dabbles in graphics cards, some large enough to blot out the sun. We’ve got some details on its latest contestants for your PC gaming dollar, the MARS II and Matrix GTX 580 (above), and you might have to buy a new case just to squeeze these unwieldy pixel-pushers inside. The Matrix will come in two flavors — standard and Platinum — both with 1.5GB of RAM and an enormous dual-fan cooling solution that eats up a jaw-dropping three PCI slots. But, hey, it should afford you some serious overclocking headroom. Though we’ve yet to see any pics of the MARS II, the 3GB, dual-GPU behemoth is bound to be even more massive — we wouldn’t be surprised if ASUS had to provide a breakout box for whatever cooler it strapped to that pair of GTX 580 cores. Prices and release dates are still up in the air, but we’re sure all will be revealed during the official announcement at Computex. Check out the image after the break for more detailed specs.

[Thanks, Robert and Alexandre]


ASUS Matrix GTX 580 and MARS II desktop graphics cards revealed, devour PCI slots originally appeared on Engadget on Wed, 25 May 2011 18:40:00 EDT.

Source: VR-Zone, TechConnect