Apple to rely on Intel’s Sandy Bridge graphics in future MacBooks, AMD GPUs in MacBook Pros?

Apple will use Intel’s Sandy Bridge CPUs in its future laptops, no surprises there, but what’s interesting about these forthcoming machines is that some of them might rely solely on Intel’s chip for both general and graphical processing. That’s the word from the usual “sources familiar with Apple’s plans,” who expect “MacBook models with screen sizes of 13 inches and below” to eschew a discrete GPU and ride their luck on the improved graphical performance of Intel’s upcoming do-it-all chip. Unless that phrasing is meant to cover the 11.6-inch MacBook Air, there are currently no sub-13.3-inch MacBooks, so the suggestion of one is intriguing in itself, but the bigger news here is that NVIDIA looks to be left out of the Apple party entirely: MacBook Pros are also predicted to switch to AMD-provided graphics hardware. All these changes should arrive with Apple’s next refresh, naturally expected at some point in the new year. As CNET points out, this could all be a massive negotiating ploy to get NVIDIA to play nicer with its pricing, but we’re inclined to believe Intel has finally gotten its integrated graphics up to a level that pleases the discerning tastemakers at Apple.

Engadget, Thu, 09 Dec 2010 | Source: CNET

NVIDIA GeForce GTX 570 debuts: the 580 goes on a power diet to fit into $349 price bracket

Want to know what the famous act of cutting down a graphics card to match a given price point looks like? Well, here it is, the $349 GTX 580 (aka the GeForce GTX 570): it has 480 CUDA cores running at 1464MHz, a 732MHz graphics clock, and 1.25GB of GDDR5 memory hurtling along at an effective rate of 3.8GHz. Each of those specs represents a moderate downgrade from the GTX 580, NVIDIA’s original 500-series GPU, while the physical construction, including that vapor chamber cooler, is almost wholly identical. Aside from the paint job, the only external difference is that the GTX 570 can live on a pair of 6-pin auxiliary power connectors rather than the 580’s 6-pin / 8-pin combo. The best comparison for the 570, however, turns out to be NVIDIA’s former flagship, the GTX 480, as reviewers found the new card’s performance nearly identical to that of the old tessellation monster. Verdicts invariably agreed that the 570 is quieter, cooler, and more power-efficient, making it pretty much a no-brainer of a purchase in its price bracket. Of course, every recommendation comes colored with the warning that AMD should finally be unveiling its upper-tier wares next week; we’d wait the extra few days before parting with our cash.
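
If you’re wondering how that “effective” memory figure relates to the actual clock, it’s simple GDDR5 arithmetic: four data transfers per clock cycle. A quick back-of-the-envelope sketch, using the card’s published 320-bit memory bus:

```python
# GDDR5 transfers data four times per clock, so the GTX 570's 950MHz
# memory clock is marketed as a 3.8GHz effective rate; multiplying by
# the 320-bit bus width gives the peak theoretical memory bandwidth.
memory_clock_mhz = 950
effective_mt_s = memory_clock_mhz * 4          # 3800 MT/s, the "3.8GHz"
bus_width_bits = 320                           # GTX 570 memory interface
bandwidth_gb_s = effective_mt_s * 1e6 * bus_width_bits / 8 / 1e9
print(f"{effective_mt_s} MT/s, {bandwidth_gb_s:.0f} GB/s peak")  # 3800 MT/s, 152 GB/s
```

Run the same numbers on the GTX 580’s 384-bit bus and 4GHz effective rate and you land around 192GB/s, which is where the 570’s memory downgrade shows up most clearly.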

Read – HardOCP
Read – Tech Report
Read – Hot Hardware
Read – AnandTech
Read – Bit-tech
Read – Hexus
Read – Legit Reviews
Read – PC Perspective

Engadget, Tue, 07 Dec 2010 | Source: NVIDIA

NVIDIA GeForce GT 540M refreshes mobile graphics midrange (update: hands-on pics)

Uh oh, just as we thought NVIDIA had moved beyond its penchant for rebadging hardware, here comes the vanguard of its 500M mobile GPU series, which happens to be specced nearly identically to what’s already on offer in the 400M family. The GT 540M chip maintains the same 96 CUDA cores and 128-bit memory interface as the GT 435M, but earns its new livery by cranking up the graphics and processor clocks to 672MHz and 1344MHz, respectively (the shader domain keeping to its customary double of the core clock), while also taking the onboard memory to a maximum speed of 900MHz. Power requirements are unchanged, mind you, and NVIDIA itself admits it’s exploiting the maturation of the production process to throw out some speedier parts. China gets the GT 540M immediately, courtesy of Acer, while the rest of the world should be able to buy in at some point next month. Jump past the break for the full press release.

Update: We’ve managed to track down the particular Acer model that’ll mark the GT 540M’s debut: it’s called the Aspire 4741G. The unit we saw came equipped with a 2.66GHz Intel Core i5-480M processor, 4GB of RAM, a 640GB HDD, a Blu-ray drive, and a 14-inch screen up top. There’s not much, aside from the new top cover design, to distinguish this from the rest of Acer’s Aspire line, with the keyboard in particular being the very same one we’ve seen on Timeline series machines for over a year now: comfortable, well spaced, but exhibiting quite a bit of flex around the Enter key. See more of it in the gallery below.

Engadget, Sun, 05 Dec 2010

NVIDIA GeForce GTX 580 and AMD Radeon HD 6870 square off in dual-card showdown

Whether you’re an NVIDIAn calling it SLI or a Radeonite calling it CrossFireX, a multi-card graphics setup is nowadays almost a prerequisite for experiencing the best that PC gaming has to offer. It’d be negligent of us, therefore, not to point you toward the Tech Report crew’s latest breakdown, which takes an investigative peek at dual-card performance on NVIDIA’s latest and greatest GeForce GTX 580 and naturally compares it to a wide range of alternatives on the market. AMD’s latest refresh, the Radeon HD 6870, is among those options, though it’s worth remembering that the company’s real high-end gear isn’t due for another couple of weeks. All the same, most people will be buying their holiday rigs right around now, and if you want an exhaustive guide to what’s what on the graphics front, the source link is your best, um… source.

Engadget, Thu, 02 Dec 2010 | Source: Tech Report

VisionTek Killer HD 5770 combo NIC / GPU hikes frame rates, lowers ping times for $200

Mama always said that one was never enough, and just five months after revealing its first NIC / GPU combo card to us at Computex, Bigfoot Networks has taken the wraps off of its second. This go ’round, the outfit is partnering with VisionTek to produce the VisionTek Killer HD 5770, a single PCIe card that combines an AMD Radeon HD 5770 GPU (with 1GB of GDDR5 memory) and a Killer E2100 networking card. All told, buyers are presented with two DVI ports, a single HDMI output and a gigabit Ethernet jack. The card is compatible with Windows 7, Vista and XP, and put simply, it’s designed to both improve your frame rates (that’s AMD’s role) and lower your latency / jitter (hello, Bigfoot!). The NIC portion actually has a 400MHz onboard processor that helps minimize the impact of slight changes in your connection, and Bigfoot’s management software will be thrown in for good measure. The board is expected to hit North American retail shops within a fortnight or so, with the $199.99 asking price representing a ~$10 savings compared to buying an HD 5770 GPU and Killer 2100 separately. Oh, and you get a pretty sick dragon, too.
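
If “jitter” sounds like marketing fluff, it isn’t: it’s simply the variation in successive packet delays, the thing that makes an otherwise decent connection feel laggy in online games. A toy sketch of the standard RFC 3550 estimator, run over a handful of round-trip times (the sample values are purely hypothetical):

```python
# RFC 3550 (section 6.4.1) estimates jitter as a running smoothed mean
# of the differences between consecutive transit times: J += (|D| - J) / 16.
rtts = [32.1, 31.8, 45.0, 33.2, 32.5, 60.4, 31.9]  # hypothetical pings, in ms

jitter = 0.0
for prev, cur in zip(rtts, rtts[1:]):
    jitter += (abs(cur - prev) - jitter) / 16.0

print(f"estimated jitter: {jitter:.2f} ms")  # the 45ms and 60ms spikes dominate
```

Keeping that number low by prioritizing game traffic in hardware is precisely the job Bigfoot claims its NIC does.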

Engadget, Wed, 01 Dec 2010

AMD’s Bobcat APU benchmarked: the age of the Atom is at an end

So small, and yet potentially so disruptive. AMD’s 1.6GHz Zacate chip, bearing a pair of Bobcat cores, has been taken off the leash today, resulting in a torrent of benchmarks pouring down onto the internet. While perusing the sources below, you might think to yourself that it’s not exactly a world beater, sitting somewhere in the middle of the pack on most tests, but compare it to Intel’s dual-core Atom D510, its most immediate competition in the targeted sub-$500 laptop price range, and you’ll find a thoroughgoing whooping in progress. The highlight of these new Fusion APUs is that graphics processing is integrated on the same piece of silicon as the CPU cores, and Zacate didn’t disappoint on that front either, with marked improvements over anything else available in its class. The resulting chips might still not have quite enough grunt to earn a place in your daily workhorse of a mobile computer, but their power efficiency and netbook-level pricing goals sure do look delightful. Or dangerous, if you’re Intel.

Read – AnandTech
Read – Tech Report
Read – PC Perspective
Read – Hot Hardware
Read – Legit Reviews

Engadget, Tue, 16 Nov 2010

AMD promises Radeon HD 6900 series launch for the week beginning December 13th

AMD’s cutting it mighty close, but the latest word from its PR mouthpiece is that the hotly anticipated Radeon HD 6970 and HD 6950 will be unveiled just in time for the gift-giving holidays. Fudzilla has heard directly from the Radeon team, who say they’re “going to take a bit more time before shipping the AMD Radeon HD 6900 series.” The NDA is expected to lift during the week beginning December 13th, but it’s anyone’s guess whether reviews of the cards will be accompanied by widespread in-store availability. Our hearts say yes, but our minds are already making other plans.

Engadget, Mon, 15 Nov 2010 | Source: Fudzilla

Hack turns Kinect into 3D video capture tool

We all knew this would inevitably happen, but seeing it in action is something else: the Kinect, transformed by the power of open-source drivers into a true 3D video camera for capturing oneself. UC Davis visualization researcher Oliver Kreylos fed the streams from his peripheral’s infrared depth camera and color camera into a custom program that interpolated and reconstructed the result, generating a mildly mind-blowing 3D virtual reality environment he can manipulate at will. And if it makes him look a little bit like the proficiently penciled protagonists of a-ha’s “Take On Me” video, that’s just the cherry on top. Don’t miss the videos after the break to see what we’re talking about.
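
For a sense of what’s happening under the hood, here’s a rough sketch of the core reconstruction step (not Kreylos’ actual code, which is a custom C++ application) using the open-source libfreenect Python bindings: grab a depth frame, convert its raw 11-bit values to meters, and back-project every pixel into a 3D point cloud. The depth formula and camera intrinsics below are community-derived approximations, not calibrated values.

```python
import numpy as np
import freenect  # Python bindings for the open-source Kinect driver

depth, _ = freenect.sync_get_depth()  # 480x640 array of raw 11-bit values
rgb, _ = freenect.sync_get_video()    # 480x640x3 color frame (texturing the
                                      # cloud needs a separate calibration step)

# Community-derived approximation for raw disparity -> depth in meters
z = 1.0 / (depth * -0.0030711016 + 3.3309495161)

# Rough depth-camera intrinsics: focal length and principal point
fx = fy = 594.21
cx, cy = 339.5, 242.7

# Back-project every pixel (u, v) with depth z into camera space
v, u = np.mgrid[0:480, 0:640]
x = (u - cx) * z / fx
y = (v - cy) * z / fy
points = np.dstack((x, y, z)).reshape(-1, 3)  # one 3D point per pixel
```

Render that point cloud from a movable virtual camera and you get the free-viewpoint effect seen in the videos; Kreylos’ program additionally maps the color stream onto the cloud and cleans up interpolation artifacts.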

Engadget, Sun, 14 Nov 2010 | Source: Oliver Kreylos, via CrunchGear

AMD GPU roadmap points to a happy 2011 for Radeon lovers

The ATI name might be dead, but Radeon graphics cards are only growing bigger, bolder, and better. AMD’s recent financial analyst day has made official what many of us already knew or suspected: there’ll be three new high-end GPUs forthcoming in the first quarter of 2011. The slides explicitly describe the recently launched HD 6870 / 6850 as mere refreshes, aiming to deliver HD 5800 series performance in a more efficient package, but peek beyond them and you’ll see an armada of HD 6900 chips just itching to bring the fight to NVIDIA and its newly crowned GTX 580, king of the single-GPU hill. No specs yet, of course, but at least we now know there’ll be some fireworks to greet us early in the new year. Oh, and if the mobile realm is more your thing, we’ve got a shot of AMD’s plans on that front waiting for you just after the break.

Engadget, Thu, 11 Nov 2010 | Source: AMD [PDF]

ARM intros next-gen Mali-T604 embedded GPU, Samsung first to get it (update: video)

Promising “visually rich user experiences not previously seen in consumer electronics devices,” ARM has introduced its latest embedded GPU architecture, the Mali-T604, at its Technology Conference 2010 in California today. Though we’re unlikely to see it in devices any time soon, the introduction means that the new design is available to ARM licensees, and notably, the company points out that partner Samsung will be the first to get hooked up. Considering Sammy already competes in the high-end embedded system-on-chip space with its ARM-based Hummingbird silicon, adding the Mali-T604 is probably the next logical step. ARM says the design was built specifically with the needs of general-purpose GPU computing in mind and includes extensive support for both OpenCL and DirectX, so look for some insane number-crunching capabilities in your next-generation phone, tablet, and set-top box. Follow the break for ARM’s press release.

Update: We sat down with ARM’s Jem Davies to get some more details about the new Mali, and discovered it’s only the first of several potential next-gen GPUs to come as part of the Midgard platform: while this particular processor is available with up to four shader cores, successors might have more. The T604 itself is no slouch, though, as it can theoretically deliver two to five times the performance of the company’s existing Mali 400 GPUs, core for core and clock for clock, and those in turn run circles around the PowerVR SGX 540 competition, if you take ARM at its word. Davies told us that the Mali-T604 not only does DirectX, it supports the game-friendly DirectX 11 as well as the always-popular OpenGL ES 2.0, and it will appear in a system-on-chip together with an ARM Cortex-A15 “Eagle” CPU when both are eventually baked into silicon several years down the road. Of course, in the eyes of marketers the future is always now, so get a look at conceptual uses (hint: augmented reality) for ARM’s new Mali right after the break.
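
Since ARM is pitching the T604 squarely at GPU compute, it’s worth a glance at what the OpenCL programming model it supports actually looks like. Below is a minimal vector-add written with pyopencl, purely as an illustration: with no Mali-T604 silicon in existence yet, the context creation here simply picks up whatever OpenCL implementation happens to be installed.

```python
import numpy as np
import pyopencl as cl

a = np.random.rand(1 << 20).astype(np.float32)
b = np.random.rand(1 << 20).astype(np.float32)

ctx = cl.create_some_context()   # would select a Mali device via its driver
queue = cl.CommandQueue(ctx)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The kernel runs once per element; the GPU spreads the work-items
# across however many shader cores the part has (up to four on a T604).
prg = cl.Program(ctx, """
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

prg.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)
assert np.allclose(result, a + b)
```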

Additional reporting by Sean Hollister

Engadget, Wed, 10 Nov 2010