Intel Arrandale chips detailed, priced and dated?

Who’s up for some more Intel roadmap rumoring? The latest scuttlebutt from “notebook players” over in the Far East is that the chip giant has finally settled on names, speeds, and prices for its first three Arrandale CPUs, which are expected to arrive in the first half of 2010. The Core i5-520UM and Core i7-620UM both run at 1.06GHz, while the top Core i7-640UM model speeds ahead at 1.2GHz, with bulk pricing of $241, $278, and $305 per chip, respectively. Even if those clock speeds don’t impress on paper, these 32nm chips splice two processing cores, the memory controller, and the graphics engine into a single package, and should thereby deliver major power savings. Platform pricing is expected to remain at around $500 for netbooks, while the ultrathins these chips are intended for should hit the $600 to $800 range… if Lord Intel wills it so.

Intel Arrandale chips detailed, priced and dated? originally appeared on Engadget on Thu, 12 Nov 2009 09:38:00 EST.

ST-Ericsson’s U8500 platform gives your next smartphone wicked 3D powers

It’s one thing for ARM to develop a potent GPU meant to add impressive 3D capabilities to devices that were previously forced to run the likes of “Snake,” but it’s another thing entirely to see a platform and semiconductor company come forward and take it one step closer to the mainstream. ST-Ericsson has done just that with its U8500 platform, which is the first to integrate ARM’s Mali-400 graphics processing unit into a solution that can be easily fitted into future phones. Think your iPhone 3GS GPU is mighty enough? Hop on past the break and mash play — it’ll make those fancy water reflections you’re currently drooling over look downright ugly.

[Via B4Tech, thanks Chris]

ST-Ericsson’s U8500 platform gives your next smartphone wicked 3D powers originally appeared on Engadget on Wed, 04 Nov 2009 17:13:00 EST.

EVGA GeForce GTX 275 Co-opts a GTS 250 for PhysX duties

Ready for some more dual-GPU madness, only this time in the resplendent green of NVIDIA? EVGA has gone and concocted a special Halloween edition of the GTX 275, which has sprouted an entire GTS 250 appendage solely for PhysX gruntwork. Dubbed a new form of Hybrid SLI, EVGA’s latest combines — for the first time, from what we can tell — two different GPUs and assigns them specific, mutually exclusive tasks. Whether this concept takes off will depend largely on how effective PhysX acceleration proves to be, and whether it scales more efficiently than regular old two-board SLI or more conventional dual-GPU setups like the GTX 295. Color us intrigued, either way.

P.S. – That’s what the actual card will look like; we’re not making it up.

[Via PC Perspective]

EVGA GeForce GTX 275 Co-opts a GTS 250 for PhysX duties originally appeared on Engadget on Tue, 03 Nov 2009 05:14:00 EST.

ATI’s dual-GPU Radeon HD 5970 pictured in the wilderness


And now… fighting out of the red corner, weighing in with two Evergreen GPUs, and wearing black trunks with red trim, it’s the Radeon HD 5970. ATI’s latest challenger for the title of undisputed graphics champion has been snared in the wild, and its photo shoot reveals a suitably oversized beast. Measuring 13.5 inches long and requiring both an eight-pin and a six-pin power connector, the pre-production sample will fit inside only the roomiest and best-powered rigs around. The name is somewhat confusing, as AMD has dropped its X2 nomenclature for dual-GPU setups, but the card features two HD 5870 chips running in onboard CrossFire on a single PCB, and it foreshadows an HD 5950, which will combine a pair of the more affordable HD 5850s. Performance figures that surfaced earlier have been pulled at AMD’s behest, but we’ve got plenty of eye candy to admire, and there’s no price tag in sight to spoil our daydreaming pleasure.

[Via PC Perspective]

ATI’s dual-GPU Radeon HD 5970 pictured in the wilderness originally appeared on Engadget on Sat, 31 Oct 2009 21:10:00 EST.

NVIDIA launches Fermi next-gen GPGPU architecture, CUDA and OpenCL get even faster

NVIDIA had told us it would be accelerating its CUDA program to try to get an advantage over its competitors as OpenCL brings general-purpose GPU computing to the mainstream, and it looks like that effort’s paying off — the company just announced its new Fermi CUDA architecture, which will also serve as the foundation of its next-gen GeForce and Quadro products. The new features are all pretty technical — the world’s first true cache hierarchy in a GPU, anyone? — but the big takeaway is that CUDA and OpenCL should run even faster on this new silicon, and that’s never a bad thing. Hit up the read links for the nitty-gritty, if that’s what gets you going.
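For a sense of what that kind of code actually looks like, here’s a minimal, illustrative OpenCL C kernel (simple vector addition) of the data-parallel sort that frameworks like CUDA and OpenCL hand off to the GPU. It’s our own sketch, not anything from NVIDIA; Fermi’s pitch is simply that thousands of these tiny work-items run at once, now with a real cache hierarchy underneath them.

```c
/* Illustrative OpenCL C kernel, not NVIDIA sample code: each work-item
 * adds a single pair of floats, and the GPU runs huge numbers of these
 * work-items in parallel. */
__kernel void vector_add(__global const float *a,
                         __global const float *b,
                         __global float *out)
{
    size_t i = get_global_id(0);   /* this work-item's index in the 1-D range */
    out[i] = a[i] + b[i];
}
```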

Read – NVIDIA Fermi site
Read – Hot Hardware analysis
Read – PC Perspective analysis

NVIDIA launches Fermi next-gen GPGPU architecture, CUDA and OpenCL get even faster originally appeared on Engadget on Thu, 01 Oct 2009 02:09:00 EST.

Gallery: Tablet Computing From 1888 to 2010


The word “tablet” used to refer to a flat slab for bearing an inscription. Leave it to the tech industry to make it into something far more complicated and confusing.

Scores of products marketed as “tablets” have come and gone, and now — with rumors of imminent tablet computers from Apple, Dell, Microsoft and others — the category seems ripe for a rebound.

“If people can figure out a new device category that consumers will want to buy that isn’t a laptop or a phone, that opens a whole new possibility in markets to conquer,” explains Michael Gartenberg, a tech strategist with Interpret. “That’s why companies continue to invest in this space, and we have a large number of bodies that are littered in this space.”

Let’s take a look at tablets past, present and future. If the upcoming tablets are to succeed, they’ll need to learn from hideous mistakes like the Apple Newton and the Tablet PC.

Origins
The origins of the tablet computer can be traced as far back as the 19th century. Electrical engineer Elisha Gray registered an 1888 patent (.pdf) describing an electrical-stylus device for capturing handwriting. Though Gray is most famous for his contributions to the development of the telephone, his “tablet” was not meant for drawing; rather, it was a method of using telegraph technology to transmit handwritten messages. (Think of it as a primitive form of instant messaging or e-mailing.)

Gray’s concept wasn’t merely a flat slab. His patent depicts two instruments: a transmitter and a receiver. The transmitter is a pen-like device connected to two electric circuits acting as interrupters. The interruptions in current translate the transmitter pen’s movements into signals, which are sent to the receiver pen so that it can mimic those movements, reproducing the message on a piece of paper.

This description hardly sounds anything like a tablet, but later electronic-handwriting-recognition patents built from the idea of transmitting and receiving instruments, eventually combining them into one slab-shaped device like the tablets we see today.

The Apple Newton
The Newton MessagePad (above) was the first attempt by a major computer company to produce a commercial tablet-type computer for the mass market. Weighing in at about two pounds, Apple’s 1993 foray into tablet computing sported a powerful-for-its-time 20 MHz processor and a pen-centric interface. Handwriting recognition in the first version was so bad that it was famously mocked in a Doonesbury cartoon, and though it subsequently improved, the Newton never recovered from the initial PR blow. In 1998, Apple discontinued the Newton when Steve Jobs retook the helm as CEO, leaving a small coterie of true believers to keep the product’s memory alive.

PDAs and Smartphones
While no one refers to their iPhone as a “pocket tablet,” these devices are an important stage in the development of tablet computers.

Palm founder Jeff Hawkins learned from Apple’s mistakes and set out to build a pocket-sized computer that was smaller, cheaper, more modest in its ambitions and ultimately more useful than the Newton. He succeeded wildly with the 1996 launch of the Palm Pilot, spawning a long line of pen-based personal digital assistants from Palm, HP, Dell and others.

When Apple returned to the touchscreen world with the iPhone in 2007, it showed that it had paid close attention during the decade since the Newton flopped. The iPhone was simple, small, elegant and did a handful of things — make calls, browse the web, handle e-mail — very well. The fact that it wasn’t an all-purpose portable computer didn’t seem to matter so much compared to its usability and design.

Graphics tablets

Graphics tablets are computer input devices with a stylus-controlled interface. The underlying technologies vary, but graphics tablets generally use the received signal to determine the horizontal and vertical position of the stylus, its distance from the tablet surface and its tilt (vertical angle). Popular among digital illustrators, these tablets offer a natural way to create computer graphics, especially 2-D illustrations.
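For a rough idea of what such a tablet reports to the host computer, here’s a hypothetical sketch in C; the struct and field names are ours for illustration, not any vendor’s actual driver API.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical per-sample stylus report, illustrating the quantities
 * described above: position, pressure, hover distance and tilt.
 * Real tablet drivers define their own formats; this is not Wacom's API. */
struct stylus_sample {
    uint16_t x;         /* horizontal position on the tablet surface */
    uint16_t y;         /* vertical position on the tablet surface */
    uint16_t pressure;  /* tip pressure (0 = hovering, max = fully pressed) */
    uint16_t distance;  /* height of the tip above the surface while hovering */
    int8_t   tilt_x;    /* pen tilt along the X axis, in degrees */
    int8_t   tilt_y;    /* pen tilt along the Y axis, in degrees */
};

int main(void)
{
    /* A fabricated sample, just to show the shape of the data. */
    struct stylus_sample s = { 10200, 7600, 512, 0, 15, -4 };
    printf("pen at (%d, %d), pressure %d, tilt (%d, %d)\n",
           s.x, s.y, s.pressure, s.tilt_x, s.tilt_y);
    return 0;
}
```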

Given their specialty, graphics tablets fill a niche for digital artists. Some consumer applications exist, too, such as writing Chinese, Japanese or Korean characters with handwriting recognition software that transfers them onto the computer. The stylus can also double as a mouse.

However, for other languages, including English, the majority of consumers prefer typing on a keyboard for speedier writing, according to Gartenberg. Thus, the graphics tablet fills a niche in the design industry, but it is not a major product category in the consumer market. Wacom is the most prominent manufacturer producing graphics tablets today. (Example above: Wacom Bamboo Fun)


Video: ATI Radeon Eyefinity eyes-on, featuring Left 4 Dead on a 175-inch display

Vision rebranding wasn’t AMD’s only big unveil yesterday, as the company also had on display a number of different stations for its ATI Radeon Eyefinity technology. Sure, there’s three-monitor Google Earth and airbrushing, but the real kicker, in case you doubted earlier claims that playing Left 4 Dead on three 30-inch screens “absolutely changes the experience for the better,” is footage of the game being played on a 175-inch display made up of six HD projectors and boasting a 5,500 x 2,000 resolution. Sure, it’s not the greatest gaming screen we’ve seen, but short of having access to your own football stadium, it’s mighty impressive. See for yourself after the break.

Video: ATI Radeon Eyefinity eyes-on, featuring Left 4 Dead on a 175-inch display originally appeared on Engadget on Fri, 11 Sep 2009 12:07:00 EST.

Giz Explains: Why Tech Standards Are Vital For Apple (And You)

Tech standards are important. They’re, well, standards. They shape the way the world works, ideally. So if you wanna influence your little world, you probably wanna shape (or maybe even create) standards. Take Apple, for example.

They Call It “Open” For a Reason
One of the more excellent aspects of Snow Leopard, actually, is its full-scale deployment of OpenCL 1.0—Open Computing Language—a framework that allows programmers to more easily utilize the full power of mixes of different kinds of processors like GPUs and multi-core CPUs. (Much of the excitement for that is in leveraging the GPU for non-graphical applications.)
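As a concrete taste of what that framework looks like to a programmer, here’s a minimal host-side sketch in C, assuming an OpenCL 1.0 runtime and headers are installed, that simply lists the CPU and GPU devices OpenCL can target. It’s illustrative only, with error handling pared to the bone.

```c
/* Minimal OpenCL 1.0 host-side sketch (our own, not Apple's sample code):
 * list every compute device the first platform exposes, CPUs and GPUs alike. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void)
{
    cl_platform_id platform;
    cl_device_id devices[8];
    cl_uint count = 0;
    char name[256];

    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) {
        fprintf(stderr, "no OpenCL platform found\n");
        return 1;
    }

    /* CL_DEVICE_TYPE_ALL covers CPUs, GPUs and anything else the driver exposes. */
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_ALL, 8, devices, &count);

    for (cl_uint i = 0; i < count; i++) {
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        printf("device %u: %s\n", i, name);
    }
    return 0;
}
```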

OpenCL lives up to its name: it is a royalty-free open standard managed by the Khronos Group and supported by AMD/ATI, Apple, ARM, IBM, Intel and Nvidia, among others. The interesting thing about this open industry standard is that it was developed and proposed by… Apple.

What Is a Standard?
By “standard,” we’re talking about a format, interface or programming framework that a bunch of companies or people or organizations agree is the way something’s going to get done, whether it’s how a movie is encoded or the way websites are programmed. Otherwise, nothing works: a video that plays on one computer won’t play on another, a site that works in one browser breaks in another, and so on. With increased connectedness between different machines and different platforms, standards are increasingly vital to progress.

Standards can range from open (anybody can use them, for free) to open with conditions (anybody can use them as long as they follow conditions X, Y and Z) to closed (you gotta have permission, and most likely, pay for it). Some companies view standards strictly as royalty machines; others don’t make much money on them, instead using them to make sure developers do things the way they want them to. Apple falls into this latter category, by choice or possibly just by fate.

Kicking the Big Guy in the Shins
Of course, OpenCL isn’t the only open standard that Apple’s had a hand in creating or supporting that actually went industry-wide. When you’re the little guy—as Apple was, and still is in computer OS marketshare, with under 10 percent—having a hand in larger industry standards is important. It keeps your platform and programming goals from getting steamrolled by, say, the de facto “standards” enforced by the bigger guy who grips 90 percent of the market.

If you succeed in creating a standard, you’re making everybody else do things the way you want them done. If you’re doubting how important standards are, look no further than the old Sony throwing a new one at the wall every week hoping it’ll stick. Or Microsoft getting basically everybody but iTunes to use its PlaysForSure DRM a couple years ago. Or its alternative codecs and formats for basically every genuine industry standard out there. To be sure, there is money to be made in standards, but only if the standard is adopted—and royalties can be collected.

Web Standards: The Big Headache
The web has always been a sore spot in the standards debate. The web is a “universal OS,” or whatever the cloud-crazy pundits call it, but what shapes your experience is your browser and in part, how compliant it is with the tools web developers use to build their products. Internet Exploder shit all over standards for years, and web programmers still want IE6 to die in a fiery eternal abyss.

Enter WebKit, an open source browser engine developed by Apple based on the KHTML engine. It’s so standards-compliant that it tied with Opera’s Presto engine to be the first to pass the Acid3 test. What’s most striking about WebKit isn’t the fact that it powers Safari and Google Chrome on the desktop, but that it’s behind basically every full-fledged smartphone browser: iPhone, Android, Palm Pre, Symbian and (probably) BlackBerry. So WebKit hasn’t just driven web standards through its strict adherence to them; it has essentially defined, for now, the way the “real internet” is viewed on mobile devices. All of the crazy cool web programming you see now is made possible by standards-compliant browsers.

True, OpenCL and WebKit are open source—Apple’s been clever about the way it uses open source; look no further than the guts of OS X—but Apple is hardly devoted to the whole “free and open” thing, even when it comes to web standards.

All the AV Codecs You Can Eat
The recent debate over video in the next set of web standards, known collectively as HTML5, shows as much: Mozilla supports the open-source Ogg Theora video codec, but Apple says it’s too crappy to become the web’s default video standard, the one meant to free everyone from the tyranny of Adobe’s Flash. Apple says Ogg’s quality and hardware acceleration support don’t match up to the Apple-backed H.264 codec, standardized as part of MPEG-4, which is tied up by license issues that keep it from being freely distributed and open. (Google is playing it down the middle for the moment: while it has doubts about the performance of Ogg Theora, Chrome has built-in support for both it and H.264.)

Apple has always been a booster of MPEG’s H.264 codec, which is the default video format supported by the iPhone—part of the reason YouTube re-encoded all of its videos, actually—and which gets hardware acceleration in QuickTime X with Snow Leopard. H.264 is basically becoming the video codec (it’s in Blu-ray, people use it for streaming, etc.).

Why would Apple care? It means Microsoft’s WMV didn’t become the leading standard.

A sorta similar story with AAC, another MPEG standard. It’s actually the successor to MP3, with better compression quality—and no royalties on distributing content in it—but Apple had the largest role in making it mainstream by making it the preferred audio format for the iPod and iTunes Store. (It saw some limited use in portables a little earlier, but it didn’t become basically mandatory for audio players to support it until after the iPod.) Another bonus, besides AAC’s superiority to MP3: Microsoft’s WMA, though popular for a while, never took over.

FireWire I Mean iLINK I Mean IEEE 1394
Speaking of the early days of the iPod, we can’t leave out FireWire, aka IEEE 1394. As with OpenCL, Apple did a lot of the initial development work (Sony, IBM and others did plenty as well), presented it to a larger standards body—the Institute of Electrical and Electronics Engineers—and it became the basis for a standard. Apple tried to charge a royalty for it at first, but that didn’t work out. It’s a successful standard in a lot of ways—it’s still on plenty of stuff, like hard drives and camcorders—but USB has turned out to be more universal, despite being technically inferior. (At least until USB 3.0 comes out, hooray!)

Update: Oops, forgot Mini DisplayPort, Apple’s shrunken take on DisplayPort—a royalty-free video interface standard from VESA that’s also notably supported by Dell—which’ll be part of the official DisplayPort 1.2 spec. Apple licenses it for no fee, unless you sue Apple for patent infringement, which is a liiiiittle dicey. (On the other hand, we don’t see it going too far as an industry standard, which is why we forgot about it.)

That’s just a relatively quick overview of some of the standards Apple’s had a hand in, one way or another, but it should give you an idea of how important standards are, and how a company with a relatively small marketshare (at least, in certain markets) can use them to wield a lot of influence over a much broader domain.

Shaping standards isn’t always about royalty checks or dominance—Apple’s position doesn’t allow them to be particularly greedy when it comes to determining how you watch stuff or browse the internet at large. They’ve actually made things better, at least so far. But one glance at the iPhone app approval process should give anybody who thinks they’re the most gracious tech company second thoughts about that.

Still something you wanna know? Send questions about standards, things that are open other than your mom’s legs or Sony Ultra Memory Stick XC Duo Quadro Micro Pro II to tips@gizmodo.com, with “Giz Explains” in the subject line.

Ask Engadget: Best ultraportable laptop for gaming?

We know you’ve got questions, and if you’re brave enough to ask the world for answers, here’s the outlet to do so. This week’s Ask Engadget question is coming to us from Ron, who would just be happy with an ultraportable with an actual, bona fide, worthwhile GPU.

“I am looking for a 12- or 13-inch ultraportable that can also play modern games at a reasonable level, for less than $1,000. I know the brainiacs out there can help me out. Love the site, thanks!”

We know for sure that Dell’s Studio XPS 13 has the guts to pull off a few modern titles, but there are far more options out there than that. So, who here has a super small laptop with a discrete GPU worth bragging about? Don’t hold back now, vaquero.

Ask Engadget: Best ultraportable laptop for gaming? originally appeared on Engadget on Thu, 27 Aug 2009 21:03:00 EST.

Sony finally admits NVIDIA chips are borking its laptops, offers free repair

Last summer, while Dell and HP were busy pinpointing and replacing faulty NVIDIA chips in their notebooks, Sony was adamant that its superior products were unaffected by the dreaded faulty GPU packaging. Well, after extensive support forum chatter about its laptops blanking out, distorting images and showing random characters, the Japanese company has finally relented and admitted that “a small percentage” of its VAIO range is indeed afflicted by the issue. That small percentage comes from the FZ, AR, C, LM and LT model lines, and Sony is offering to repair yours for free within four years of the purchase date, irrespective of warranty status. Kudos go to Sony for (eventually) addressing the problem, but if you’re NVIDIA, don’t you have to stop calling this a “small distraction” when it keeps tarnishing your reputation a full year after it emerged?

[Thanks, Jonas]

Sony finally admits NVIDIA chips are borking its laptops, offers free repair originally appeared on Engadget on Tue, 11 Aug 2009 12:09:00 EST.