Giz Explains: How to Choose the Right Graphics Card

There are plenty of great graphics cards out there, no matter what you’re looking for. Thing is, the odds are seemingly stacked against you ever finding the right one. It doesn’t have to be that hard.

Whether you’re buying a new computer, building your own or upgrading an old one, the process of choosing a new graphics card can be daunting. Integrated graphics solutions—the kind that come standard with many PCs—have trouble playing games from three years ago, let alone today, and will put you at a disadvantage when future technologies like GPGPU computing, which essentially uses your graphics card as an additional processor, finally take hold. On top of all this, we’re in the middle of a price dip—it’s objectively a great time to buy. (Assuming you’re settled on a desktop. Ahem.) The point is, you’ll want to make the right choice. But how?

Set Specific Goals, Sight Unseen
Your first step to finding the right graphics card is to just step back. Graphics card specs are nigh-on impossible to understand, and naming conventions and marketing materials will do nothing except give you a headache. The ever-escalating numerical names, the overlapping product lines, the misleadingly named chip technologies—just leave them. For now, pretend they don’t exist.

Now, choose your goals. What games do you want to play? What video output options and ports do you want? What resolution will you be playing your games at? Do you have any use for the fledgling GPGPU technologies that are slowly permeating the marketplace? And although you may have to adjust this, set a price goal. Ready-built PC buyers will have to consider whatever upgrade cost your chosen company is charging, and adjust accordingly. For people upgrading their own systems, $150-$200 has been something of a sweet spot: It’ll get you a card with a new enough GPU, and sufficient VRAM to handily deal with mainstream games for a solid two years. If you want to spend less, you can; if you want to spend more, fine.

These are the terms that matter most. Seriously, disregard any allegiance to Nvidia or ATI, prior experiences with years-old graphics hardware or some heretofore distant, unreleased and unspec’d game franchise. Be decisive about what you want, but as far as hardware and marketing materials go, start blind.

Don’t Get Caught Up In Specs
Now that you’ve laid out your ambitions, as modest or extreme as they may be, it’s time to dive into the seething, disorienting pool of hardware that you’ll be choosing from. The selection, as you’ll find out, is daunting. The first layer of complexity comes from the big two—Nvidia and ATI—whose product lines read more like Terminator robot taxonomies than something generated by humans. Here’s Nvidia’s desktop product line, right now:

It seems like you ought to be able to glean a linear progression of performance (or at least price) out of that alphanumeric pile, right? Not at all. How in the world are we to know that the 9800GTX is generally more powerful than the GTS 250, or that the 8800GTS trumps a 9600GT? A two-letter suffix can mean more than a model number, and likewise, a model number can mean more than membership in a product line. These naming conventions change every couple of years, and occasionally even get traded between companies. For example, I’ve personally owned two graphics cards that bore 9×00 names—you just won’t see them on the chart above, because they were made by ATI. Point is: You don’t need to bother with this nonsense.

The next layer of awfulness comes from the sundry OEMs that rebrand, tweak and come up with elaborate ways to cool offerings from the big two. This is what Sapphire, EVGA, HIS, Sparkle, Zotac and any number of other inanely named companies do. They can, on occasion, cause some sizable changes to the performance of the GPUs they’re built around, but by and large, the Nvidia or ATI label on the box is still the best indication of what to expect from the product, i.e., a Zotac GTX 285 won’t be that much better or worse than an EVGA or stock model. You’ll get a different fan/heatsink configuration, different hardware styling, and possibly different memory or GPU frequency specs, but the most important difference—and the only one you should really concern yourself with—is price.

Graphics cards’ last, least penetrable line of defense against your comprehension is hardware jargon. Bizarre, unhelpful spec sheets are, and always have been, a common feature in PC hardware, from RAM (DDR3-1600!) to processors (12 MB L2 cache! 1333MHz FSB!).

Graphics cards are worse. Each one has three MHz-measured speeds you’ll see advertised—the core clock, the shader clock and the memory frequency. VRAM—the amount of dedicated memory your card has to work with—is another touted specification, ranging from 256MB to well beyond the 1GB barrier for gaming cards. On top of frequency, memory introduces a whole slew of additional confusing numbers: memory type (as in, DDR2 or DDR3); interface width (in bits, the higher the better); and memory bandwidth, nowadays measured in GB/s. And increasingly, you’ll see processor core numbers trotted out. Did you know that Nvidia’s top-line card has 480 of them? No? Good.
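
If you’re curious where that GB/s figure comes from, it’s just the memory’s effective clock rate multiplied by the width of the bus it rides on. Here’s a back-of-the-envelope sketch in Python, with made-up numbers for illustration rather than any real card’s spec sheet:

    # Rough memory-bandwidth math; the figures below are illustrative, not a real card's specs.
    effective_memory_clock_mhz = 2000   # "effective" rate, i.e. already doubled for DDR-type memory
    interface_width_bits = 256          # width of the memory bus

    bytes_per_second = effective_memory_clock_mhz * 1_000_000 * (interface_width_bits / 8)
    print(round(bytes_per_second / 1e9), "GB/s")   # prints: 64 GB/s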

The best way to approach these numbers is to ignore them. Sure, they provide comparative evaluation and yes, they do actually mean something, but unless you’re a bona fide graphics card enthusiast, you won’t be able to look at a single spec—or a whole spec sheet—and come to any useful conclusions about the cards. Think of it like cars: horsepower, torque and engine displacement are all real things. They just demand context before they can be taken to mean anything to the driver. That’s why road tests carry so much weight.

Graphics cards have their own road testers, and they’ve got the only numbers you need to worry about.

Respect the Bench, or Trust the Experts
In the absence of meaningful specs, names or distinguishing features, we’re left with benchmarks. This is a good thing! For years, sites like Tom’s Hardware, Maximum PC, and Anandtech have tirelessly run nearly every new piece of graphics hardware through a battery of tests, providing the buying public with comparative measures of real-world performance. These are the only numbers you need to bother yourself with, and they’re where those goals you settled on come into play.

Here’s how to apply them. Say you just really want to play Left 4 Dead, and have about a hundred dollars to spend. Navigate over to Tom’s, check their benchmarks for that particular game, and scroll down the list. You’re looking for a card that is a) an option on whatever system you’re buying and b) able to handle the game well—at a high resolution and high texture quality—which, generally speaking, means a comfortable 60 frames per second. Find the card, check the price and you’re practically done. Once you’ve zeroed in on a card based on your narrow criteria, expand outward. You can check out more game benchmarks and seek out standalone reviews, which will enlighten you on other, less obvious considerations, like fan noise, power draw and reported reliability. (Note: resources for notebook users are a little more sparse. That said, Notebook Check [click the British flag for English] does good work.)
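
If it helps to see that filtering step spelled out, here’s a minimal sketch in Python; the card names, frame rates and prices below are placeholders, not real benchmark results:

    # Pick the cheapest card that holds 60fps in your game and still fits your budget.
    # Every entry is a placeholder; fill in real numbers from a benchmark site.
    candidates = [
        {"card": "Card A", "avg_fps": 48, "price": 75},
        {"card": "Card B", "avg_fps": 64, "price": 95},
        {"card": "Card C", "avg_fps": 88, "price": 180},
    ]

    budget = 100
    playable = [c for c in candidates if c["avg_fps"] >= 60 and c["price"] <= budget]
    best = min(playable, key=lambda c: c["price"]) if playable else None
    print(best)   # prints: {'card': 'Card B', 'avg_fps': 64, 'price': 95}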

From there, your next worry will be buying for the future. You shouldn’t buy the bare minimum hardware for the current generation of games—there’s no need to spring for a card that’ll be obsolete within a few months, no matter how cheap it is. But buying the latest, greatest dual-GPU graphics cards is an equally bad value proposition. As generations of video hardware have come and gone, one thing has remained constant: A company’s midrange offerings, usually pegged at about $150-$200, are your best bet, period. Sometimes they’ll be new products, and sometimes they’ll have been around a while. What you’ll be buying, basically, is the top end of the last generation. This is fine, and will keep the vast majority of users happy for the lifecycle of their PC. Those of you who live on the bleeding edge probably don’t need this guide anyway.

Your alternative route is to just trust the experts. Sites like Ars Technica and Maximum PC regularly assemble system guides at various price points, in which they’ve made your value judgments for you. Tom’s even assembles a “Best Cards for the Money” guide each month, which is invaluable. At given price points, the answer will often be obvious, and these guys know what they’re talking about.

But keep in mind, they’re applying the same formula you can, just with a slightly more knowing eye. The matter truly is as simple as broadly deciding what you need, consulting the right sources and floating far enough above the spec-ravaged landscape so as to avoid getting a headache. Good luck.

Great Sony Walkman TV and Print Ads of the 1980s

To commemorate the Sony Walkman’s 30th birthday, here are the trippy ads Sony used to promote it in the ’80s. Noble monkeys, off-key kids and sweet-toothed senseis—where’s that f’d up sense of humor now, Sony?

Back in 1983, Sony declared the WM-10 Super Walkman the “world’s smallest cassette player,” and promoted it with ads that appealed to the dudes and to the ladies. There’s the fantasy hardware building demonstration, 1 minute into the following ad compilation (here if you don’t want to wade through Seth Green’s Matchbox spot and the rockin’ Simon hair-band ad):

And then there’s the dancer who’d prefer a slenderer music player:


OK, maybe that second one appealed to anybody with a leotard fixation (which, in 1983, was pretty much everybody).

Most people in their 30s will hate me for bringing this one up: The 1986 My First Sony campaign was responsible for sticking the following song inside the heads of a generation of people who are just now able to forget it. Click at your own peril…


Here’s one of the last cassette Walkman commercials, from 1990 or thereabouts, where a father grills his ridiculously dumb daughter on the pictures that appear on TV. She gets everything wrong—everything—but he lets her mistaken sighting of a Walkman slide, because Walkmen (Walkmans?) are so cool.


And about that noble monkey, his name was Choromatsu, and he died at the extremely ripe age of 29 back in 2007. Here’s his 1988 spot, in which he grips a (Japan-only?) WM-501 and contemplates nature:


Before the zany TV commercials there were the fat-bucking-insane print ads. For instance, the small sampling below contains:
• A slick-looking posse of urbanites with nice shoes and likely heroin addictions
• A sensei sucking a lollipop while sitting next to a nipply lass 2X his height
• A lady perilously guiding a ten-speed at velocity while holding a Walkman

Special shoutout to Don the Intern for those mad researching skills. Hat tips to Pocket Calculator’s Walkman Museum, to Tim and Nick Jarman’s Walkman Central and to Bing’s image search tool. Try it out—it’s really quite different than Google’s.

Panasonic Lumix DMC-GH1 Review: A $1500 Misfit

The micro-four-thirds standard created by Panasonic, Olympus and Leica has intrigued us, but its mightiest product to date, the Panasonic Lumix DMC-GH1, leaves us scratching our heads.

Camera Be Still
When it comes to still shooting, there is no difference between the GH1 and the G1 that Mahoney reviewed last November. It has a digital viewfinder instead of an optical one, which takes some getting used to but tends to work. It’s got a huge number of manual and automatic options, as well as some uniquely digital settings, like “film mode” where you can manually adjust the color balance, saturation, contrast and noise reduction of the “film” you’re using. Because the sensor is 4:3 (hence the format’s name), you can change the aspect ratio to 16:9 for a wider view, but of course you sacrifice some pixels in the process. Update: Reader Ben tells me that no pixels are lost in the aspect ratio switch.

The camera has many of these novel options to keep track of, but it doesn’t pay a huge dividend to those who do. As Mahoney said in the original piece, its high-ISO shots are a bit noisier than most DSLRs’, and the lens selection is paltry compared to Canon and Nikon. As someone who carries mainly entry-level DSLRs (and generally wants for nothing more), I found myself simultaneously overwhelmed and unimpressed, though I did manage to eke out a few halfway decent shots, which I’ve stuck in the gallery below.

All of the above features and capabilities can be found on the $800 DMC-G1. What I tested, though, was the $1500 GH1, with an “H” for “Highdefinitionvideo.”

It’s Got an H In It
The H makes a big, big difference, as David Pogue mentioned, and as Mahoney lamented.

The 1080p video is, in fact, astonishingly good, when you’re shooting in the right light with a decent lens. I used two lenses, the highly functional 14-140mm kit lens, and a playful 7-14mm wide angle lens with a touch of the fisheye.

The video comes in AVCHD format, which some people don’t like. I don’t mind it, though when I previewed it in VLC, it appeared to have a painful amount of compression artifacts. I was going to condemn the camera for that, until I wrangled the video in VisualHub, and found that all of the playback artifacts disappeared in conversion, and probably wouldn’t appear in other software. (Panasonic sent me GH1 software, but it was for PCs only, and I didn’t have a chance to check it out; some of you already know what to do with AVCHD vid anyway, so I wouldn’t make a big deal out of the included software either way.) As you can see in this quick up-close video of Wynona—dropped from 1080p to 500×280 and converted to FLV for your consumption—you can certainly get a lot done:

The rustling you hear is me playing with the camera strap to attract an otherwise lethargic cat’s attention; over the weekend, when I shot video of my family, the stereo mic array worked well, as long as I kept my own stinkin’ trap shut. Its placement, facing upwards, on top of the flash, means that the shooter’s voice is far louder than that of his or her subjects.
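
Incidentally, if you want to wrangle AVCHD clips yourself and don’t have VisualHub handy, re-encoding with ffmpeg is one common route. Here’s a minimal sketch in Python; it assumes ffmpeg is installed and that your clip is named clip.mts, and it’s just one way to get the footage into a friendlier container, not the exact workflow I used:

    import subprocess

    # Re-encode an AVCHD clip (.mts/.m2ts) into an H.264 .mp4; assumes ffmpeg is on the PATH.
    subprocess.run([
        "ffmpeg", "-i", "clip.mts",
        "-c:v", "libx264", "-crf", "20",   # re-encode video at a reasonable quality level
        "-c:a", "aac",                     # re-encode the audio track
        "clip.mp4",
    ], check=True)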

Video certainly is the GH1’s pièce de résistance, as others have proclaimed. Practically speaking, it’s a damn sight better than the video from the Canon T1i and the Nikon D5000, which are fine for quick snips but lack the autofocus necessary for a nice fluid continuous shot (Touch of Evil opener, anyone?). The GH1 dynamically refocuses well enough, though as you can see in the Wynona video, it can’t go super-macro with that 7-14mm lens.

Stupid Money
Still, we’re back to the same dilemma here: If moderately video-capable DSLRs are selling for MSRPs around $900 (also with decent kit lenses), how can this baby be worth $600 extra? Still-only DSLRs cost in the $600 range—how can the GH1 be $900 more than those?

It’s a powerful camera, but I certainly didn’t feel as comfortable shooting with it as I do with Canon and Nikon DSLRs, and the video is, after all, video. The argument for video on other DSLRs is their compatibility with all kinds of lenses; here, it’s more like a decent video camera without a huge number of lenses. As Mahoney mentioned in the G1 review, you can get a lens adapter and use some nice Leica lenses, but do you really want to go to all that trouble? We’d be better served by a handful of interesting, made-for-micro-four-thirds primes.

Even if we get all that, though, the price remains prohibitive. If you are tempted by the video capability of this camera, you are still better off buying a nice DSLR and a true HD camcorder of your choosing from Panasonic or Sony or Canon. I wish I could say that the excellent 1080p video tips the scales, but it doesn’t. [Product Page]

In Brief:
HD video performance is exceptional for a high-end still camera, and notably better than “competing” DSLRs

Lots of manual digital manipulation means a lot to read up on and remember—it’s not easily hidden from the beginner, but in the hands of an undaunted shooter, there’s a lot of potential

The camera’s entry cost is far too high to justify when it’s not a big winner in still shooting, and when HD camcorder prices are dropping

Computing Classic: The Kitchen Computer

The 1969 Kitchen Computer by Honeywell was not just a fancy cutting board. It was meant to store recipes, even recommending meals from ingredients on hand. The problem is, you had to know binary to use it.

The machine’s designers assumed that housewives would do all the cooking, and yet also assumed they’d be open to learning binary: is the Honeywell Kitchen Computer the most or least sexist computer ever made? I don’t know. I do know it’s the most beautiful minicomputer I’ve ever put my eyes on. The plastic chassis hid so much of the 150-pound machine’s weight in its black pedestal. Then again, it could have been a lot bigger, had it had an actual user interface that wasn’t binary: The $10,600 price set by Neiman Marcus included two weeks of programming lessons in a language known as BACK.

The machine itself was a 16-bit minicomputer—the class right below mainframes—and its official name was actually the H316 Pedestal. It was part of the Series 16 lineup, based on the DDP-116. (A machine most notable for its use as ARPANET Interface Message Processors, the early machines that ran the predecessor to the modern internet.)

It had 4KB of magnetic core memory, expandable to 16KB, which came pre-programmed with a few recipes. Its system clock was 2.5MHz. It took 475 watts to operate.

Dag Spicer, curator from the Computer History Museum, says, “None were ever sold.”

He adds, in an article at Dr. Dobbs, that in the late 1960s, “with that kind of budget, the solution would likely be a live-in chef or the traditional 3×5 card file, no?”

Indeed.

[Wiki, The Computer History Museum, Dr. Dobbs, Old Computers.com]

The Computer History Museum is a wonderful place. If you’re in northern CA, I recommend you find a way to stop by. We’ll be running pieces from their collection as an ongoing series called Computing Classic. Special thanks to Fiona Tang, John Hollar and the amazing Dag Spicer for their help.

VholdR ContourHD wearable HD camcorder hands-on and impressions

We’ve been toying with a VholdR ContourHD helmet cam for a few weeks now, seeing how well it blends in with our adrenaline-fueled lifestyle. We’ve mounted it on various helmets, tried it with some moderately extreme endeavors, and, now that we’ve had a chance to refill our asthma inhalers, we thought we’d share a few of those adventures and give our impressions of this helmet-mounted HD shooter.


In Which We Provoke Kim Jong Il in 77 Offensive and Hilarious Ways

OK, so if crazy Kim Jong Il does try nuking Hawaii this weekend, don’t blame me. He was planning on it before this horrifying and hilarious gallery of shameful Photoshops appeared. Oh god, what have I done?



First Place — Nick Dwyer
Second Place — T. Baxter
Third Place — Dave Corrasa


iPhone 3G vs 3GS Network Speed Test Shows No Real Difference

Thanks to all our Chicago readers who sent in their speed test data from their iPhone 3G and 3GS. Here’s our conclusion: the 7.2Mbps service AT&T is testing in Chicago doesn’t really make any difference in speeds right now.

The 3GS turned out to be slightly faster in downloads (1202kbps vs. 1161kbps), but just about the same in uploads. Its latency was much better, 175ms vs. 210ms, which reflects the same thing we found in our iPhone 3GS review and is probably attributable to its faster processor.

Either AT&T’s 7.2Mbps isn’t really widely deployed yet even in Chicago, a city they’ve been running deployment tests on for a few months now, or it makes no real difference in everyday usage. We’ll test this again once 7.2Mbps gets rolled out to more cities to find out which.

And if you’re still not sure about what 3G speeds mean, or the differences between different phone techs, see our Giz explainer on all the mobile terms. And the next generation technology? 4G? See what’s coming up in that explainer. [Thanks to all our readers who participated!]

Update: AT&T tells us that the trial is only live in Chicago on a handful of cell sites and on an internal basis, so none of you guys should be connecting to the faster network. The public trials are coming later this year, so it makes sense that the speeds are exactly the same.

So Long Desktop PC, You Suck

Desktop PCs have been in decline for a decade, and countless people have said their piece about it. But new evidence suggests the desktop tower’s death spiral is underway—and we’re not too broken up about it.

I say this as a guy who was baptized into the tech world with a desktop; who still obsessively follows the latest PC components from Intel, Nvidia, ATI and the like; who has built, fixed or upgraded more towers than I care to remember; and who, until a few years ago, was an avid PC gamer. As someone who would be, by most measures, a desktop-PC kinda guy, I just can’t go on pretending there’s a future for them.

The State of the Industry
This is more than a hunch; a grim future is borne out by the numbers. A week ago, iSuppli issued a broad report on the state of the PC industry. The leading claim was predictable: The PC industry was experiencing lower-than-expected quarterly sales—down about 8% from the same time last year. This included laptops, and made sense, because the whole economy’s gone to hell, right? People aren’t buying computers.

Except that’s not quite what’s happening. In the same period, laptop shipments—already higher than desktop shipments on the whole—grew 10% over last year. Desktops were entirely to blame, dropping by an astounding 23%. That’s not decline—it’s free fall.

Stephen Baker, an analyst for industry watchers NPD, shared with me a wider picture of how retail PC sales break down. The way he put it made measuring the rise and fall of sales percentages seem dumb—there really aren’t any sales to lose: “In US retail, 80% of sales are notebooks now,” he said. “Start throwing in stuff like iMacs and all-in-ones”—which share more hardware DNA with laptops and netbooks than traditional desktops—”and it gets even higher.”

The Buyer’s Dilemma
Understanding why this is happening doesn’t take anything more than a little empathy. Put yourself in the shoes of any number of potential consumers, be they kids, adults, techies or Luddites. In virtually any scenario, a laptop is the sensible buy.

Take my dad. Despite spending three decades in front of commercial jet instrument panels, his relationship with computers is, at best, strained. When he came to me a few months ago asking for advice about a laptop to replace his desktop, I assumed it was just a whim, based on what he saw happening around him. It wasn’t, at all. As someone who uses a computer mostly for news, email, music, etc.—like a significant part of the population—he was actually being intensely rational. A laptop would do everything he needs simply and wirelessly, with a negligible price difference from a functionally equivalent desktop. If he wants a monitor, keyboard and mouse, he can just attach them. Choosing a desktop PC wouldn’t just be a not-quite-as-good choice—it’d be a bad one.

The Tradeoffs
Let’s look at mainly stock examples taken (hastily) from Dell’s current product line. Their configurations could be tweaked and changed to make desktops look slightly better or slightly worse, but we chose them because they are typical budget-minded consumer choices. We are not talking about workstations, and we’re not talking about all-in-ones, because if anything, they are keeping this category alive. When it comes to pure household computer buying, you can hunt for deals all you want, but laptops and desktops are more closely paired than you might expect.

That’s not to say that there aren’t noticeable tradeoffs. Graphics performance, although I wasn’t specifically angling for that with these configurations, is generally better in a desktop. Likewise, hard drives—being that desktops use larger, cheaper 3.5-inch units—are faster and more capacious across the board. Greater amounts of RAM can be had for less in a desktop, the optical drives can be slightly faster, and the ports for those and other drives can be used for expansion.

But these tradeoffs aren’t nearly as pronounced as they once were, nor are they as consequential. On account of the huge demand and sales volume, newer mobile processors have become a hotbed for innovation, now rivaling most any desktop processor, and mobile graphics engines—though still markedly inferior to dedicated desktop cards—have improved vastly in recent years, to a point where most consumers are more than satisfied.

And if you really look out for them, there are some amazing deals to be had on new notebooks. (Look at Acer’s 15-inch, 2.1GHz Core 2 Duo, 4GB DDR3 RAM laptop with 1GB GeForce GT130 graphics card and Blu-ray for $750, and then try to build the equivalent in a desktop at the same price.)

The important takeaway here is that the performance sacrifice you make in owning a laptop is minimal, and mitigated, or even outweighed, by its practical advantages. Want a bigger screen on your notebook? Hook it up to your HDTV. Want more storage? Buy a cheap, stylish bus-powered external USB drive. Want to use your desktop on the toilet? Good freakin’ luck.

The Fall of the Gaming PC
But to say that the average user doesn’t have any reason to buy a hulking beige box isn’t that controversial, and even borders on obvious. The real, emotional, diehard support for the form factor is going to be found elsewhere anyway. I mean, hey, what about gamers? Have you ever tried to play Crysis on an Inspiron? Let’s jump back to the numbers.

Last year saw a huge 26% increase in game sales across platforms, powered mostly by Xbox 360, Wii and Nintendo DS sales, according to NPD. Breaking that number down, we see PC game sales down by 14%. That decrease barely even registered in the broader scheme of things, since total PC game sales amounted to just $700m of the industry’s $11b take. This year is looking even worse. You know what, let’s just call this one too: PC gaming? Also dead. Update: Luke at Kotaku points out that NPD’s numbers only cover retail game sales, where PC gaming is hurting the most. Due mostly to MMOs—hardly the exclusive domain of desktops—the PC gaming industry take is actually higher.

As the laptop is to my old man, the console is to the gamer. Just a few years ago, buying—or just as likely, building—a high-end gaming PC granted you access to a rich, unique section of the gaming world. Dropping a pile of cash for ATI’s Radeon 9800 to get that precious 128MB of VRAM was damn well worth it, since there was no other way to play your Half Life 2 and your Doom 3. PC titles were often demonstrably better than console games, and practically owned the concept of multiplayer gaming—a situation that’s changed, or even reversed, since all the major consoles now live online. We even spotted a prominent PC magazine editor (and friend of Giz) copping on Twitter to buying an Xbox game because it has multiplayer features the PC version doesn’t. Yes, things are different now.

NPD’s Baker sees it too: “Go back two years ago and think about all the buzz that someone like Falcon or Alienware or Voodoo was generating, and how much buzz they generate now, that might be a little bit telling.” He adds, “There’s considerably less interest in high powered gaming machines.” They’re luxury items in every sense, from their limited utility to their ridiculous price to their extremely low sales.

A Form Factor on Life Support
But no matter how irrational a choice the desktop tower is for the regular consumer, sales won’t hit zero anytime soon. As we’ve hinted, much of this can be explained by simple niche markets: Some businesses will always need powerful workstations; older folks will feel comfortable with a familiar form factor; some people will want a tower as a central file or media server; DIY types will insist on the economy and environmental benefit of the desktop’s upgradeability; and a core contingent of diehard PC gamers, despite their drastically thinning ranks, will keep on building their LED-riddled, liquid-cooled megatowers until the day they die.

Baker sees another factor—less organic, more cynical—that’ll keep the numbers from bottoming too hard. “Desktops are a lot more profitable than notebooks for a lot of reasons, not the least of which is that big shiny monitor, which has a nice margin attached to it. For the retailers, people tend to buy a lot more peripherals and accessories when they buy desktops than when they buy notebooks.” Even if the volumes are ultra-low and the concept is bankrupt, retailers are going to keep bloated, price-inflated desktops and desktop accessories out there on the sales floor until they’ve drained every last dollar out of them.

You’ll see plenty of desktop towers for years to come, in megamarts if not in people’s homes. You’ll still hear news about the latest, greatest graphics cards, desktop processors and the like. Enthusiasts and fansites will stay as enthusiastic and fanatical as they’ve ever been. These, though, are lagging indicators, trailing behind a dead (or maybe more accurately, undead) computing ideal that the computer-using public has pretty much finished abandoning.

10 Breakfast Gadgets For True Champions

Coffee, bacon, donuts and cigarettes—it’s the best part of waking up (if you’re lucky enough to wake up, that is). The following products will help you enjoy your own breakfast of champions.

[Image via rangerumors]

Windows 7: Cheaper Than Vista (and Every Other Windows OS)

It turns out, even if you don’t factor in all the slightly confusing Windows 7 upgrade deals, Microsoft’s latest OS is its least expensive to date, and a real bargain compared to Vista.

Looking at full (non-upgrade) pricing of consumer Windows editions really tells the story: When you compare sticker prices, you can see that most editions hovered around the $200 mark, with a rare spike found in the $260 Vista Home Premium. When you adjust for inflation, that fairly regular pricing becomes a downward cascade—except for that Vista price hike.

The pro versions of Windows, starting with NT, tell the same story: $320 across the board, with a dip when XP Pro followed quickly on the heels of Windows 2000. But when you adjust for inflation, it’s just a smooth downward curve.
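
The inflation math itself is nothing fancy: multiply the old sticker price by the ratio of today’s CPI to the CPI of the year the product shipped. Here’s a quick sketch in Python using approximate annual-average CPI-U values; the BLS calculator cited below has the exact figures:

    # Convert a historical sticker price into 2009 dollars using the CPI ratio.
    # CPI values are approximate annual averages; see the BLS calculator for exact numbers.
    cpi = {1996: 156.9, 2001: 177.1, 2007: 207.3, 2009: 214.5}

    def in_2009_dollars(price, year):
        return price * cpi[2009] / cpi[year]

    print(round(in_2009_dollars(200, 1996)))   # a $200 price from 1996 is roughly $273 in 2009 dollars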

[Windows 7 Pricing: The Full Story; prices sourced from the following press and official locations: Washington Post, Businessweek, Microsoft, Cnet, Wired, Microsoft, CBROnline, Microsoft, Microsoft; inflation calculations made with the Bureau of Labor Statistics CPI Calc. Special thanks to Don the Intern for doing a ton of research on this!]