Apple iMac Review: 27 Inches and Less Chin

In the 10+ years since the iMac was born as Apple’s simple computer, it’s become visibly less of a computer and more of a display. And what a screen this new iMac has.

But First, Simplicity


This 1998 ad has Jeff Goldblum narrating that there are two physical steps to setting up an iMac. (“There’s no step three!”) Truthfully, they skipped the mouse and keyboard cables, which would bring it to four steps. Today, an iMac is set up using just one power cable, relying on wireless networking and Bluetooth peripherals to handle the rest. So it’s even simpler than it was 10 years ago. And as I said, the screen is becoming more prominent than ever.

The LCD

The 27-inch iMac’s screen is the thing to focus on in this revision. It is practically as bright as (and more contrasty than) any of the previous iMacs—even Cinema Displays—and it looks astounding. It’s LED-driven, so it comes to full luminescence immediately and draws less power. As an IPS panel, it also has a better side-to-side viewing angle; like the iMac 24 before it, it holds up across 178 degrees without much change in color accuracy or brightness. And here’s the kicker: Although it has 19% more LCD area than the old 24-incher, it has 60% more pixels. That makes it more pixel dense than any of the Cinema Displays, at 109ppi. And with a 2560×1440 resolution it has 90% of the dot count of a 30-inch Cinema Display. All these stats are great. They sound great, and they make for a powerful picture. But the actual view of the screen leaves me with a positive—but slightly imperfect—impression.
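If you want to double-check those numbers, the arithmetic is simple enough to run yourself. Here's a quick sketch (the resolutions come from the spec sheets; the rounding is mine):

```python
import math

# Panel specs quoted in the review.
w, h, diag_in = 2560, 1440, 27          # 27-inch iMac
imac24_px = 1920 * 1200                 # old 24-inch iMac
acd30_px = 2560 * 1600                  # 30-inch Cinema Display

ppi = math.hypot(w, h) / diag_in        # diagonal pixels / diagonal inches
imac27_px = w * h

print(round(ppi))                       # 109 ppi
print(imac27_px / imac24_px)            # 1.6 -> 60% more pixels than the 24
print(imac27_px / acd30_px)             # 0.9 -> 90% of the 30-incher's dot count
```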

The default brightness is a bit much, but of course you can turn it down. And the contrast is welcome; even my new 13-inch MacBook Pro looks yellowed and washed out next to it. But at this pixel density, which is sharper than my notebook’s, it’s almost too sharp, requiring me to sit closer than I ordinarily would with a 27-inch display. I like the feeling of crispness — 16% crisper than the last generation. But my eyes feel like the pictures are being delivered by a land shark holding a laser pointer straight into my corneas, and I can feel the strain within minutes. I would have to jack up as many font sizes as possible, or sit as close as I do to my MacBook, to make it work for long periods of time. Maybe I’m just a wimp of a geek, but I’ve never been sensitive to these sorts of things on any sort of machinery before.

This is the iMac next to a 13-inch MBP and a Dell 2407 24-inch monitor. The iMac’s screen puts both to shame in brightness and clarity.

Apple is making a big deal of the fact that this screen is 16:9. I think it looks better in this wider iteration, but it’s not an epic jump since the last gen was 16:10. Relative to the width, you’re losing vertical pixel count here, on both the 21.5- and 27-inch models, despite the added diagonal inches. Also, the glass cover is now edge to edge, without the thin silver rim around it, on the top and sides. It’s still glossy and very, very reflective, despite its anti-reflective coating.

I will feel guilty for mentioning this, because it’s ever so slight, but I’ll feel more guilty if I don’t mention it to you: The screen, when it’s white, has the tiniest bit of blotchiness to it. The backlighting is slightly uneven in my model. It had no impact on viewing quality once the screen was filled with anything other than pure white, so don’t sweat it.

My previous comparison to the 30-inch Cinema Display wasn’t for academic purposes, either. One of the most interesting features on the new iMac is that it can use its Mini DisplayPort (normally an output) as an input; that is, it can become a secondary display for notebooks or other devices. Factor in the near-identical specs to the 30-inch Cinema Display, most notably its updated LED screen, and you have absolutely no reason to buy a 30-inch Cinema Display when you can have this—but not just yet.

That’s what two full sized 1080p trailers look like on this screen.

Eager to test this shit and be the first to the internet with an image of an Xbox linked into an iMac (“Worlds collide!” would be the headline, I decided), I ordered a Monoprice Mini-DisplayPort-to-HDMI adapter. Unfortunately, I discovered that the input would not work with a PS3 or Xbox at any res, HD or otherwise. The current adapters on the market are unidirectional, I was told, so they won’t take HDMI sources and pipe them into the iMac. I’m sure someone is making a cable as we speak for this very abominable purpose of piping Microsoft gaming into a desktop Mac—but it’s not here yet. (New cables, by the way, will include audio, which the iMac can take in through the same connector, and the iMac can display video sources up to its native resolution.) The issue is, this could take months. That’s a long time, so don’t buy an iMac planning to use it with a gaming console or Blu-ray player right away.

Using it with a laptop was an interesting situation. Odd, for sure, but a welcome bonus and an obvious use. Here’s how it works. You plug a Mini-DisplayPort-to-Mini-DisplayPort cable into the iMac, which must be turned on (unlike Sony’s all-in-one, which works while off). The iMac flickers for a second and the laptop’s picture replaces the iMac’s. Here’s where it gets sort of weird. When the iMac is acting as a monitor, the keyboard and mouse are blocked from working, except for a few keys: play/pause, fast-forward, rewind, volume and brightness all still work. They won’t display the typical volume/brightness iconography, because you’re actually still looking at your MacBook. You can then use your iMac as a display for one computer while listening to music on another—but why would you want to? And if you were playing a game with an Xbox, you’d be listening to the game. To toggle between the iMac and the external source, you hit Command+F2.

(*The 21.5-inch iMac is not as sharp or impressive as the 27, but a fine evolution nonetheless; see chart)

Oh, one more thing: The LED display is also thinner than a traditional panel. That thinness, combined with the extra width and height, gives Apple’s designers adequate room to play with the layout and thermal properties of the iMac. Which brings us to the chassis and internals.

The Chassis


The iMac’s chassis went from all plastic to aluminum and glass in 2007. The first aluminum models were stamped out in car factories because no computer factories could work with aluminum pieces that big. Now the iMac has even more aluminum in it, with a bigger case and a seamless wraparound back made of metal instead of the black plastic cap. Despite the loss of the slimming effect of a black plastic back, the computer’s dimensions work in its favor; it’s about 1mm thinner and obviously wider, so it still feels undoubtedly skinny.

Oh, and the stand is tapered by 1.1mm on its front (as is Apple’s wont), to further hide volume.

Aside from the more flattering aspect ratios, the chin—one of the only giveaways that this is not just a screen but a computer—has shrunk by 22%. It looks much better, in my opinion. The case’s bigger size affects its internal layout, too. Apple and iFixit brought several of these details to my attention.

The most important changes are that the GPU and CPU are placed at nearly opposite ends of the case, each with its own heatsink, and three very quiet fans throw off the copious heat. (The iMac’s sound profile at idle, for a stock build, is still just a whisper, less than 20dB.)


Ports: The back of the case has a Mini DisplayPort, four USB 2.0 ports, the power plug (the machine’s only wire), FireWire 800, minijack/optical input and output, and Gigabit Ethernet. There’s Bluetooth 2.1+EDR wireless, over which the mouse and keyboard connect, and 802.11n Wi-Fi. Although the entire case is aluminum, the antenna has been cleverly hidden in the plastic Apple logo at the top center of the back. Reception is a touch stronger than on my notebook.

The iChat camera and microphone (the latter made up of about a dozen closely grouped pinprick holes, like on the MacBook Pro) sit on the top of the iMac. And despite the new model’s height, the mic sounds fine (if a touch more distant) compared to previous models. The top mount keeps the sound from the new, more powerful two-way speakers from interfering with it; measured using a song and an SPL meter, my notebook came in at 70dB and the iMac at 76dB at sitting distance. Noticeably louder and richer than a laptop, though I didn’t have an iMac 24 on hand to compare with.

The larger case allows the iMac to use four sticks of user-serviceable RAM, accessible from the bottom. (That’s useful futureproofing now that OS X Snow Leopard is shipping, and programs and the OS in 64-bit can address more than 4GB at a time.)

How About Performance?

The iMac I’m testing has a 3.06GHz Core 2 Duo processor, 4GB of RAM and ATI Radeon 4670 graphics. Those are decent parts, but not the highest-end quad-core i5/i7 chips or the ATI Radeon 4850 GPU that will ship in iMacs in November. More importantly, the machine I have here, which is shipping now, is about on par with higher-end, custom-order machines from the last generation. The system benchmarks I ran earlier this week indicate that everything performs practically the same. And since we don’t have a Core i5/i7 machine to work with, I’ve included Apple’s approximations of how much of a boost the iMac will get from those parts — obviously, many grains of salt are necessary when reading, especially when treating extra CPU cores as literal multipliers while most software still can’t leverage those cores efficiently.

As for 3D, Maclife has some framerate scores from Doom 3 and Call of Duty that are not by any means exact but somewhat representative of the machine I’m using today. But again, the bottom line is that this machine that I have, shipping today, is not faster than machines equipped similarly from the last generation—they’re just cheaper for any given performance point.

But again, even if you wait for the higher-end machines, there’s no guarantee you’ll be able to access most of that extra power. Snow Leopard hasn’t seen many apps, beyond the ones that ship with it, that can take advantage of its multicore CPU and GPU technologies. Programs will come, but immediate speed gains aren’t guaranteed here if you buy the quad-core machines.

Here’s an exception: Those Core i5/i7 chips are clocked slower than the Core 2 Duo chips in the lower-end machines, but they can run single-core applications at a higher clock speed. Since all four cores won’t be burning, the chip uses the spare electricity and the extra thermal overhead to dynamically and automatically overclock the core that is working: The i5 chip goes from 2.66GHz to 3.2GHz and the 2.8GHz i7 goes to 3.46GHz (the i7’s four cores are also hyperthreaded, for up to eight virtual cores).

Sounds fast, but we’ll dive into deeper tests in November. For now, you should be aware that if your desktop is less than 18 months old, you’d be somewhat silly to upgrade before the highest end chips from this generation of iMac are out.

What Else You Got?

The iMac replaces its old mouse with the new Magic Mouse, with a multitouch surface and 360-degree scrolling and swiping, almost like the gestures you find on a MacBook trackpad. I’ve said it before: I primarily use laptops because I love trackpads. The gestures, fingertip precision and proximity to the keyboard make the trackpad a must-have, and this mouse brings some of that to the desktop. (*Jason Chen reviewed the mouse and liked it, but it was not without flaws. Read that if you’re considering buying an iMac, because it’s the only option Apple offers.)

The one detail I found problematic specifically with the Magic Mouse, as it pertains to the 27-inch iMac, is that even when the pointer sensitivity is set to the highest level, a swipe of the wrist at a moderately fast speed moves the pointer only two-thirds of the way across the giant pixel landscape. Only by whipping my hand across my mouse pad can I trigger enough mouse acceleration to get across the screen. They should turn up the sensitivity, frankly. Software update please!

The keyboard has also changed, from the old wired keyboard, which was stamped out of the chassis’ screen cutout, to a wireless Bluetooth model. Apple states that the keyboard’s narrow profile makes it a better fit next to the mouse. I think it also makes sense as a remote control for the computer from afar when watching media, since this is the biggest iMac ever that doubles as a monitor. But it looks a little small and out of proportion with the machine itself, since the Mac got wider and the keyboard got shorter. (Correction: The keypad-less change happened last revision. I just miss that numeric-pad keyboard’s width from the first generation of aluminum iMacs. It seemed to fit perfectly.)


Oh, and the white plastic remote that used to ship with all the laptops, the Apple TV and iMacs has been replaced by an elliptical aluminum remote with black rubber buttons. It’s longer, shaped like an iPod nano, and no longer comes with the iMac. It costs $19. I think when you buy a computer that is this expensive, they should THROW IN THE DAMN REMOTE.

Competitive Check

There are other all-in-ones from PC makers, but at the moment, none as large or high-res as the iMac 27. The ones from Sony (like the L) and HP have various extras like IR touchscreens, glowing monitor bodies, TV tuners and Blu-ray drives. Some are pretty decent, like the Touchsmart we just reviewed. If these things matter to you and you are not married to the Mac platform, you might consider them. But that touchscreen functionality is still half-baked, so don’t do it for the groping potential.

Value

The sweet spot is the $1200 21.5-inch config. But don’t upgrade that model beyond base without seriously considering the big bad 27-incher for $1700. And don’t upgrade that one at all without considering the quad-core models; both look very promising at $2000 or $2200. Basically, the custom builds are not a great value until you get to the quads. Go cheapest, 27, or quad. But cautious folks will wait on the quads ’til we test them.

There’s another angle here, too. Again, comparing the 27-inch iMac to the old-as-hell 30-inch Cinema Display makes that standalone monitor look like a pretty bad value: it costs $100 more for just 10% more pixels—and, hey, it’s not even a computer.

Nerds, Sheathe Thy Wallet If You Can

Although the quad-core benchmarks aren’t here yet, I think you’ve got enough information here to make an adult decision on whether to go cheap or double your price for something faster and bigger. It’s not like those new chips will be slower. But waiting a month on a new internal layout, design and screen is a great way to let Apple shake out whatever inevitable hiccups are there at the start of a new run. Plus, if Snow-Leopard-specific apps make their way to market (hello, Handbrake!) and some performance scores come out in the meantime, hey, cool.

• Big, beautiful screen is super high res and bright.

• Chassis design evolving to new heights of beauty; less chin.

• Faster parts not out yet; current components available in previous generation.

• No Blu-ray player, touchscreen or other things that aren’t important to me, but may be important to you. Maybe.

Last Minute Guide to Saving Money on Windows 7

Tomorrow is the big day if you want to get in on Windows 7 launch deals, so Prof. Dealzmodo is hooking you up with a handy, up-to-the-minute guide packed with tips on how to save money.

Software

Once you have looked over everything Windows 7 has to offer and decided (correctly) that the upgrade is worth getting, the first thing you have to do is figure out which version is right for you. The Real Cost of Upgrading to Windows 7 will help you answer that question, along with tips on how to cheaply upgrade your hardware if necessary. You also have the option of purchasing OEM (Original Equipment Manufacturer) copies at a significant discount if you are willing to sacrifice transferable licenses and support. Newegg is a great place to score OEM disks.

If you are a college student, there is even better news—you can still get real-deal Windows 7 on a ramen budget with the deal Microsoft is throwing your way. If you are a student at an American university or you have a working university email, you can get Windows 7 Home Premium or Professional for only $30. Even if you don’t currently attend college, many of you might still have a valid university email. Make sure to check before you go and waste money on a full-price copy.

Hardware

If you have decided to get your copy of Windows 7 along with some brand new hardware, launch day is a great time to hunt for deals on laptops and desktops. Here are a few great deals already coming down the pipeline:

• Best Buy will offer an HP Slimline desktop, HP Mini netbook, 18.5″ LCD monitor and Netgear Wireless-G router package with Windows 7 (includes Geek Squad setup) for $1200. This is one hell of a deal—pre-orders are already being taken on Best Buy’s website.

• Customers who buy a new PC running Windows 7 Home Premium can upgrade a Windows XP or Windows Vista-based PC they already own with a discounted box copy of Windows 7. This offer will run through Jan. 2, 2010.

• Dell is offering $100 off the Dell Studio XPS 13.

• The Acer AZ5610-U9072 23″ Touch All-in-One will be priced at $880

• Techdealdigger has some great deals on laptops, including a 17.3-inch HP with a Core 2 Duo processor, 4GB RAM, 320GB HDD and Windows 7 Home Premium for $550, and a 16-inch HP with a Core i7 processor, 3GB RAM, 320GB HDD and 64-bit Windows 7 Home Premium for $800.

• Check out the Windows home page tomorrow. Word on the street is that they will be showcasing several deals from retailers.

Chances are there will be deals going on everywhere you look, so make sure to shop around before you buy. If you are patient enough, the potential for a glut of Windows 7 PC inventory could translate into even better holiday deals.

Computer Benchmarking: Why Getting It Right Is So Damn Important


We’re constantly bombarded with benchmark results, used to pitch everything from web browsers to cell service. But if benchmarks aren’t built properly, results are erroneous or misleading. Here’s what goes into a great benchmark, and how to make your own.

Why Do Benchmarks Matter?

Benchmarks typically measure the performance of the bottlenecks in your system. Benchmarks of your car measure its speed, braking and cornering. Benchmarks of your mechanical toothbrush measure the percentage of plaque it can remove from your teeth. As you attempt to test more complex systems, it becomes increasingly difficult to create accurate benchmarks. These days, computers can be very difficult to test accurately.

On paper, making a great benchmark seems simple—it should be a quantitative test that measures something meaningful, delivers correct results and produces similar results when repeated in similar circumstances. However, in the real world, it can be difficult to find a test that fits all three criteria. Worse, it’s relatively easy for anyone with an agenda to change the starting variables enough to manipulate a benchmark’s results. It’s more important than ever for you to know the difference between good and bad benchmarks—especially if you want to avoid being hoodwinked.
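Here's what those criteria look like in practice: a minimal, generic harness sketch (the workload below is a stand-in, not any particular lab's test) that produces a quantitative number and, by running several times, tells you whether that number is repeatable.

```python
import hashlib
import statistics
import time

def run_benchmark(workload, runs=5):
    """Time a workload several times and report how repeatable the result is."""
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        workload()
        times.append(time.perf_counter() - start)
    mean = statistics.mean(times)
    spread = statistics.stdev(times) if runs > 1 else 0.0
    # If run-to-run spread is a big fraction of the mean, the benchmark
    # isn't repeatable enough to support the conclusions drawn from it.
    return mean, spread, spread / mean

if __name__ == "__main__":
    data = b"x" * 10_000_000   # stand-in workload: hash 10MB of data
    mean, spread, cv = run_benchmark(lambda: hashlib.sha256(data).hexdigest())
    print(f"mean {mean:.3f}s, stdev {spread:.3f}s, variation {cv:.1%}")
```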

There are dozens of examples of benchmark shenaniganry over the last decade, but I’m going to pick on Nvidia. In 2008, Nvidia famously claimed that high-end quad-core CPUs were overkill, and that the GPU could do everything the CPU could do, better and faster. As is frequently the case, there was a demo to sell the point. Nvidia was showing a video transcoding app that used the power of Nvidia GPUs to convert video 19x faster than a quad-core CPU. However, the application used for the CPU side of the comparison could only utilize a single core, an unusual limitation for video conversion apps even then. When the exact same test was run using industry-standard software that could use all four CPU cores, the performance difference was much less dramatic. So, while Nvidia created a benchmark that really did work, the results weren’t indicative of the actual performance that people in the real world would get.
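To see why the choice of CPU app mattered so much, a bit of napkin math helps. The 19x figure is Nvidia's claim; the four-core scaling factor here is purely an assumption for illustration:

```python
# Illustrative arithmetic only -- not measured results.
gpu_speedup_vs_single_core = 19      # Nvidia's demo claim
cores = 4
scaling_efficiency = 0.9             # assume the multicore app scales imperfectly

cpu_speedup_multicore = cores * scaling_efficiency            # ~3.6x over one core
gpu_speedup_vs_quad = gpu_speedup_vs_single_core / cpu_speedup_multicore
print(round(gpu_speedup_vs_quad, 1))   # ~5.3x -- still a win, but far less dramatic
```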


The Lab vs. The Real World

There are two basic types of benchmarks: synthetic and real world. Even though we tend to favor real-world benchmarks at Maximum PC (where I am editor-in-chief), both types of tests have their place. Real-world benchmarks are fairly straightforward—they’re tests that mimic a real-world workflow, typically using common applications (or games) in a setting common to the typical user. On the other hand, synthetic benchmarks are artifices typically used to measure specific parts of a system. For example, synthetic benchmarks let you measure the pixel refresh speed of a display or the floating-point computational chutzpah of a CPU. However, the danger of relying on synthetic benchmarks is they may not measure differences that a user would actually experience.

Let’s look at hard drive interface speeds, for instance. Synthetic benchmarks of the first-generation SATA interface showed a speedy pipe between SATA hard drives and the rest of the system—the connection benchmarked in the vicinity of 150MB/sec. When the second-generation SATA 3Gbps spec was introduced, tests showed it was twice as fast, delivering around 300MB/sec of bandwidth to each drive. However, it wasn’t correct to say that SATA 3Gbps-equipped drives were twice as fast as their first-gen SATA kin. Why not? In the real world, that extra speed didn’t matter. If you tested two identical drives, and enabled SATA 3Gbps on one and disabled it on the other, you’d notice minimal—if any—performance differences. The mechanical hard drives of the era weren’t capable of filling either pipe to capacity—a higher ceiling means nothing when nobody’s bumping their head. (Today, SSDs and even large mechanical disks can saturate a SATA 3Gbps pipe, but that’s a topic for another day.)
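The ceiling-versus-head idea is easy to put in code terms. This toy model uses round, illustrative numbers, not measurements of any particular drive:

```python
# The interface sets a ceiling; the drive decides how close you get to it.
def effective_throughput(drive_sustained_mb_s, interface_limit_mb_s):
    return min(drive_sustained_mb_s, interface_limit_mb_s)

mechanical_drive = 90   # MB/s, a round illustrative figure for a drive of that era
print(effective_throughput(mechanical_drive, 150))  # 90 -- first-gen SATA
print(effective_throughput(mechanical_drive, 300))  # 90 -- SATA 3Gbps: same real-world result
```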

So, real-world benchmarks are perfect, right? Not necessarily. Let’s look at the Photoshop script we run at Maximum PC to measure system performance. We built a lengthy Photoshop script using dozens of the most common actions and filters, then we measure the time it takes to execute the script on a certain photo using a stopwatch. It’s a relatively simple test, but there’s still plenty of opportunity for us to muck it up. We could use an image file that’s much smaller or larger than what you currently get from a digital camera. If we ran the script on a 128KB JPEG or a 2GB TIFF, it would measure something different than it does using the 15MB RAW file we actually use for the test.

So, how do we know that our Photoshop benchmark is delivering correct results? We test it. First, we run the benchmark many times on several different hardware configurations, tweaking every relevant variable on each configuration. Depending on the benchmark, we test different memory speeds, amounts of memory, CPU architectures, CPU speeds, GPU architectures, GPU memory configurations, different speed hard drives and a whole lot more; then we analyze the results to see which changes affected the benchmark, and by how much.

Then, by comparing our results to the changes we made, as well as to other known-good tests, we can determine precisely what a particular benchmark measures. In the case of our Photoshop script, both CPU-intensive math and hard disk reads can change the results. With two variables affecting the outcome, we know that while the test result is very valuable, it is not, all by itself, definitive. That’s an important concept: No one benchmark will tell you everything you need to know about the performance of a complex system.

Making Your Own Photoshop Benchmark

Once you get the hang of it, it’s never a bad idea to run your own benchmarks on a fairly regular basis. It will help you monitor your machine to make sure its performance isn’t degrading over time, and if you do add any upgrades, it will help you see if they’re actually doing anything. Just don’t forget to run a few tests when your computer is new (and theoretically performing at its peak), or before you swap in new RAM or a new HDD or other parts. If you forget, you won’t have a starting data point to compare to future results.

If you don’t own an expensive testing suite like MobileMark or 3DMark, don’t sweat it. If an application you use regularly can record and play back macros or scripts, like Photoshop, you can build a script that includes the activities you frequently perform. We run a 10MP photograph through a series of filters, rotations and resizes that we frequently use as one of our regular system-testing benchmarks at Maximum PC.

To make your own, launch Photoshop and open your image. Then go to Window —> Actions, click the down arrow in that palette and select New Action. Name it and click Record, then proceed to put your file through your assorted mutations. Always remember to revert to the original file between each step, and make the final action a file close, so you can easily tell when the benchmark is done. Pile in a lot of actions: As a general rule, you want the total script to take at least two minutes to run—the longer it takes, the less small inaccuracies in your stopwatch work matter. When you’re finished assigning actions and have closed the file, click the little Stop button in the Actions palette to finish your script.

Once finished, make sure your new action is highlighted, then click the menu down arrow in the Actions palette again and select Action Options. Assign a function key, which will let you start your benchmark by pressing a keyboard shortcut. (We use F2.) Then open the Actions palette menu again, and select Playback Options. Set it to Step-by-Step and uncheck Pause for Audio Annotation. Once that’s done, ready your stopwatch. (Most cell phones include one, in case you aren’t a track coach.) Load your image, then simultaneously start the stopwatch and press the keyboard shortcut you just selected. Stop the stopwatch when the file closes. We typically run this type of test three times, to minimize any human error we introduce by manually timing the test. If you want to try the same script we use at Maximum PC, you can download it here.
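If your workload can be kicked off from a command line instead of a function key, you can take the stopwatch (and your reflexes) out of the loop entirely. This is a generic sketch, not the Maximum PC harness, and the command is a placeholder for whatever launches your scripted workload:

```python
import statistics
import subprocess
import time

COMMAND = ["python", "convert_batch.py"]   # placeholder: your scripted workload
RUNS = 3                                   # same three-run rule the stopwatch method uses

times = []
for i in range(RUNS):
    start = time.perf_counter()
    subprocess.run(COMMAND, check=True)    # wait for the workload to finish
    elapsed = time.perf_counter() - start
    times.append(elapsed)
    print(f"run {i + 1}: {elapsed:.2f}s")

print(f"average: {statistics.mean(times):.2f}s over {RUNS} runs")
```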

Gaming Benchmarks

Additionally, if you’re a gamer, there are tons of games with built-in benchmarks. These help you figure out which settings maximize image quality without sacrificing framerate, and they serve as a measure of your computer’s overall gaming speed.

Check out the Resident Evil 5 benchmark, which includes both DirectX 9 and DirectX 10 modes. Running this test is easy—simply install it and select DirectX 9 or DirectX 10 mode. (Remember, you’ll need a Radeon 4800 series card or newer, or a GeForce 8800 series card or newer, and be running Vista or Windows 7 to use DirectX 10 mode.) If you want to compare performance over a period of time, we recommend the fixed run; it’s simply more repeatable. If you’re trying to figure out what settings to use, the variable mode isn’t as consistent, but it shows actual gameplay, which will be more representative of your in-game experience. Once you’re in the game, you’ll want to switch to your flat panel’s native resolution and do a test run of your benchmark. For a single-player game, we like to choose settings that minimize framerate drops below 30fps. For multiplayer, we sacrifice image quality for speed and target 60fps. After all, dropped frames in a deathmatch will get you killed.

The Practical Upshot

Like everything else, there are good benchmarks and bad benchmarks. However, there’s absolutely nothing mysterious about the way benchmarking should work. In order to know whether you can trust benchmarks you read online, you need to know exactly what’s being tested—how the scenario starts, what variables are changed and exactly what’s being measured. If you can’t tell that a test is being run in a fair, apples-to-apples manner, ask questions or try duplicating the tests yourself. And when someone doesn’t want to share their testing methodology? That’s always a little suspicious to me.

Will Smith is the Editor-in-Chief of Maximum PC, not the famous actor/rapper. His work has appeared in many publications, including Maximum PC, Wired, Mac|Life, and T3, and on the web at Maximum PC and Ars Technica. He’s the author of The Maximum PC Guide to Building a Dream PC.

Top 10 Worst Technology Achievements over the last 40 Years

This article was written on July 23, 2007 by CyberNet.

Technology is something that has greatly evolved over time, but in order to find the good stuff we had to go through the bad stuff first. Computer World put together a great article today that details what they believe are the 10 biggest technology flops of the past 40 years. So I thought I would take a look at their list and make my own comments…

  1. Apple Newton – This device was created in 1993 as a $700 PDA. The handwriting recognition was awful, and was often the aspect of the Newton that people made fun of. One of these bad boys will still run you a few hundred dollars (used) on eBay.
    Apple Newton
  2. DIVX – This is different from "DivX" which many of you know to be the popular video codec. DIVX is an abbreviation for Digital Video Express, which was an attempt by Circuit City to start a new video rental system. Customers would "rent" DIVX discs which they could keep and watch for two days, or pay a continuation fee to keep them longer. After the time was up the customer would just throw the discs away, and as you can imagine this didn’t last long, because customers needed "DIVX enhanced DVD players" to watch the movies. Here’s a Circuit City commercial promoting the service:

  3. Dot-bombs – The dot-com bubble could only last so long. One website after another launched in the late 90’s, and as the new millennium hit many of these sites came crashing down. It almost makes me wonder if we’re entering another one of these "bubbles" with all of the Web 2.0 services popping up.
  4. IBM PCjr – This was a personal computer sold between 1984 and 1985, but it had several downfalls. It cost almost $1,300 without a monitor, had no hard drive (it used cartridges), and the keyboard was different from what people were used to.
    IBM PCjr
  5. Internet Currency – This was started by sites like Flooz and Beenz with hopes of creating a type of money that could be used only on the Internet (much like frequent flier miles or gift cards). Um, yeah, we can just use credit cards to buy things online.
    Flooz
  6. Iridium – Motorola provided the technology and financial backing to launch 66 satellites into space to be used for voice and data communications. To make a call you would have to fork out between $3 and $14 per minute on one of the brick-sized phones. Try putting one of these in your pocket: :)
    Iridium
  7. Microsoft Bob – This is often considered to be one of the worst products ever created, and all it was intended to do was add a familiar interface on top of Windows 3.11. The problem was that cartoonish rooms were created for users to group applications and tasks, and it essentially made you feel like you were a two-year-old trying to learn how to use the computer. Microsoft Bob is still floating around the file sharing networks, and is even said to run on Windows XP, but this gallery should be enough to keep you satisfied.
    Microsoft Bob
  8. Net PC – CNet covered the original announcement on these computers back in 1997, and said "Net PCs typically will have no floppy disk drive or expansion slots. Promoted by Microsoft, Intel, and Compaq, among others, the systems are supposed to reduce ownership cost for companies that currently use networked PCs. They will purportedly allow IS staff to maintain and update desktops from the center of the corporate network, instead of visiting each PC."
  9. Paperless Office – The dream that everything we read, send, and share is done only in a digital format does not appear to be true quite yet. A study by MIT Press in 2002 even said that email causes a 40% increase in paper use for most organizations. Sure we keep putting more and more things in a digital format (especially books), but we’re not ditching the paper copies quite yet.
  10. Virtual Reality – Being able to throw yourself into a game, or visit a place that you’ve never been to all in the comfort of your own house is an appealing idea. For some reason it hasn’t really taken off, but maybe some day it will be as good as this demo:

So those are the top 10 technology achievements that Computer World says are the worst from the last 40 years. It took me a while to find interesting videos or images for each one, but it was fun writing this and taking a look back at how far we have come.

Drop us a comment below saying what you think is the worst gadget or application ever developed. It can be one that is on this list, or it can be something that just popped into your head. I can’t wait to hear what everyone comes up with!

For another interesting list read about the top 10 most important laptops

OLPC Battery Life Review Emerges

This article was written on September 11, 2007 by CyberNet.

We all know that computer manufacturers tend to give battery life predictions that are normally not attainable, but is the OLPC also guilty of that? CNet got the idea that they should test the battery life on the OLPC to see what it truly is.

What kind of battery life is the OLPC supposed to have? As we’ve previously noted, the OLPC is supposed to withstand 10 to 12 hours of "heavy use" on a full charge, but what do they define as heavy use? Before we get into that, let’s take a look at what CNet’s initial test results were:

The best of the NiMH (nickel-metal hydride) batteries produced a little over 4 hours of operation. Of the two brands of lithium-ion batteries tested, one was about the same as the NiMH batteries; the other ran for a little over 5 hours.

That doesn’t sound too bad at first glance, but the interesting thing was that the computer wasn’t doing a darn thing during the tests…it was idling! So that can hardly be claimed to be heavy use. Jim Gettys, OLPC’s Vice President of Software Engineering, contacted CNet to tell them how they can configure their OLPC for maximum battery life:

  • Configure the DCON (display controller) chip to refresh the display whenever possible, so the primary display clock source can be shut down (saving about 0.52 W)
  • Turn off the backlight (saving about 1 watt)
  • Optimize the wireless firmware to reduce power consumption (savings unspecified)
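As a rough sanity check on what those first two tweaks buy you, here's some back-of-the-envelope math. The battery capacity is an assumed round number, not a published OLPC spec, and the baseline draw is simply what CNet's roughly 4.5-hour idle results would imply:

```python
# All assumptions, clearly labeled -- not OLPC's numbers.
battery_wh = 20.0                       # assumed usable capacity, watt-hours
baseline_hours = 4.5                    # roughly what CNet measured at idle

baseline_draw = battery_wh / baseline_hours          # ~4.4 W implied
tweaked_draw = baseline_draw - 0.52 - 1.0            # display clock off + backlight off

print(round(battery_wh / tweaked_draw, 1))           # ~6.8 hours: closer to 10, not there yet
```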

Doing all of that is supposed to get the OLPC battery life near the reported 10 hours, but does that really count as heavy usage? Well, Gettys thought so. He said that the target environment would be such that those criteria would more than likely be met. The CNet reviewer didn’t feel the same, though:

I think that the usage model for a classroom environment should assume that the backlight is on and that students are typing, drawing and making their way through computer-aided learning programs. In such an environment, the figures from OLPC suggest to me that the XO will run for only 4 to 6 hours per charge.

Even in a more official July 2007 battery life test, the OLPC capped out around 5 hours and 30 minutes of battery life, and that’s with dozens of computers being tested, some of which had the backlight off. Heck, my Dell Inspiron can get about 6 hours of battery life if I turn the backlight all the way down and turn off the wireless. And it has an Intel Core 2 Duo processor, which has gotta suck up a lot more power than anything in the OLPC. But then again, my computer wasn’t a mere few hundred dollars.

Apple’s New 13-inch MacBook and 15-inch MacBook Pro

This article was written on October 14, 2008 by CyberNet.

Over the last several weeks, you may have heard people talking about “supposed” changes Apple was going to make to their MacBook laptops. Some of what we heard turned out to be just rumors, but much turned out to be true. Today Steve Jobs, along with a few of his pals from Apple, took the stage to unveil the new laptops. At the end of the day we were left with a new 13-inch MacBook, a new 15-inch MacBook Pro, a MacBook Air with better graphics, and a new LED Cinema Display “made especially for a MacBook.”

About the new 13-inch MacBook:

The “better” of the two “new” MacBooks, the 2.4GHz model, comes with features you’d expect from a MacBook Pro, like an LED-backlit display. In looks it has definitely changed, and now the MacBooks and MacBook Pros look a lot alike.

New MacBook Features:

  • Built from “a solid block of aluminum” – in other words, it should be durable (no more plastic).
  • Illuminated keyboard (only 2.4 GHz model)
  • New trackpad is one big button – no longer is there a button at the bottom of the trackpad for clicking — just click anywhere
  • Improved graphics (powered by NVIDIA)
  • Multi-Touch trackpad

Pricing of the new models is $1,299 for the 2.0GHz model and $1,599 for the 2.4GHz model. One of the best moves Apple could have made was to drop the price of the white 13-inch MacBook (no aluminum casing) to $999.

About the new 15-inch MacBook Pro:

The biggest change here is in looks. You’ll see what I mean when you take a look at the image below:

new macbook pro.png

A black rim around the edge of the screen is new, and reminds us of the iMacs.

Other features:

  • Improved graphics (still powered by NVIDIA)
  • Updated keyboard
  • New glass touch-pad (with multi-touch)
  • Made from a solid piece of aluminum

*Note: The 17-inch MacBook Pro was not updated

If you’d like more information, check out the following:

The New MacBook
The New MacBook Pro
The New LED Cinema Display

We’re thinking that with the update, the new products look great. However, it’s not enough to entice someone to go out and buy a new computer if they updated within the last year or so.

When we were purchasing the MacBook Pros in April, there were rumors back then that Apple would be releasing updated models in the summer. We thought long and hard about whether it would be worth waiting, but all of the rumors at the time pointed to the idea that the new laptops would be equipped with the MacBook Air-style keyboard (also known as the chiclet keyboard). After playing with the MacBook Air, we realized that we preferred the “old style” keyboard without the spaces between the keys. After today, we are definitely happy we didn’t wait to make our purchases, seeing as the keyboard has changed.

Giz Explains: Why Quantum Computing Is the Future (But a Distant One)

Over 400 million transistors are packed on dual-core chips manufactured using Intel’s 45nm process. That’ll double soon, per Moore’s Law. And it’ll still be like computing with pebbles compared to quantum computing.

Quantum computing is a pretty complicated subject—uh, hello, quantum mechanics plus computers. I’m gonna keep it kinda basic, but recent breakthroughs like this one prove that you should definitely start paying attention to it. Some day, in the future, quantum computing will be cracking codes, powering web searches, and maybe, just maybe, lighting up our Star Trek-style holodecks.

Before we get to the quantum part, let’s start with just “computing.” It’s about bits. They’re the basic building block of computing information. They’ve got two states—0 or 1, on or off, true or false, you get the idea. But having two defined states is key. When you add a bunch of bits together, usually 8 of ’em, you get a byte. As in kilobytes, megabytes, gigabytes and so on. Your digital photos, music, documents, they’re all just long strings of 1s and 0s, segmented into 8-digit strands. Because of that binary setup, a classical computer operates by a certain kind of logic that makes it good at some kinds of computing—the general stuff you do every day—but not so great at others, like finding ginormous prime factors (those things from math class), which are a big part of cracking codes.
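Here's that "8-digit strands" idea made concrete in a couple of lines of Python:

```python
# Each character of ASCII text is stored as one byte: eight bits.
text = "Hi"
print(" ".join(format(byte, "08b") for byte in text.encode("ascii")))
# 01001000 01101001
```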

Quantum computing operates by a different kind of logic—it actually uses the rules of quantum mechanics to compute. Quantum bits, called qubits, are different from regular bits, because they don’t just have two states. They can have multiple states, superpositions—they can be 0 or 1 or 0-1 or 0+1 or 0 and 1, all at the same time. It’s a lot deeper than a regular old bit. A qubit’s ability to exist in multiple states—the combo of all those being a superposition—opens up a big freakin’ door of possibility for computational powah, because it can factor numbers insanely faster than standard computers.

Entanglement—a quantum state that’s all about tight correlations between systems—is the key to that. It’s a pretty hard thing to describe, so I asked for some help from Boris Blinov, a professor at the University of Washington’s Trapped Ion Quantum Computing Group. He turned to a take on Schrödinger’s cat to explain it: Basically, if you have a cat in a closed box and poisonous gas is released, the cat is either dead, 0, or alive, 1. Until I open the box to find out, it exists in both states—a superposition. That superposition is destroyed when I measure it. But suppose I have two cats in two boxes that are correlated, and you go through the same thing. If I open one box and the cat’s alive, it means the other cat is alive too, even if I never open its box. It’s a quantum phenomenon with a stronger correlation than you can get in classical physics, and because of that you can do something like this with quantum algorithms—change one part of the system, and the rest of it will respond accordingly, without changing the rest of the operation. That’s part of the reason it’s faster at certain kinds of calculations.

The other, explains Blinov, is that you can achieve true parallelism in computing—actually process a lot of information in parallel, “not like Windows” or even other types of classic computers that profess parallelism.
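If you like seeing the math, here's a pocket-calculator version of the superposition and entanglement ideas using plain state vectors. It's a toy illustration, not how a real trapped-ion machine is programmed:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                       # a qubit that is definitely 0
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate

plus = H @ ket0
print(plus)        # [0.707 0.707] -- equal parts 0 and 1: a superposition

# Two qubits: Hadamard on the first, then a CNOT, gives an entangled Bell state.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
bell = CNOT @ np.kron(plus, ket0)                 # start from |+>|0>
print(bell)        # [0.707 0 0 0.707] -- measuring can only ever give "00" or "11"
```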

So what’s that good for? For example, a password that might take years to crack via brute force using today’s computers could take mere seconds with a quantum computer, so there’s plenty of crazy stuff that Uncle Sam might want to put it to use for in cryptography. And it might be useful to search engineers at Google, Microsoft and other companies, since you could search and index databases much, much faster. And let’s not forget scientific applications—no surprise, classic computers really suck at modeling quantum mechanics. The National Institute of Standards and Technology’s Jonathan Home suggests that, given the way cloud computing is going, if you need an insane calculation performed, you might rent time and farm it out to a quantum mainframe in Google’s backyard.

The reason we’re not all blasting on quantum computers now is that this quantum mojo is, at the moment, extremely fragile. And it always will be, since quantum states aren’t exactly robust. We’re talking about working with ions here—rather than electrons—and if you think heat is a problem with processors today, you’ve got no idea. In the breakthrough by Home’s team at NIST—completing a full set of quantum “transport” operations, moving information from one area of the “computer” to another—they worked with a single pair of atoms, using lasers to manipulate the states of beryllium ions, storing the data and performing an operation, before transferring that information to a different location in the processor. What allowed it to work, without busting up the party and losing all the data through heat, were magnesium ions cooling the beryllium ions as they were being manipulated. And those lasers can only do so much. If you want to manipulate more ions, you have to add more lasers.

Hell, quantum computing is so fragile and unwieldy that when we talked to Home, he said much of the effort goes into methods of correcting errors. In five years, he says, we’ll likely be working with mere tens of qubits. The stage it’s at right now, says Blinov, is “the equivalent of building a reliable transistor” back in the day. But that’s not to say those tens of qubits won’t be useful. While they won’t be cracking stuff for the NSA—you’ll need about 10,000 qubits for cracking high-level cryptography—that’s still enough quantum computing power to calculate properties for new materials that are hard to model with a classic computer. In other words, materials scientists could be developing the case for the iPhone 10G or the building blocks for your next run-of-the-mill Intel processor using quantum computers in the next decade. Just don’t expect a quantum computer on your desk in the next 10 years.

Special thanks to the National Institute of Standards and Technology’s Jonathan Home and the University of Washington’s Professor Boris Blinov!

Still something you wanna know? Send questions about quantum computing, quantum leaps or undead cats to tips@gizmodo.com, with “Giz Explains” in the subject line.

Dixons launches slim little Advent Altro CULV PCs

UK electronics retailer Dixons just got the memo that slim is in, and it’s taken it to heart. The store is on the verge of launching two new PCs as part of its Advent brand — and the 13.3-inch CULV Altro line is pretty sexy looking. Already drawing the obvious comparisons to the MacBook Air in the looks department, the Altro boasts an Intel Celeron CPU, 3GB of RAM, a 120GB hard drive, WiFi, Bluetooth, USB and HDMI ports, and one multifunction connector for hooking up an external port replicator. If the specs of the Altro aren’t beefy enough for you, there will be a second version — the Elite — which will have an Intel Core 2 Solo processor and a “premium” flush glass finish. Both of the Advent Altros will be available at Dixons (that’s UK-only) starting August 24th, with prices at £600 (around $987) for the standard model and £800 (about $1,316) for the Elite. Both come with Windows Vista pre-installed, but a free upgrade to Windows 7 is also included. One more shot after the break.

[Via SlashGear]


Bill Gates: My 1979 Memories

Our Gizmodo ’79 celebration may have ended last week, but there’s room for a final post, written by famed retiree and mosquito wrangler Bill Gates. It’s no joke: Gates read the series then sent this in:

I read those 1979 stories all last week, and it put me in a nostalgic mood, so I wanted to offer my own memory to add to the collection.

In 1979, Microsoft had 13 employees, most of whom appear in that famous picture that provides indisputable proof that your average computer geek from the late 1970s was not exactly on the cutting edge of fashion. We started the year by moving from Albuquerque back to Bellevue, just across the lake from Seattle. By the end of the year we’d doubled in size to 28 employees. Even though we were doing pretty well, I was still kind of terrified by the rapid pace of hiring and worried that the bottom could fall out at any time.

What made me feel a little more confident was that 1979 was the year we began to sense that BASIC was right on the verge of becoming the standard language for microcomputers. We knew this could be the catalyst that would unlock the potential of the PC to democratize computing and create the right conditions for an explosion in programs and applications that would lead to really rapid growth of the PC market.

By the middle of 1979, BASIC was running on more than 200,000 Z-80 and 8080 machines and we were just releasing a new version for the 8086 16-bit microprocessor. As the numbers grew, we were starting to think beyond programming languages, too, and about the possibility of creating applications that would have real mass appeal to consumers. That led to the creation of the Consumer Products Division in 1979. One of our first consumer products was called Microsoft Adventure, which was a home version of the first mainframe adventure game. It didn’t have all the bells and whistles of, say, Halo, but it was pretty interesting for its time.

Back in the 1970s, there was a publication called the International Computer Programs Directory that handed out what was known as the ICP Million Dollar Award for applications that had more than $1 million in annual sales. In the late 1970s the list included more than 100 different products, but they were all for mainframes. In April, the 8080 version of BASIC became the first software product built to run on microprocessors to win an ICP Million Dollar Award. That was a pretty good sign that a significant shift was underway.

Today, I would be surprised if the number of million-dollar applications isn’t in the millions itself, and they range from apps and games created by a single developer working at home that you can download to your cell phone to massive solutions built by huge development teams that run the operations of huge corporations.

More important, of course, is the fact that more than a billion people around the world use computers and digital technology as an integral part of their day-to-day lives. That’s something that really started to take shape in 1979.

Thanks for the memories, Bill—please keep us posted on that new beer keg of yours!

Microsoft Adventure shot found on YOIS

Cray-1: The Super Computer

Seymour Cray’s big supercomputer was crazy. Its signals between components had to be timed by hand-trimming long cables, up to 1/16th of an inch at a time, and the whole machine was basically interwoven with a giant refrigeration system.

Name: Cray-1
Year created: 1976
Creator: Cray Research, Inc.
Cost: $5 million to $10 million
Memory: 4MW semiconductor
Speed: 160 MFLOPS

Building supercomputers was a dream, an aspiration, and a life’s pursuit for Seymour Cray, and his work on the computers that bore his name was the culmination of work he had done for the U.S. Navy, for CDC [Control Data Corporation], and finally for his namesake company. When Cray left CDC in 1972, after his work on the 6600, 7600, and minimally the 8600, he took much of the supercomputer fire with him.

While Cray’s departure from CDC wasn’t overly dramatic, his impact on supercomputing was. Cray artfully designed computers so that each part worked to efficiently speed up the whole, and he usually didn’t rely on the newest experimental components, preferring instead to tweak existing technologies for maximum performance. For instance, the Cray-1 was the first Cray machine to use integrated circuits, despite their having been on the market for about a decade. At 160 MFLOPS, the Cray-1 was the fastest machine at the time, and despite what seemed like only a niche market for expensive superfast machines, Cray Research sold more than a hundred of them.

Form and size were always concerns for Cray, as far back as his days developing the CDC 160, which was built into an ordinary desk. There was also a big concern with the heat generated by so many parts being packaged so tightly together, so Cray’s designs typically involved unique cooling solutions, whether it be the Freon in the Cray-1 or the Fluorinert in which the Cray-2’s circuit boards were immersed.

Core Memory is a photographic exploration of the Computer History Museum’s collection, highlighting some of the most interesting pieces in the history of computers. These excerpts were used with permission of the publisher. Special thanks to Fiona!

The photos in the book were taken by Mark Richards, whose work has appeared in The New York Times Sunday Magazine, Fortune, Smithsonian, Life and BusinessWeek. The eye-candy is accompanied by descriptions of each artifact to cover the characteristics and background of each object, written by John Alderman who has covered the culture of high-tech lifestyle since 1993, notably for Mondo 2000, HotWired and Wired News. A foreword is provided by the Computer History Museum’s Senior Curator Dag Spicer.

Or go see the real things at the Computer History Museum in Mountain View, Calif.



Gizmodo ’79 is a week-long celebration of gadgets and geekdom 30 years ago, as the analog age gave way to the digital, and most of our favorite toys were just being born.