31 Views Inside the Workings of Our Gadgets

For this week’s Photoshop Contest, I asked you to show us how your gadgets really work. We all know there’s something fishy that makes everything run, and it turns out that thing involves Chuck Norris and animals making shadow puppets.

Norman Rockwell: The Original King of the Photoshop

Back when Norman Rockwell ruled Saturday evenings, Adobe wasn’t even a gleam in some nerd’s eye, but a new book shows that the painter was, nevertheless, a photoshop god.

Very few Gizmodo readers were even born when Rockwell painted his last Saturday Evening Post cover, but we all know his work. You hear that name and suddenly you can picture those overly detailed, cartoonishly dramatic but ultimately kinda corny depictions of American life. Well, Norman Rockwell: Behind the Camera, written and compiled by Ron Schick, has given me immense newfound respect for the man, for the meticulous photography, the real people and the unintentionally hilarious DIY props and sets that he required to make his painted fantasies of Americana come true.

The book is not about painting. Rockwell’s oil-on-canvas work feels like an afterthought for Schick, who mostly documents Rockwell’s photography and art direction. Throughout the book, you see a painting, then you see the photographs he took to make that painting. In most cases, the different elements come from many separate shots, joined together only in paint. It’s almost sad: Vivid interactions between people, remembered jointly in the country’s collective consciousness, may never have taken place. Even people facing each other at point-blank range were photographed separately, and might never have even met.

The photos are as memorable as the paintings: There’s a little boy whose feet are propped up on thick books, a walking still-life; there’s a naked lady who ended up a mermaid in a lobster trap; there are men and women in various states of frustration, concentration and bliss, whose facial expressions defined Rockwell’s style. These were mostly not agency models, but friends and neighbors who were pleased to help out, though not always thrilled by the finished product.

Since Rockwell was one of the most commercially successful artists of all time, you can imagine the rights to all of his images (paintings and photos) are carefully managed. The publisher was kind enough to let us show you the book cover plus two additional pairings, below. I encourage you to buy the book ($26.40 at Amazon)—what you see here is just a quick lick of the spoon:


Going and Coming, 1947
You’ll notice the book jacket shows a painting of a family embarking on a summer vacation—Granny, Spot and all—coupled with a photo of a similar scene with far less action. There’s a kid sticking out of the car in both, but many family members are missing. This is because they were photographed separately, in Rockwell’s studio, and painted in where needed. (You’ll also notice that the photo on the jacket is reversed—the car was pointed in the other direction but I suppose that wouldn’t have looked as cool.)


Circus, 1955
What I liked about this picture is that you get to see how ridiculous Rockwell’s sets could often be. He needed real faces, but he could fill in the rest. Hence piling chairs up on top of an old desk to simulate bleachers at the circus. Good thing nobody fell off the back and sued ole Rocky for millions—that twine used to hold the little girl’s chair in place doesn’t look OSHA certified. If the geeky looking fellow in the front looks familiar, it’s because Rockwell himself served as a model for his paintings all the time.


The Final Impossibility: Man’s Tracks on the Moon, 1969
Yep, here’s proof that the moon landing was faked. At least, Rockwell’s commemorative portrait of it was. NASA loved his work, so they loaned him spacesuits and helmets whenever he wanted, and for this, he got permission to photograph his models moonwalking around an Apollo Lunar Lander, with a black tarp doubling for infinity and beyond. Remember, this is when Apollo was new and the Cold War was in full swing, so getting access to the latest NASA toys took clout.

Behind the Camera covers many aspects of Rockwell that I had not known about previously. He was an outspoken civil rights activist, and many of his paintings dealt with race relations. There is a painting of two murdered men, one black and one white, accompanied by an almost absurd photo of two very alive guys lying side by side, eyes closed, on a carpet. There’s another painting of a little black girl being walked to school by US Marshals, and the many different closeup shots Rockwell required to paint the extreme detail of the tense, potent—and fabricated—moment.

I wish I could run a gallery of 100 shots from this book, because each page startled me in a different way. Having met the real people behind the paintings, and learned that every painting was composed of masterfully planned photographs—always black and white, since the artist let his imagination add the color—I will no longer take Norman Rockwell for granted. In fact, I’m gonna kinda worship him from now on. [Amazon sales page; Little, Brown product page]

This Week’s 10 Best iPhone Apps

In this week’s net-neutral iPhone app roundup: Wild Things, physics games, Photoshop!, Twitter again (but that’s ok!), horse music, human music, and much, much more.

The Best

Where the Wild Things Are: Promotional apps are normally garbage, and in a few areas, this is a little fluffy (though there’s some neat media in here—it’s fairly generous). But hey, the people marketing this movie know exactly whose heartstrings they’re pulling at, and how to pull them. And the 3D monster toy is genuinely cool. Free.

iBlast Moki: A visually stunning physics game, with bombs. The levels are puzzles, but they don’t feel like work at all. A very, very safe buy at a dollar.

Photoshop: This app bears almost no resemblance to the Photoshop we all know and love (and, let’s be honest, steal). That’s fine though, because it’s a serviceable photo-editing app (on the iPhone, this means filters, cropping, and a few other tricks) that is free, unlike virtually all of its competition.

Tweetie: Few people like Twitter as much as Matt, and Matt likes few things as much as Tweetie 2: The $3 app is described as

the most polished Twitter app yet, oozing slickness with every swipe. Yet, it’s exploding with new features, and still really fast.

“Tweet tweet?” “Who’s there?” “THE WORST JOKE YOU’VE EVER HEARD.”

Weight Watchers: I’ve never thought about my diet too much, which means my life will be short, brutal and tasty. But I have seen people using Weight Watchers, and they seemed to sorta like it, and sometimes get less fat! An iPhone app pretty much seems like the ideal tool for keeping a food journal, plus this one’s free.

Pet Acoustics: Excuse me everyone, I’ve got an announcement: People write muzak for dogs. And cats. And horses! Then they put it in iPhone apps, so you can use it to soothe your stable of animals, uh, on the go? This makes me laugh, which makes me happy. (Though I have absolutely no idea if it works, because my Labrador only listens to gangsta rap.) Two dollars.

Command & Conquer: Red Alert: This one isn’t out yet, but I defy you to name a game franchise that needs an iPhone title more than C&C. TouchArcade got an early hands-on, and they say it’s fantastic—and surprisingly faithful to the original.

Rock Band: Another long-overdue addition to the store, Rock Band, the app, is kind of a jerk: While it was taking foreeever to show up, companies like Tapulous stepped in and made decent rhythm games to fill the void. Now that it’s here, and it looks great—multiple instruments, a decent song list—it’s going to poop on everyone else’s party. It’ll be here in a few weeks, price TBD.

MotionX Drive GPS: It’s not brand-new, but it’s too good a value not to mention here. $3 a month, or $25 per year, is amazing for a turn-by-turn nav app, and Wilson enthusiastically deemed it to be fine:

I am not going to tell you this is the best turn-by-turn road navigation app in the world. The designers made some funny UI choices, there’s no multi-destination or point-on-map routing, it doesn’t have text-to-speech, and it only runs in portrait mode, taking up awkward space on my dashboard. Still, there’s almost no reason not to get it.

Indeed.

iLickit: This app deserves more credit than I can give it for being the first designed for use with the human tongue. Ho ho, you wacky app developers, what’s next!? Wait, ugh, don’t tell me. Not in the store, yet.

Honorable mentions

Explore the New York City Which Could’ve Been With the Phantom City iPhone App

PewPewPew (With Your iPhone): Ahem:

pewpewpewpew, bangbangbang boomPEW, swishpewpewpewpew.

Also, augmented reality. A dollar.

iSheriff: It’s a lot like that PewPew AR app above, rebalanced: It’s free, which is cool, but it’s not quite as playful. It puts people in zoomable crosshairs and has gore effects, which makes it a little creepy.

Good Things Do Come in Threes with Tap Tap Revenge 3

MapQuest Stumbles Back Into the App Store With Budget Turn-by-Turn

FHM: DUDE MAG, in an app. Lots of near-nakedness here, with daily updated FHM non-boob content too. $2.

Let’s Draw Some Sheep: No, really, let’s draw some sheep! Because that’s just about all you can do with this moderately charming little app. $1.

Other App News on Giz

• ChilliX, who makes all kinds of neat, usually paid iPhone apps, is giving away their entire catalog for free this weekend.

Flash Apps to Come to the iPhone, But Not to Safari

The iPhone App Store Gold Rush May Be Running Low on Gold

Apocalypse Nigh, AT&T Opens Network for VoIP Over 3G on iPhone

This list is in no way definitive. If you’ve spotted a great app that hit the store this week, give us a heads up or, better yet, your firsthand impressions in the comments. And for even more apps: see our previous weekly roundups here, and check out our Favorite iPhone Apps Directory. Have a great weekend, everybody!

44 PlayStation 3 Ads Too Offensive For Even Sony To Use

Sony has a penchant for making questionable or offensive ads. But man, nothing they’ve done comes close to some of the stuff you guys came up with. You’ve been warned; no whining about being offended allowed.

First Place—Brian Garten

Second Place—Jairo Filho

Third Place

Computer Benchmarking: Why Getting It Right Is So Damn Important


We’re constantly bombarded with benchmark results, used to pitch everything from web browsers to cell service. But if benchmarks aren’t built properly, results are erroneous or misleading. Here’s what goes into a great benchmark, and how to make your own.

Why Do Benchmarks Matter?

Benchmarks typically measure the performance of the bottlenecks in your system. Benchmarks of your car measure its speed, braking and cornering. Benchmarks of your mechanical toothbrush measure the percentage of plaque it can remove from your teeth. As you attempt to test more complex systems, it becomes increasingly difficult to create accurate benchmarks. These days, computers can be very difficult to test accurately.

On paper, making a great benchmark seems simple—it should be a quantitative test that measures something meaningful, delivers correct results and produces similar results when repeated in similar circumstances. However, in the real world, it can be difficult to find a test that fits all three criteria. Worse, it’s relatively easy for anyone with an agenda to change the starting variables enough to manipulate a benchmark’s results. It’s more important than ever for you to know the difference between good and bad benchmarks—especially if you want to avoid being hoodwinked.
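
As a rough illustration of that last criterion, here’s a minimal sketch in Python (my own, not any vendor’s methodology) that times an arbitrary workload several times and reports the spread. If the runs don’t land close together, the test isn’t repeatable enough to trust.

```python
import statistics
import time

def time_workload(workload, runs=5):
    """Time a workload repeatedly and report the mean and spread.

    `workload` is any zero-argument callable standing in for the real
    task you care about (a file copy, an image filter, a compile, etc.).
    """
    results = []
    for _ in range(runs):
        start = time.perf_counter()
        workload()
        results.append(time.perf_counter() - start)

    mean = statistics.mean(results)
    spread = (max(results) - min(results)) / mean * 100
    print(f"mean: {mean:.2f}s, spread: {spread:.1f}% of mean")
    # A spread of more than a few percent suggests the test (or the
    # system state behind it) isn't consistent enough to compare runs.
    return results

# Stand-in workload: sum ten million square roots.
if __name__ == "__main__":
    time_workload(lambda: sum(i ** 0.5 for i in range(10_000_000)))
```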

There are dozens of examples of benchmark shenaniganry over the last decade, but I’m going to pick on Nvidia. In 2008 Nvidia famously claimed that high-end quad-core CPUs were overkill, and that the GPU could do everything the CPU could do better and faster. As is frequently the case, there was a demo to sell the point. Nvidia was showing a video transcoding app that used the power of Nvidia GPUs to convert video 19x faster than a quad-core CPU. However, the application used for the CPU part of the comparison was only able to utilize a single core, an unusual situation for video conversion apps even then. When the exact same test was run using industry-standard software that could use all four CPU cores, the performance difference was much less dramatic. So, while Nvidia created a benchmark that really did work, the results weren’t indicative of the actual performance that people in the real world would get.
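
To see why the choice of baseline matters so much, here’s a back-of-the-envelope sketch with invented numbers (not Nvidia’s actual data): the same GPU result looks wildly different depending on whether you compare it against one CPU core or four.

```python
# Illustrative numbers only, not real test data.
gpu_time = 60.0                         # seconds to transcode a clip on the GPU
single_core_time = 19 * gpu_time        # the single-threaded app used in the demo
quad_core_time = single_core_time / 4   # assume near-perfect four-core scaling

print(f"GPU vs. single-core app: {single_core_time / gpu_time:.1f}x faster")
print(f"GPU vs. four-core app:   {quad_core_time / gpu_time:.1f}x faster")
# Same GPU, same clip: the "19x" headline shrinks to roughly 4.8x once
# the CPU side is allowed to use all of its cores.
```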


The Lab vs. The Real World

There are two basic types of benchmarks: synthetic and real world. Even though we tend to favor real-world benchmarks at Maximum PC (where I am editor-in-chief), both types of tests have their place. Real-world benchmarks are fairly straightforward—they’re tests that mimic a real-world workflow, typically using common applications (or games) in a setting common to the typical user. On the other hand, synthetic benchmarks are artificial tests typically used to measure specific parts of a system. For example, synthetic benchmarks let you measure the pixel refresh speed of a display or the floating-point computational chutzpah of a CPU. However, the danger of relying on synthetic benchmarks is that they may not measure differences that a user would actually experience.

Let’s look at hard drive interface speeds, for instance. Synthetic benchmarks of the first-generation SATA interface showed a speedy pipe between SATA hard drives and the rest of the system—the connection benchmarked in the vicinity of 150MB/sec. When the second-generation SATA 3Gbps spec was introduced, tests showed it was twice as fast, delivering around 300MB/sec of bandwidth to each drive. However, it wasn’t correct to say that SATA 3Gbps-equipped drives were twice as fast as their first-gen SATA kin. Why not? In the real world, that extra speed didn’t matter. If you tested two identical drives, and enabled SATA 3Gbps on one and disabled it on the other, you’d notice minimal—if any—performance differences. The mechanical hard drives of the era weren’t capable of filling either pipe to capacity—a higher ceiling means nothing when nobody’s bumping their head. (Today, SSDs and even large mechanical disks can saturate a SATA 3Gbps pipe, but that’s a topic for another day.)
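
Here’s that arithmetic in sketch form, with rough, assumed figures for a drive of that era: when the mechanism can only deliver around 90MB/sec, the time to read a big file barely changes whether the interface ceiling is 150MB/sec or 300MB/sec.

```python
# Rough, assumed figures for a late-2000s mechanical drive.
file_size_mb = 4096       # a 4GB file
drive_sustained = 90.0    # MB/sec the platters can actually deliver

for interface_limit in (150.0, 300.0):  # SATA 1.5Gbps vs. SATA 3Gbps
    effective = min(drive_sustained, interface_limit)
    print(f"{interface_limit:.0f}MB/s link: "
          f"{file_size_mb / effective:.1f}s to read the file")
# Both lines print ~45.5s: the drive, not the interface, is the bottleneck.
```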

So, real-world benchmarks are perfect, right? Not necessarily. Let’s look at the Photoshop script we run at Maximum PC to measure system performance. We built a lengthy Photoshop script using dozens of the most common actions and filters, then we measure the time it takes to execute the script on a certain photo using a stopwatch. It’s a relatively simple test, but there’s still plenty of opportunity for us to muck it up. We could use an image file that’s much smaller or larger than what you currently get from a digital camera. If we ran the script on a 128KB JPEG or a 2GB TIFF, it would measure something different than it does using the 15MB RAW file we actually use for the test.

So, how do we know that our Photoshop benchmark is delivering correct results? We test it. First, we run the benchmark many times on several different hardware configurations, tweaking every relevant variable on each configuration. Depending on the benchmark, we test different memory speeds, amounts of memory, CPU architectures, CPU speeds, GPU architectures, GPU memory configurations, different speed hard drives and a whole lot more; then we analyze the results to see which changes affected the benchmark, and by how much.
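
The analysis that follows can be as simple as this sketch (the configurations and timings here are invented): group the recorded times by whichever variable you changed and see how far each one moves the result.

```python
# Invented results from running one benchmark across several configurations.
# Each entry: (variable changed, setting, benchmark time in seconds).
results = [
    ("cpu_clock", "2.6GHz", 142.0),
    ("cpu_clock", "3.2GHz", 118.0),
    ("ram", "4GB", 131.0),
    ("ram", "8GB", 129.0),
    ("disk", "7200rpm HDD", 135.0),
    ("disk", "SSD", 121.0),
]

by_variable = {}
for variable, setting, seconds in results:
    by_variable.setdefault(variable, []).append(seconds)

for variable, times in by_variable.items():
    swing = (max(times) - min(times)) / max(times) * 100
    print(f"{variable}: {swing:.0f}% swing across tested settings")
# A big swing (CPU clock here) means the benchmark is sensitive to that
# variable; a tiny one (RAM size here) means it barely measures it at all.
```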

By comparing our results to the changes we made, as well as to other known-good tests, we can determine precisely what a particular benchmark measures. In the case of our Photoshop script, both CPU-intensive math and hard disk reads can change the results. With two variables affecting the outcome, we know that while the test result is very valuable, it is not, all by itself, definitive. That’s an important concept: No one benchmark will tell you everything you need to know about the performance of a complex system.

Making Your Own Photoshop Benchmark

Once you get the hang of it, it’s never a bad idea to run your own benchmarks on a fairly regular basis. It will help you monitor your machine to make sure its performance isn’t degrading over time, and if you do add any upgrades, it will help you see if they’re actually doing anything. Just don’t forget to run a few tests when your computer is new (and theoretically performing at its peak), or before you swap in new RAM or a new HDD or other parts. If you forget, you won’t have a starting data point to compare to future results.
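
Keeping that starting data point can be as low-tech as a sticky note, but if you’d rather script it, here’s a minimal sketch that stores results in a JSON file and compares later runs against them. The file name and benchmark names are just placeholders.

```python
import json
import os

BASELINE_FILE = "benchmark_baseline.json"  # placeholder file name

def save_baseline(name, seconds):
    """Record a benchmark time taken while the machine is known-good."""
    data = {}
    if os.path.exists(BASELINE_FILE):
        with open(BASELINE_FILE) as f:
            data = json.load(f)
    data[name] = seconds
    with open(BASELINE_FILE, "w") as f:
        json.dump(data, f, indent=2)

def compare_to_baseline(name, seconds):
    """Report how a fresh run compares to the stored baseline."""
    with open(BASELINE_FILE) as f:
        baseline = json.load(f)[name]
    change = (seconds - baseline) / baseline * 100
    print(f"{name}: {seconds:.1f}s vs. baseline {baseline:.1f}s ({change:+.1f}%)")

# Usage: once when the machine is new, then again after any upgrade.
# save_baseline("photoshop_script", 148.2)
# compare_to_baseline("photoshop_script", 139.7)
```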

If you don’t own an expensive testing suite like MobileMark or 3DMark, don’t sweat it. If you regularly use an application that can record and play back macros or scripts, like Photoshop, you can build a script out of the activities you perform most often. At Maximum PC, one of our regular system benchmarks runs a 10MP photograph through a series of the filters, rotations and resizes we use all the time.

To make your own, launch Photoshop and open your image. Then go to Window > Actions, click the down arrow in that palette and select New Action. Name it and click Record, then proceed to put your file through your assorted mutations. Always remember to revert to the original file between steps, and make the final action a file close, so you can easily tell when the benchmark is done. Pile in a lot of actions: As a general rule, you want the total script to take at least two minutes to run—the longer it takes, the less small inaccuracies in your stopwatch timing matter. When you’re finished assigning actions and have closed the file, click the little Stop button in the Actions palette to finish your script.

Once finished, make sure your new action is highlighted, then click the menu down arrow in the Actions palette again and select Action Options. Assign a function key, which will let you start your benchmark by pressing a keyboard shortcut. (We use F2.) Then, open the Actions palette menu again, and select Playback Options. Set it to Step-by-Step and uncheck Pause for Audio Annotation. Once that’s done, ready your stopwatch. (Most cell phones include one, in case you aren’t a track coach.) Load your image, then simultaneously start the stopwatch and press the keyboard shortcut you just selected. Stop the stopwatch when the file closes. We typically run this type of test three times, to minimize any human error we introduce by manually timing the test. If you want to try the same script we use at Maximum PC, you can download it here.
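
Since you’re timing by hand, a tiny sketch like this one (with invented readings) can handle the bookkeeping: average the three stopwatch times and flag any run that strays far from the others.

```python
import statistics

# Three hand-timed runs of the same Photoshop action, in seconds.
# Invented numbers; substitute your own stopwatch readings.
runs = [131.8, 129.5, 130.4]

mean = statistics.mean(runs)
print(f"average: {mean:.1f}s over {len(runs)} runs")

for seconds in runs:
    if abs(seconds - mean) / mean > 0.05:
        # More than 5% off the average usually means something interfered
        # (a background task, thermal throttling), so redo that run.
        print(f"  outlier: {seconds:.1f}s, consider re-running")
```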

Gaming Benchmarks

Additionally, if you’re a gamer, there are tons of games with built-in benchmarks. These help you find the settings that maximize image quality without sacrificing framerate, and they also let you measure how changes to your system affect its overall gaming speed.

Check out the Resident Evil 5 benchmark, which includes both DirectX 9 and DirectX 10 modes. Running this test is easy—simply install it and select DirectX 9 or DirectX 10 mode. (Remember, you’ll need a Radeon 4800 series card or newer, or a GeForce 8800 series card or newer, and be running Vista or Windows 7 to use DirectX 10 mode.) If you want to compare performance over a period of time, we recommend the fixed run; it’s simply more repeatable. If you’re trying to decide what settings to use, the variable mode isn’t as consistent, but it shows actual gameplay, which will be more representative of your in-game experience. Once you’re in the game, you’ll want to change to your flat panel’s native resolution and do a test run of your benchmark. For a single-player game, we like to choose settings that minimize framerate drops below 30fps. For multiplayer, we sacrifice image quality for speed and target 60fps. After all, dropped frames in a deathmatch will get you killed.
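
If your game or capture tool can dump per-frame times to a text file, a short sketch like this (assuming one frame time in milliseconds per line, a hypothetical log format) turns that log into the numbers that matter here: average framerate and how often you dip below your 30fps or 60fps target.

```python
def summarize_frametimes(path, target_fps=30.0):
    """Summarize a plain-text log with one frame time (in ms) per line."""
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]

    total_seconds = sum(frame_ms) / 1000.0
    avg_fps = len(frame_ms) / total_seconds
    budget_ms = 1000.0 / target_fps
    slow = sum(1 for ms in frame_ms if ms > budget_ms)

    print(f"average: {avg_fps:.1f} fps")
    print(f"frames below {target_fps:.0f} fps: {slow / len(frame_ms) * 100:.1f}%")

# Hypothetical usage; "frametimes.txt" is a placeholder log file.
# summarize_frametimes("frametimes.txt", target_fps=30.0)
```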

The Practical Upshot

Like everything else, there are good benchmarks and bad benchmarks. However, there’s absolutely nothing mysterious about the way benchmarking should work. In order to know whether you can trust benchmarks you read online, you need to know exactly what’s being tested—how the scenario starts, what variables are changed and exactly what’s being measured. If you can’t tell that a test is being run in a fair, apples-to-apples manner, ask questions or try duplicating the tests yourself. And when someone doesn’t want to share their testing methodology? That’s always a little suspicious to me.

Will Smith is the Editor-in-Chief of Maximum PC, not the famous actor/rapper. His work has appeared in many publications, including Maximum PC, Wired, Mac|Life, and T3, and on the web at Maximum PC and Ars Technica. He’s the author of The Maximum PC Guide to Building a Dream PC.

34 Portable Gaming Devices That Aren’t So Portable

For this week’s Photoshop Contest, I asked you to invent some completely unportable portable gaming devices in honor of the PSPgo. As usual, your minds are more demented than I’d even imagined.

First Place

Second Place

Third Place

Google Sponsors Photoshop Compatibility on Linux

This article was written on February 18, 2008 by CyberNet.

[Image: Photoshop running under Wine]

Google is once again showing their support for the open source community by sponsoring improvements to Wine. Wine, for those of you unfamiliar, is a free tool available for Linux which lets you run some Windows applications without needing to run Windows in a virtual machine. I think Wine says it best: “Think of Wine as a compatibility layer for running Windows programs.”

There’s no word on how much money Google has thrown at the Wine developers, but it must be a considerable amount, since a lot of fixes have already been checked in to get Photoshop CS2 working. As of right now Photoshop CS3 doesn’t even install, but with Google’s virtually bottomless pocketbook that should definitely be on the horizon.

As with most things running on Wine there are some known issues with Photoshop, but the fact that these improvements are being offered for free is a huge plus. After all, if Google wasn’t sponsoring the improvements they would probably be part of the paid-only CrossOver Linux, which is created by the same developers.

So kudos to Google for providing some financial backing to the open source community!

Google Open Source Blog [via APC / image via LinuxScrew]



42 Shocking Discoveries the Newly-Upgraded Hubble Didn’t Make

For this week’s Photoshop Contest, I asked you to imagine some truly shocking discoveries that the newly-rejiggered Hubble might make. And if the stuff you guys came up with really is out there, maybe we’re better off focusing on Earth.


First Place

Second Place

Third Place

56 Redesigns of the Snow Leopard Box

Not blown away by the box Snow Leopard comes in? What a life you must lead to be bothered by such things! Allow me to soothe your soul with a veritable tsunami of redesigns, most of them much, much worse.

First Place

Second Place

Third Place

Microsoft sucks at Photoshop

Officially.

Update: Microsoft tells CNET, “We are looking into the details of this situation. We apologize and are in the process of pulling down the [Polish] image.”

Update 2: And… it’s down. The un-shopped image is now up on the Polish site, although whatever harried graphics monkey got the call to fix it didn’t do so well lining up the text box. At least that’s one mistake that won’t get you fired though, right?

[Thanks, David and Matt W]

Read – Microsoft’s English site
Read – Microsoft’s Polish site


