Tech standards are important. They’re, well, standards. They shape the way the world works, ideally. So if you wanna influence your little world, you probably wanna shape (or maybe even create) standards. Take Apple, for example.
They Call It “Open” For a Reason
One of the more excellent aspects of Snow Leopard, actually, is its full-scale deployment of OpenCL 1.0—Open Computing Language—a framework that lets programmers more easily tap the full power of a mix of different kinds of processors, like GPUs and multi-core CPUs. (Much of the excitement is in leveraging the GPU for non-graphical applications.)
OpenCL lives up to its name: It's a royalty-free open standard managed by the Khronos Group and supported by AMD/ATI, Apple, ARM, IBM, Intel and Nvidia, among others. The interesting thing about this open industry standard is that it was developed and proposed by… Apple.
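If you're wondering what "letting programmers tap the GPU" actually looks like, here's a rough, minimal sketch of an OpenCL host program in plain C (just our own illustration, not Apple's sample code) that asks for a GPU, falls back to the CPU if there isn't one, and runs the same tiny kernel either way. It assumes an OpenCL 1.0 SDK and headers are installed, and skips error checking to keep it short.

```c
/* Minimal OpenCL 1.0 sketch (illustration only): square an array of floats
 * on whatever device the first platform offers -- a GPU if one's available,
 * otherwise a multi-core CPU. Error checking is omitted for brevity. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>   /* Snow Leopard ships OpenCL as a framework */
#else
#include <CL/cl.h>           /* other platforms: vendor SDK headers */
#endif

/* The kernel itself, written in OpenCL C and compiled at runtime for
 * whichever device we end up on. */
static const char *kernel_src =
    "__kernel void square(__global float *data) {\n"
    "    int i = get_global_id(0);\n"
    "    data[i] = data[i] * data[i];\n"
    "}\n";

int main(void) {
    enum { N = 1024 };
    float data[N];
    for (int i = 0; i < N; i++) data[i] = (float)i;

    cl_platform_id platform;
    cl_device_id device;
    cl_int err;

    clGetPlatformIDs(1, &platform, NULL);

    /* Prefer a GPU, but the exact same code runs on a CPU device too --
     * that mix-and-match is OpenCL's whole point. */
    err = clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);
    if (err != CL_SUCCESS)
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, &err);

    /* Copy the input to the device, build the kernel, run it, read it back. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(data), data, &err);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kernel_src, NULL, &err);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel kernel = clCreateKernel(prog, "square", &err);

    clSetKernelArg(kernel, 0, sizeof(buf), &buf);
    size_t global_size = N;
    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &global_size, NULL,
                           0, NULL, NULL);
    clEnqueueReadBuffer(queue, buf, CL_TRUE, 0, sizeof(data), data,
                        0, NULL, NULL);

    printf("3 squared on the device: %f\n", data[3]);

    clReleaseMemObject(buf);
    clReleaseKernel(kernel);
    clReleaseProgram(prog);
    clReleaseCommandQueue(queue);
    clReleaseContext(ctx);
    return 0;
}
```

Link it against whatever OpenCL library your platform provides (on Snow Leopard, the -framework OpenCL flag; elsewhere, typically -lOpenCL) and the same little program can land on an Nvidia or ATI GPU, or on an Intel CPU, depending on what's in the box.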
What Is a Standard?
By “standard,” we’re talking about a format, interface or programming framework that a bunch of companies or people or organizations agree is the way something’s going to get done, whether it’s how a movie is encoded or the way websites are programmed. Without them, nothing works: a video that plays on one computer won’t play on another, websites that work in one browser break in another, and so on. With ever more connectedness between different machines and different platforms, standards are increasingly vital to progress.
Standards can range from open (anybody can use them, for free) to open with conditions (anybody can use them as long as they follow conditions X, Y and Z) to closed (you gotta have permission, and most likely, pay for it). Some companies view standards strictly as royalty machines; others don’t make much money on them, instead using them to make sure developers do things the way they want them done. Apple falls into this latter category, by choice or possibly just by fate.
Kicking the Big Guy in the Shins
Of course, OpenCL isn’t the only open standard Apple’s had a hand in creating or supporting that actually went industry-wide. When you’re the little guy—as Apple was, and still is in computer OS market share, with under 10 percent—having a hand in larger industry standards is important. It keeps your platform and programming goals from getting steamrolled by, say, the de facto “standards” enforced by the bigger guy who grips 90 percent of the market.
If you succeed in creating a standard, you’re making everybody else do things the way you want them done. If you’re doubting how important standards are, look no further than good ol’ Sony, throwing a new one at the wall every week hoping it’ll stick. Or Microsoft getting basically everybody but iTunes to use its PlaysForSure DRM a couple of years ago. Or its alternative codecs and formats for basically every genuine industry standard out there. To be sure, there is money to be made in standards, but only if the standard is adopted—and royalties can be collected.
Web Standards: The Big Headache
The web has always been a sore spot in the standards debate. The web is a “universal OS,” or whatever the cloud-crazy pundits call it, but what shapes your experience is your browser and, in part, how compliant it is with the standards web developers use to build their products. Internet Exploder shit all over standards for years, and web programmers still want IE6 to die in a fiery eternal abyss.
Enter WebKit, an open source browser engine developed by Apple, based on the KHTML engine. It’s so standards-compliant that it tied with Opera’s Presto engine to be the first to pass the Acid3 test. What’s most striking about WebKit isn’t that it powers Safari and Google Chrome on the desktop, but that it powers basically every full-fledged smartphone browser: iPhone, Android, Palm Pre, Symbian and (probably) BlackBerry. So WebKit hasn’t just driven web standards through its strict adherence to them; it has essentially defined, for now, the way the “real internet” is viewed on mobile devices. All of the crazy cool web programming you see now is made possible by standards-compliant browsers.
True, OpenCL and WebKit are open source—Apple’s been clever about the way it uses open source; look no further than the guts of OS X—but Apple is hardly devoted to the whole “free and open” thing, even when it comes to web standards.
All the AV Codecs You Can Eat
The recent debate over video in the next set of web standards, known collectively as HTML5, shows just that: Mozilla backs the open-source Ogg Theora video codec, but Apple says it’s too crappy to become the web’s default video format, the one that’s supposed to free everyone from the tyranny of Adobe’s Flash. Apple says Ogg’s quality and hardware acceleration support don’t match up to the Apple-backed, MPEG-standardized H.264 codec, which in turn is tied up by licensing issues that keep it from being freely distributed and open. (Google is playing it up the middle for the moment: While it has doubts about the performance of Ogg Theora, Chrome has built-in support for both it and H.264.)
Apple has always been a booster of MPEG’s H.264 codec, which is the default video format supported by the iPhone—part of the reason YouTube re-encoded all of its videos, actually—and which gets hardware acceleration in QuickTime X with Snow Leopard. H.264 is basically becoming the video codec (it’s in Blu-ray, people use it for streaming, etc.).
Why would Apple care? It means Microsoft’s WMV didn’t become the leading standard.
A sorta similar story with AAC, another MPEG standard. It’s actually the designated successor to MP3, with better compression quality—and no royalties required just to stream or distribute content in it—but Apple had the largest role in making it mainstream by making it the preferred audio format for the iPod and iTunes Store. (It saw some limited use in portables a little earlier, but support for it didn’t become basically mandatory in audio players until after the iPod.) Another bonus, besides AAC’s superiority to MP3: Microsoft’s WMA, though popular for a while, never took over.
FireWire I Mean iLINK I Mean IEEE 1394
Speaking of the early days of the iPod, we can’t leave out FireWire, aka IEEE 1394. As with OpenCL, Apple did a lot of the initial development work (Sony, IBM and others did a lot of work on it as well), presented it to a larger standards body—the Institute of Electrical and Electronics Engineers—and it became the basis for a standard. Apple tried to charge a royalty for it at first, but that didn’t work out. It’s a successful standard in a lot of ways—I mean, it’s still on a lot of stuff, like hard drives and camcorders—but USB has turned out to be more universal, despite being technically inferior. (At least until USB 3.0 comes out, hooray!)
Update: Oops, forgot Mini DisplayPort, Apple’s shrunken take on DisplayPort—a royalty-free video interface standard from VESA that’s also notably supported by Dell—which’ll be part of the official DisplayPort 1.2 spec. Apple licenses it for no fee, unless you sue Apple for patent infringement, which is a liiiiittle dicey. (On the other hand, we don’t see it going too far as an industry standard, which is why we forgot about it.)
That’s just a relatively quick overview of some of the standards Apple’s had a hand in, one way or another, but it should give you an idea of how important standards are, and how a company with a relatively small market share (at least in certain markets) can use them to wield a lot of influence over a much broader domain.
Shaping standards isn’t always about royalty checks or dominance—Apple’s position doesn’t allow them to be particularly greedy when it comes to determining how you watch stuff or browse the internet, broadly speaking. They’ve actually made things better, at least so far. But one glance at the iPhone app approval process should give anybody who thinks they’re the most gracious tech company second thoughts about that.
Still something you wanna know? Send questions about standards, things that are open other than your mom’s legs or Sony Ultra Memory Stick XC Duo Quadro Micro Pro II to tips@gizmodo.com, with “Giz Explains” in the subject line.