Matrox pushes eight displays with a single-slot PCIe x16 GPU

Matrox has been distancing itself from the consumer market for a while now, but even we couldn’t resist this one. Hailed as the planet’s first single-slot octal graphics card, the M9188 supports up to eight DisplayPort or single-link DVI outputs, and if you’re up for getting really crazy, you can hook up a pair to drive 16 displays from a single workstation. The card itself packs 2GB of memory and supports resolutions as high as 2,560 x 1,600 (per output), which should be just enough to create the Google Earth visualization system you’ve always dreamed of. In related news, the outfit also introduced the far weaker 1GB M9128, which can drive a grand total of two displays for $259. Oh, and as for pricing on the octal guy? Try $1,995 when it ships later this quarter.
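
For scale, a quick bit of arithmetic on the figures above shows just how much desktop one of these can push (and why that Google Earth wall isn’t a joke):

\[ 8 \times 2560 \times 1600 = 32{,}768{,}000\ \text{pixels} \approx 32.8\ \text{megapixels per card}, \qquad 2 \times 32.8 \approx 65.5\ \text{megapixels for a 16-display pair} \]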

Matrox pushes eight displays with a single-slot PCIe x16 GPU originally appeared on Engadget on Tue, 10 Nov 2009 15:27:00 EST.

NVIDIA Fermi GT300 GPU delayed until 2010?

We’re so sorry, true NVIDIA believers, but that airbrushed “I love Fermi” shirt is just gonna have to wait a little longer to see the light of day, at least according to a report from our favorite chip-centric foreign news syndicate, Digitimes. Taiwanese industry sources say the release of NVIDIA’s Fermi GT300 GPU has been delayed until fiscal 2011, which for the company means late January 2010 at the earliest. That “NVIDIA New Year” fete you’ve been gloating about on Facebook? We really hope you can get the deposit back on the rented space.

NVIDIA Fermi GT300 GPU delayed until 2010? originally appeared on Engadget on Tue, 10 Nov 2009 02:52:00 EST.

ST-Ericsson’s U8500 platform gives your next smartphone wicked 3D powers

It’s one thing for ARM to develop a potent GPU meant to add impressive 3D capabilities to devices that were previously forced to run the likes of “Snake,” but it’s another thing entirely to see a platform and semiconductor company come forward and take it one step closer to the mainstream. ST-Ericsson has done just that with its U8500 platform, which is the first to integrate ARM’s Mali-400 graphics processing unit into a solution that can be easily fitted into future phones. Think your iPhone 3GS GPU is mighty enough? Hop on past the break and mash play — it’ll make those fancy water reflections you’re currently drooling over look downright ugly.

[Via B4Tech, thanks Chris]

ST-Ericsson’s U8500 platform gives your next smartphone wicked 3D powers originally appeared on Engadget on Wed, 04 Nov 2009 17:13:00 EST.

Asustek announces a 1.1 Teraflop, Tesla GPU powered supercomputer

Some of us love nothing more than a portable and convenient netbook — something that Asustek knows all too well — but how about those of us who need real computing power? To that end, Taipei’s choice for all things ultraportable has just announced its very own 1.1-teraflop supercomputer. Dubbed the ESC 1000, this desktop-sized (albeit large) machine sports a 3.33GHz Intel LGA1366 Xeon W3580 microprocessor and three CUDA-based Tesla C1060 GPUs, the likes of which we last saw in Dell’s Precision “personal supercomputer” line. Shipping with 24GB of DDR3 DRAM (1333MHz) and a 500GB SATA II hard drive, the machine is said to carry a total cost of $14,519 over five years. We’re guessing that you’ll be able to both surf the net and watch HD-quality video on the thing, although you probably won’t be taking it along with you to Crazy Mocha any time soon. According to a company spokesperson, this thing is ready to ship now, although a launch date and street price have yet to be determined. One more pic after the break.

Asustek announces a 1.1 Teraflop, Tesla GPU powered supercomputer originally appeared on Engadget on Wed, 28 Oct 2009 01:18:00 EST.

Maingear, CyberPower and iBuyPower gaming desktops pick up ATI Radeon HD 5870

ATI’s Radeon HD 5870 GPU has already taken its rightful place within a few of Alienware’s newest desktops, but as with almost every major GPU launch, a few of the smaller guys are also taking the opportunity to offer gamers the option to pick one up inside a new rig. Maingear’s Ephex, F131, Prelude, and Dash can all be ordered up right now with the staggeringly potent graphics card, and if none of those suits your fancy, CyberPower would be more than happy to have your business. In fact, it has squeezed the DirectX 11-friendly GPU into the Gamer Xtreme 4200 (starts at $999), Gamer Xtreme 5200 (starts at $1,393) and the AMD-based Gamer Dragon 9500 (starting at $927). Still on the hunt? iBuyPower has an eerily similar trio, though its lineup starts at just $819. Hit the read links below if you feel like putting together a system for kicks, but don’t blame us when the order button presses itself.

Read – Maingear rigs
Read – CyberPower rigs
Read – iBuyPower rigs

Maingear, CyberPower and iBuyPower gaming desktops pick up ATI Radeon HD 5870 originally appeared on Engadget on Wed, 23 Sep 2009 17:24:00 EST.

ATI Radeon HD 5870 blazes onto the scene, receives approving nods

Watch out now — the Evergreen revolution has arrived, right on schedule and with the promised DirectX 11 and Eyefinity in tow. AMD’s new flagship graphics part, formerly known under the Cypress codename, is built on a 40nm process and sports an appropriately inflated 850MHz engine clock, 1600 stream processors, 153.6GBps of memory bandwidth, over two billion transistors, and the freshly minted HD 5870 moniker. There’ll be an HD 5850 as well, which makes do with a 725MHz core clock, 1440 stream processors and slightly slower (or is it just less fast?) GDDR5 memory, but only the headline device has been made available to reviewers, so let’s see what they thought.
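
For the spec-obsessed, those headline numbers hang together arithmetically: each stream processor can retire a multiply-add (two flops) per clock, and the bandwidth figure follows from a 256-bit GDDR5 bus running at an effective 4.8Gbps per pin. The bus width and memory data rate aren’t quoted above, so treat those two inputs as our assumption:

\[ 1600\ \text{SPs} \times 2\ \text{flops} \times 0.85\ \text{GHz} = 2720\ \text{GFLOPS} \approx 2.72\ \text{TFLOPS} \]
\[ (256\ \text{bits} \div 8) \times 4.8\ \text{GT/s} = 32\ \text{bytes} \times 4.8\ \text{GT/s} = 153.6\ \text{GB/s} \]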

HardOCP whipped out their special Gold Award for the occasion, noting that it “doubles performance, yet remains within the same power envelope.” The Tech Report crew agreed wholeheartedly, commending the “admirably low” power draw, noise levels and GPU temperatures. In fact, the consensus is pretty much universal: the new card spanks everything else out there in terms of performance and makes a very compelling value proposition, a significant feat for a card aimed at the usually less price-conscious enthusiast market. Hit up the read links below to revel in the full glorious details.

Full PR text:
AMD Changes the Game with ATI Radeon™ HD 5800 Series DirectX® 11-Compliant Graphics Cards, Harnessing the Most Powerful Processor Ever Created

World’s Most Advanced Graphics Processor Allows Consumers to Expand, Accelerate and Dominate Their PC Experience with First Full Support for Microsoft DirectX 11

SUNNYVALE, Calif.–(BUSINESS WIRE)–AMD (NYSE: AMD) today launched the most powerful processor ever created, found in its next-generation ATI Radeon™ HD 5800 series graphics cards, the world’s first and only to fully support Microsoft DirectX® 11, the new gaming and compute standard shipping shortly with Microsoft Windows® 7 operating system. Boasting up to 2.72 TeraFLOPS of compute power, the ATI Radeon™ HD 5800 series effectively doubles the value consumers can expect of their graphics purchases, delivering twice the performance-per-dollar of previous generations of graphics products. AMD will initially release two cards: the ATI Radeon HD 5870 and the ATI Radeon HD 5850, each with 1GB GDDR5 memory. With the ATI Radeon™ HD 5800 series of graphics cards, PC users can expand their computing experience with ATI Eyefinity multi-display technology, accelerate their computing experience with ATI Stream technology, and dominate the competition with superior gaming performance and full support of Microsoft DirectX® 11, making it a “must-have” consumer purchase just in time for Microsoft Windows® 7 operating system.

“With the ATI Radeon HD 5800 series of graphics cards driven by the most powerful processor on the planet, AMD is changing the game, both in terms of performance and the experience,” said Rick Bergman, senior vice president and general manager, Products Group, AMD. “As the first to market with full DirectX 11 support, an unmatched experience made possible with ATI Eyefinity technology, and ATI Stream technology harnessing open standards designed to help make Windows 7 that much better, I can say with confidence that AMD is the undisputed leader in graphics once more.”

Dominate your competition with Microsoft DirectX® 11 support

With the ATI Radeon™ HD 5800 series of graphics cards, gamers will enjoy gaming supremacy and the ultimate advantage, realizing incredible HD gaming performance and the most engaging experience possible with DirectX® 11 gaming done right:

* Designed and built for purpose: Modeled on the full DirectX 11 specifications, the ATI Radeon HD 5800 series of graphics cards delivers up to 2.72 TeraFLOPS of compute power in a single card, translating to superior performance in the latest DirectX 11 games, as well as in DirectX 9, DirectX 10, DirectX 10.1 and OpenGL titles in single card configurations or multi-card configurations using ATI CrossFireX™ technology. When measured in terms of performance experienced in some of today’s most popular games, the ATI Radeon HD 5800 series is up to twice as fast as the closest competing product in its class, allowing gamers to enjoy incredible new DirectX 11 games – including the forthcoming DiRT™ 2 from Codemasters and Aliens vs. Predator™ from Rebellion, and updated versions of The Lord of the Rings Online™ and Dungeons and Dragons Online® Eberron Unlimited™ from Turbine – all in stunning detail with incredible frame rates.
* Generations ahead of the competition: Building on the success of the ATI Radeon™ HD 4000 series products, the ATI Radeon HD 5800 series of graphics cards is two generations ahead of DirectX 10.0 support, and features 6th generation evolved AMD tessellation technology, 3rd generation evolved GDDR5 support, 2nd generation evolved 40nm process technology, and a feature-rich compute shader, all geared towards delivering the best gaming experience money can buy.
* The ultimate in game compatibility: The DirectX 11 API was developed on AMD graphics hardware and represents the cornerstone of DirectX 11 gaming. All initial DirectX 11 games were developed and/or continue to be developed on AMD DirectX 11 hardware. With more than 20 DirectX 11 games currently in development, this innate optimization for ATI Radeon graphics cards, in combination with monthly ATI Catalyst™ driver releases, helps ensure a stable, reliable and high-performance experience for the latest games.

Accelerate with ATI Stream technology

With the ATI Radeon HD 5800 series of graphics cards, PC users can unleash Windows 7 and realize the potential of a better computing experience to help do more with their PC:

* Harness the home supercomputer: One ATI Radeon HD 5870 graphics card would have been one of the top 10 supercomputers in the world just six years ago – today that same processing power can be found in your home PC, working with high-performance CPUs to deliver a superior experience.
* Windows® 7 done right: Windows 7 is the first compute-capable operating system, and the ATI Radeon HD 5800 series of graphics cards with ATI Stream technology accelerates it like nothing else, being the first and only cards to support DirectCompute 11.
* Create and do more, faster than ever before with ATI Stream technology: Enjoy new features, functionality and improved performance in top media, entertainment and productivity applications made possible by ATI Stream technology.
* Most expansive support of industry standards: The ATI Radeon HD 5800 series of graphics cards fully supports both DirectX 11 and OpenCL, ensuring broad application support now and in the future.

Expand the PC experience with ATI Eyefinity multi-display technology

Enjoy multi-monitor computing with seamless enablement of the biggest game environments ever seen:

* The ultimate in seamless flexibility: Arrange one to three displays using the ATI Radeon™ HD 5870 and ATI Radeon™ HD 5850 graphics cards, or up to six displays using the forthcoming ATI Radeon™ HD 5870 Eyefinity6 graphics card, in a variety of configurations – any mix of portrait or landscape.
* See them before they see you: Unlock the potential of multi-monitor gaming at up to 12 x full HD resolution, the largest game environments ever displayed. Experience more visual detail and expanded battlefields that your gaming competitors may lack.
* Enjoy visual computing in eye-definition: Make scrolling virtually obsolete by taking advantage of vast desktop real estate to put more information at your fingertips. Enjoy the best of today’s latest visually-enhanced online applications – social networking, video conferencing, video entertainment, and satellite imagery – all in stunning detail.

Ecosystem support

* The ATI Radeon™ HD 5800 series of graphics cards is supported by a dozen add-in-board companies, including ASUS, Club 3D, Diamond Multimedia, Force3D, GIGABYTE, HIS (Hightech Information Systems), MSI, Multimedia, PowerColor, SAPPHIRE Technology, VisionTek and XFX.

Supporting Quotes

“By incorporating the ATI Radeon™ HD 5870 graphics processor’s revolutionary DirectX 11 and ATI Eyefinity multi-monitor capabilities into the Alienware desktop gaming system, Dell Gaming continues to lead the industry in delivering performance, immersion and visual experience levels that shatter all previous limitations,” said Arthur Lewis, head of Dell gaming group.

“I had high expectations of AMD’s new DirectX 11 GPUs, but nothing really prepared me for the breathtaking experience that I’m now enjoying,” said Dirk Ringe, vice president, EA Phenomic. “Frame rates are so silky-smooth at ultra high-resolutions, even with all effects turned to max, that the new hardware makes previous hardware look like a quaint antique! The quality of the rendering in BattleForge is something that I used to dream about only a year ago – and the flexibility and power of DirectCompute 11 opens our eyes to a multitude of new possibilities. We applaud AMD’s and Microsoft’s vision in creating the DirectX 11 API and this amazing new hardware and we can say without hesitation that it represents the future of gaming.”

“We were simply astonished by the performance of the DirectCompute 11 hardware in AMD’s DirectX 11 GPUs,” said Ruslan Didenko, project lead, GSC Gameworld. “By meeting the full DirectX 11 hardware spec AMD has created a beast of a GPU that is light years ahead of its DirectX 10.1 and DirectX 10 predecessors. We strongly recommend a full-on DirectX 11 GPU from AMD as very simply the best way to experience our stunning new game, S.T.A.L.K.E.R.: Call of Pripyat. A vision of loveliness, in every gut-wrenching detail!”

“Trinigy remains committed to supporting the game development industry with top-notch game engine technology that combines efficiency, creative freedom and performance,” said Dag Frommhold, managing director at Trinigy. “We’re extremely excited to be working with AMD to support their DirectX 11 graphics processors. AMD’s quality drivers and hardware complement our commitment to game developers perfectly by empowering them to produce higher-level in-game graphics than ever before.”

Read – Hot Hardware review
Read – AnandTech review
Read – Driver Heaven review
Read – HardOCP review
Read – Hexus review
Read – PC Perspective review
Read – Tech Report review
Read – Legit Reviews

ATI Radeon HD 5870 blazes onto the scene, receives approving nods originally appeared on Engadget on Wed, 23 Sep 2009 04:17:00 EST.

Eyes-on with Intel’s Pine Trail CPU/GPU hybrid and new Gulftown gaming chip (update)

See that tiny little thing? That’s not just a CPU, it’s Intel’s next-gen Atom Pine Trail CPU / GPU hybrid, and it’s set to pop up in all sorts of devices here at IDF. Intel was demoing it in a nettop running 480p video, but they assured us it was capable of 1080p playback — we’ll believe it when we see it, obviously. Also on display here at IDF: demo machines running 32nm Arrandale chips with Intel’s Clear HD video playback system, and the next-gen Gulftown gaming chip, which has six cores and will slot right into your X58 mobo sometime next year to provide more power than you could possibly need. Check it all out in the gallery!

Update:
So the first Intel rep we spoke to was a little confused — Pine Trail only supports 480p playback, although it can apparently do 720p if pushed. Native HD isn’t on Intel’s roadmap until the next generation of these chips, so if you want HD right now, Intel’s pointing manufacturers to the Broadcom Crystal HD video accelerator, which usually ends up costing about $30 extra at retail. So to recap: Intel’s integrating graphics into its CPU dies, but in order to play back HD content, you still need a separate video processor to handle the decoding. How very efficient.

Eyes-on with Intel’s Pine Trail CPU/GPU hybrid and new Gulftown gaming chip (update) originally appeared on Engadget on Tue, 22 Sep 2009 14:06:00 EST.

OTOY uses AMD GPUs, black magic to put Crysis on iPhone

No need to dust off your spectacles — Crysis on the iPhone has been achieved. Just last week we took a peek at the graphical enhancements on the iPhone 3GS, but this demonstration didn’t rely on the factory goods from Apple. Instead, a recent OTOY demonstration put some of AMD’s newest GPU technology to use in order to play one of the leading-edge 3D titles on a smartphone. In short, OTOY renders the game on remote servers and then streams the output to the recipient; needless to say, an HDTV displayed all sorts of artifacts, but on a screen that’s just a few inches across, those flaws become invisible. So, is this really the killer app to supplant Apple’s own App Store for gaming on the iPhone? We get the feeling OTOY needs at least a few clean-cut commercials with little-known underground music before they can bank on that.
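
OTOY hasn’t published how its pipeline actually works, so treat the following as a rough sketch of the general cloud-rendering loop rather than the real thing. It’s a self-contained C mock-up of one server-side tick; render_frame, encode_frame and send_frame are hypothetical placeholders standing in for the actual GPU renderer, video encoder and network transport.

#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define WIDTH  480   /* a small target screen is what hides the artifacts */
#define HEIGHT 320

/* Placeholder: a real server would render the game on an AMD GPU. */
static void render_frame(uint8_t *rgb, int frame) {
    memset(rgb, frame & 0xFF, WIDTH * HEIGHT * 3);  /* dummy pattern */
}

/* Placeholder: a real server would run a hardware video encoder here. */
static size_t encode_frame(const uint8_t *rgb, uint8_t *out) {
    size_t n = (size_t)WIDTH * HEIGHT * 3 / 50;  /* pretend ~50:1 compression */
    memcpy(out, rgb, n);
    return n;
}

/* Placeholder: a real server would push packets to the phone over the net. */
static void send_frame(const uint8_t *buf, size_t n, int frame) {
    (void)buf;
    printf("frame %d: sent %zu compressed bytes\n", frame, n);
}

int main(void) {
    static uint8_t rgb[WIDTH * HEIGHT * 3];     /* raw framebuffer */
    static uint8_t packet[WIDTH * HEIGHT * 3];  /* compressed output */
    for (int frame = 0; frame < 5; frame++) {   /* five ticks for the demo */
        render_frame(rgb, frame);
        size_t n = encode_frame(rgb, packet);
        send_frame(packet, n, frame);
        /* a real loop would pace itself to the display's frame rate
           and read touch input back from the client */
    }
    return 0;
}

The point of the architecture shows even in the stub: the phone never touches the game’s geometry or textures, it just decodes a video stream sized for its own little screen.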

[Via SlashGear]

OTOY uses AMD GPUs, black magic to put Crysis on iPhone originally appeared on Engadget on Mon, 14 Sep 2009 21:04:00 EST.

Giz Explains: Why Tech Standards Are Vital For Apple (And You)

Tech standards are important. They’re, well, standards. They shape the way the world works, ideally. So if you wanna influence your little world, you probably wanna shape (or maybe even create) standards. Take Apple, for example.

They Call It “Open” For a Reason
One of the more excellent aspects of Snow Leopard, actually, is its full-scale deployment of OpenCL 1.0—Open Computing Language—a framework that allows programmers to more easily utilize the full power of mixes of different kinds of processors like GPUs and multi-core CPUs. (Much of the excitement for that is in leveraging the GPU for non-graphical applications.)

OpenCL lives up to its name: it is a royalty-free open standard managed by the Khronos Group, and supported by AMD/ATI, Apple, ARM, IBM, Intel and Nvidia, among others. The interesting thing about this open industry standard is that it was developed and proposed by… Apple.
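
To make that “mixes of different kinds of processors” idea concrete, here’s a minimal OpenCL 1.0 host program in plain C. It compiles a tiny vector-add kernel at runtime and runs it on a GPU if the platform has one, quietly falling back to the CPU otherwise; error handling is stripped for brevity, so consider it a sketch rather than production code.

/* build: cc vadd.c -framework OpenCL   (Mac OS X)
          cc vadd.c -lOpenCL            (elsewhere) */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

/* Kernel source is OpenCL C, compiled at runtime for whichever device runs it. */
static const char *src =
    "__kernel void vadd(__global const float *a,\n"
    "                   __global const float *b,\n"
    "                   __global float *c) {\n"
    "    int i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    cl_platform_id plat;
    cl_device_id dev;
    cl_int err;
    clGetPlatformIDs(1, &plat, NULL);
    /* Ask for a GPU first; fall back to the CPU if there isn't one. */
    if (clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL) != CL_SUCCESS)
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_CPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);

    /* Device-side buffers, with inputs copied from host memory. */
    cl_mem ba = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof a, a, &err);
    cl_mem bb = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof b, b, &err);
    cl_mem bc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, &err);

    /* Build the kernel for the chosen device and bind its arguments. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", &err);
    clSetKernelArg(k, 0, sizeof ba, &ba);
    clSetKernelArg(k, 1, sizeof bb, &bb);
    clSetKernelArg(k, 2, sizeof bc, &bc);

    /* Launch N work-items, then block on reading the result back. */
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, bc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[10] = %g (expect 30)\n", c[10]);  /* 10 + 2*10 */
    return 0;
}

The same kernel string runs unmodified on either device type; that single-source portability across CPUs and GPUs is the whole pitch.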

What Is a Standard?
By “standard,” we’re talking about a format, interface or programming framework that a bunch of companies or people or organizations agree is the way something’s going to get done, whether it’s how a movie is encoded or the way websites are programmed. Otherwise, nothing works. A video that plays on one computer won’t play on another, websites that work in one browser don’t work in another, etc. With increased connectedness between different machines and different platforms, standards are increasingly vital to progress.

Standards can range from open (anybody can use them, for free) to open with conditions (anybody can use them as long as they follow conditions X, Y and Z) to closed (you gotta have permission, and most likely, pay for it). Some companies view standards strictly as royalty machines; others don’t make much money on them, instead using them to make sure developers do things the way they want them to. Apple falls into this latter category, by choice or possibly just by fate.

Kicking the Big Guy in the Shins
Of course, OpenCL isn’t the only open standard that Apple’s had a hand in creating or supporting that actually went industry-wide. When you’re the little guy—as Apple was, and still is in computer OS marketshare, with under 10 percent—having a hand in larger industry standards is important. It keeps your platform and programming goals from getting steamrolled by, say, the de facto “standards” enforced by the bigger guy who grips 90 percent of the market.

If you succeed in creating a standard, you’re making everybody else do things the way you want them done. If you’re doubting how important standards are, look no further than the old Sony throwing a new one at the wall every week hoping it’ll stick. Or Microsoft getting basically everybody but iTunes to use its PlaysForSure DRM a couple years ago. Or its alternative codecs and formats for basically every genuine industry standard out there. To be sure, there is money to be made in standards, but only if the standard is adopted—and royalties can be collected.

Web Standards: The Big Headache
The web has always been a sore spot in the standards debate. The web is a “universal OS,” or whatever the cloud-crazy pundits call it, but what shapes your experience is your browser and, in part, how compliant it is with the tools web developers use to build their products. Internet Exploder shit all over standards for years, and web programmers still want IE6 to die in a fiery eternal abyss.

Enter WebKit, an open source browser engine developed by Apple based on the KHTML engine. It’s so standards-compliant that it tied with Opera’s Presto engine to be the first to pass the Acid3 test. What’s most striking about WebKit isn’t that it powers Safari and Google Chrome on the desktop, but that it’s behind basically every full-fledged smartphone browser: iPhone, Android, Palm Pre, Symbian and (probably) BlackBerry. So WebKit hasn’t just driven web standards through its strict adherence to them; it has essentially defined, for now, the way the “real internet” is viewed on mobile devices. All of the crazy cool web programming you see now is made possible by standards-compliant browsers.

True, OpenCL and WebKit are open source (Apple’s been clever about the way it uses open source; look no further than the guts of OS X), but Apple is hardly devoted to the whole “free and open” thing, even when it comes to web standards.

All the AV Codecs You Can Eat
The recent debate over video in the next set of web standards, known collectively as HTML5, shows that: Mozilla supports the open-source Ogg Theora video codec, but Apple says it’s too crappy to become the web’s default video standard (the one that’s supposed to free everyone from the tyranny of Adobe’s Flash). In Apple’s view, Ogg’s quality and hardware acceleration support don’t match up to those of the Apple-backed H.264 codec, standardized as part of MPEG-4, which is in turn tied up by license issues that keep it from being freely distributed and open. (Google is playing it up the middle for the moment: while it has doubts about the performance of Ogg Theora, Chrome has built-in support for both it and H.264.)

Apple has always been a booster of MPEG’s H.264 codec, which is the default video format supported by the iPhone (part of the reason YouTube re-encoded all of its videos, actually) and gets hardware acceleration in QuickTime X with Snow Leopard. H.264 is basically becoming the video codec (it’s in Blu-ray, people use it for streaming, etc.).

Why would Apple care? It means Microsoft’s WMV didn’t become the leading standard.

A sorta similar story with AAC, another MPEG standard. It’s actually the successor to MP3, with better compression quality and, unlike MP3, no royalties on distributing content, but Apple had the largest role in making it mainstream by adopting it as the preferred audio format for the iPod and iTunes Store. (It saw some limited use in portables a little earlier, but it didn’t become basically mandatory for audio players to support it until after the iPod.) Another bonus, besides AAC’s superiority to MP3: Microsoft’s WMA, though popular for a while, never took over.

FireWire I Mean i.LINK I Mean IEEE 1394
Speaking of the early days of the iPod, we can’t leave out FireWire, aka IEEE 1394. As with OpenCL, Apple did a lot of the initial development work (Sony, IBM and others did a lot of work on it as well), presented it to a larger standards body, the Institute of Electrical and Electronics Engineers, and it became the basis for a standard. They tried to charge a royalty for it at first, but that didn’t work out. It’s a successful standard in a lot of ways (I mean, it’s still on a lot of stuff, like hard drives and camcorders), but USB has turned out to be more universal, despite being technically inferior. (At least until USB 3.0 comes out, hooray!)

Update: Oops, we forgot Mini DisplayPort, Apple’s shrunken take on DisplayPort (a royalty-free video interface standard from VESA that’s also notably supported by Dell), which’ll be part of the official DisplayPort 1.2 spec. Apple licenses it for no fee, unless you sue Apple for patent infringement, which is a liiiiittle dicey. (On the other hand, we don’t see it going too far as an industry standard, which is why we forgot about it.)

That’s just a relatively quick overview of some of the standards Apple’s had a hand in, one way or another, but it should give you an idea of how important standards are, and how a company with a relatively small marketshare (at least, in certain markets) can use them to wield a lot of influence over a much broader domain.

Shaping standards isn’t always for royalty checks or dominance—Apple’s position doesn’t allow them to be particularly greedy when it comes to determining how you watch stuff or browse the internet broadly. They’ve actually made things better, at least so far. But, one glance at the iPhone app approval process should give anybody who thinks they’re the most gracious tech company second thoughts about that.

Still something you wanna know? Send questions about standards, things that are open other than your mom’s legs or Sony Ultra Memory Stick XC Duo Quadro Micro Pro II to tips@gizmodo.com, with “Giz Explains” in the subject line.

Ask Engadget: Best ultraportable laptop for gaming?

We know you’ve got questions, and if you’re brave enough to ask the world for answers, here’s the outlet to do so. This week’s Ask Engadget question is coming to us from Ron, who would just be happy with an ultraportable with an actual, bona fide, worthwhile GPU.

“I am looking for a 12- or 13-inch ultraportable that can also play modern games at a reasonable level, for less than $1,000. I know the brainiacs out there can help me out. Love the site, thanks!”

We know for sure that Dell’s Studio XPS 13 has the guts to pull off a few modern titles, but there are far more options out there than that. So, who here has a super small laptop with a discrete GPU worth bragging about? Don’t hold back now, vaquero.

Ask Engadget: Best ultraportable laptop for gaming? originally appeared on Engadget on Thu, 27 Aug 2009 21:03:00 EST.
