ECS plans a trio of netbooks, duo of all-in-one PCs for Computex

Oh look, it's nearly time for Computex, which means it's finally time for ECS to come out to play again. For whatever reason, the aforesaid PC maker seems to pull out all the stops only for Taiwan's biggest consumer electronics show, and with the doors opening early next week, we're getting a sneak peek at what it'll be bringing to the mix. Not surprisingly, three of the five new machines are said to be of the netbook variety, with the other two being all-in-one desktops. 'Course, the whole lot will be humming along on Intel's all-too-modest Atom, though we're led to believe that at least one rig will be equipped with NVIDIA's promising Ion technology. The T10IL (shown left) is apt to steal most of the attention, boasting a thin-and-light frame that looks awfully similar to ASUS' Eee PC 1008HA. The V10IL (shown right) is expected to be more of a vanilla machine in terms of both design and specification, and the remaining models are slated to be revealed at the show. You're tense with anticipation, aren't you?

ASUS Mars GPU weds twin GeForce GTX 285s, might just melt your face

You into frame rates? No, we mean are you frickin' bonkers over watching your rig hit triple digits in a Crysis timedemo? If you're still nodding "yes," have a gander at what'll absolutely have to be your next buy. The ASUS Mars 295 Limited Edition is quite the unique beast, rocking a pair of GTX 285 chips that Windows sees as a single GeForce GTX 295. All told, you're looking at 480 shader processors (240 per GPU), a 512-bit GDDR3 memory interface, 32 memory chips in total and 4GB of RAM. Amazingly, the card is fully compatible with existing drivers and is Quad-SLI capable, and if all goes to plan, it'll poke its head out at Computex next week. Rest assured, we'll do everything we can to touch it.

Alienware’s M17X gaming laptop with twin GTX 280M GPUs truly is all powerful

The announcement wasn't scheduled for a few more days — four, according to the teaser site — but it looks like Alienware's All Powerful gaming laptop has been set free anyway. So, does it live up to the clues? Pretty much... how does a pair of 1GB NVIDIA GeForce GTX 280M GPUs strike you? There's no Core i7 listed; instead, we're looking at a quad-core Core 2 Extreme CPU at the top end, with up to 8GB of 1333MHz DDR3 memory and either 1TB of 7,200RPM disk storage or a 512GB SSD if you prefer. RAID 1 or RAID 0? Sure. Rounding things out are a nine-cell battery of unstated capacity, FireWire, four USB ports, eSATA, an ExpressCard slot, 802.11n WiFi, an 8-in-1 media card reader, a dual-layer Blu-ray drive, a 1920 x 1200 edge-to-edge LCD, and DisplayPort and HDMI outputs, all wrapped up in a massive chassis that weighs 11.68 pounds and measures 15.98 x 12.65 x 2.11 inches. It's also packing a GeForce 9400M G1 GPU with HybridPower technology, which lets you scale the graphics back to conserve battery power. Prices start at $1,799, though that entry-level configuration includes a good deal less than what we've detailed above.

As a footnote to the details above, PCWorld also says that Alienware will use next week's E3 show to update us on the 42.8-inch curved monitor we went hands-on with back in January 2008.

[Thanks, Steve]

Lenovo’s Ion-powered IdeaPad S12 shows HD prowess on video

Lenovo did itself a solid by beating the likes of Acer and ASUS to the punch with its Ion-infused IdeaPad S12, and now we're beginning to see a few more details on what its performance will be like. We still wish the machine had something a bit more powerful than a 1.6GHz Intel Atom N270, but despite hobbling along on an aged CPU, it seems to handle 3D gaming and 1080p content with poise. The crew over at Notebooks spent a little quiet time with a pre-production version of the S12 and even shot a few videos while the machine was kicking out content that would make the typical netbook buckle. Feel free to hit the read link to have a look yourself, and be sure to take note of the three American SKUs. Here's a preview: the Ion-powered version (read: the one you want) will run $499.99 and include 1GB of RAM, a 6-cell battery and a 160GB hard drive.

[Via GottaBeMobile]

Lenovo’s $449 IdeaPad S12 now official: first netbook with NVIDIA’s Ion chipset

We’ve seen NVIDIA’s Ion placed within a nettop, a motherboard, and now (at long last), a laptop. Yep, the machine you’re inevitably peering at above (Lenovo’s S12) is both the company’s first 12.1-inch netbook and the planet’s first netbook with Ion baked in, and it’s likely just a snippet of the kind of material we can expect to see at Computex. Frankly, this is one of the first netbooks in ages that has managed to get our blood moving, with a 100-percent full-size keyboard, the promise of 1080p video playback, a sub-3 pound weight and a starting tag of just $449. Other specs include a WXGA (1,280 x 800) resolution LED-backlit panel, Intel’s 1.6GHz Atom CPU (the one big “ugh”), 1GB of DDR2 memory, 160/250/320GB HDD options, an optional 6-cell battery, Ethernet jack, WiFi, Bluetooth, three USB 2.0 sockets, an ExpressCard slot, a 4-in-1 card reader and VGA / HDMI outputs. Thankfully you’ll find Windows XP running the show, and you’ll be able to grab your own starting next month. Full release is after the break.

Lenovo’s ION-based S12 makes netbooks exciting again (update: less than $600)

We don't have a lot of information at the moment, but CNET is reporting that Lenovo is getting ready to launch the world's first Ion-based netbook. While CNET doesn't give it a model number, the filename used on the image reads "LenovoS12netbook," and the machine is said to pack discrete-class graphics courtesy of NVIDIA's Ion chipset. Technically, the 12-inch laptop is too big for netbook classification and too chubby to be a CULV thin-and-light. Then again, those are classifications of Intel's making, which don't mean a whole lot to the boys from NVIDIA.

Update: PC Perspective has additional details about the S12: a 1.6GHz Atom N270 processor, a full-size keyboard and what looks like an HDMI output, with a launch expected in July or August for less than $600.

Update 2: Interesting. Netbooknews.de, which has a proven record with insider netbook news, claims that the S12 will eventually include a VIA Nano processor option.

[Via PCPer]

CE-Oh no he didn’t! Part LX: NVIDIA calls Intel’s single-chip Atom pricing “pretty unfair”

Now that Intel’s been slapped with a record $1.45b antitrust fine in Europe, it seems like the claws are coming out — AMD just put up that “Break Free” site, and today we’ve got NVIDIA CEO Jen-Hsun Huang calling Chipzilla’s Atom pricing “unfair.” It seems that Intel sells the standard Atom chip for $45 on its own, but bundles the diminutive CPU into the oh-so-familiar netbook configuration for just $25, meaning NVIDIA’s Ion chipset isn’t price-competitive. Of course, this is just another twist in the endless argument about Ion, but despite the denials, this isn’t the first time we’ve heard whispers that Intel pretty much forces manufacturers to buy complete Atom chipsets — the dearth of Ion-powered netbooks in the market is fairly suspicious considering the GeForce 9400M at the heart of the platform is a well-known quantity. On the other hand, we’ve also heard this is all going to change and change dramatically at Computex next month, so we’ll see — either way, things are bound to get interesting.

[Via TrustedReviews]

eMachines delivers EL1300 line of small form factor PCs

Once the laughing stock of the PC world, eMachines has managed to pull together some rather stylish-looking rigs over the past few months. As the comeback continues, the company has outed two new mini PCs in its EL1300 line: the $298 EL1300G-01w and the $398 EL1300G-02w. Both systems share a chassis that's 10.7 inches tall, 4.2 inches wide and 15 inches long (not exactly "mini" in our books...), and while the power ain't anything to write home about, it should handle word processing and the occasional YouTube video just fine. Speaking of specs, both rigs boast a 1.6GHz AMD Athlon 2650e CPU, NVIDIA's GeForce 6150SE integrated graphics, a 160GB SATA HDD, an 18x SuperMulti DVD burner, nine USB 2.0 sockets and a multicard reader. Personally, we'd select the pricier of the two, as that one arrives with a 20-inch LCD (E202H) and Windows XP rather than Vista Home Basic. Totally your call though, boss.

Intel’s Medfield Project May, May Not Go Into Smartphones

It’s all very wink wink, nudge nudge, hush hush, but the odor that Intel is giving off in this Fortune article about the Medfield project is that Intel’s trying to shrink x86 down to smartphones.

Intel's roadmap looks like this: right now it has Atom, which powers many of the netbooks on the market today. Next comes Moorestown, which is supposed to be like the Atom but split across two chips, a low-power solution whose second chip can be customized for whatever gadget a client shoves it into. Moorestown isn't quite small enough for smartphones, but Intel's saying Medfield, the part that follows Moorestown, may be.

There's a lot of hinting but not a lot of outright declaration here, so it's not certain that Medfield will be able to fit into something the size of an iPhone, a Pre or an Android handset. What Intel is saying is that its chips can fit into something the size of a UMPC, a MID or a large PMP—territory that Nvidia's Tegra and Qualcomm's Snapdragon are aiming for as well.

The timeline for Medfield is 2011-ish, so there's a while yet before anything materializes. But if Intel does somehow find a way to get its system-on-a-chip into your phone, that would mean bigger OSes and more laptop-like performance. We'll see. [Fortune]

Giz Explains: GPGPU Computing, and Why It’ll Melt Your Face Off

No, I didn't stutter: GPGPU—general-purpose computing on graphics processing units—is what's going to bring hot, screaming gaming GPUs to the mainstream, with Windows 7 and Snow Leopard. Finally, everybody's face melts! Here's how.

What a Difference a Letter Makes
GPU sounds—and looks—a lot like CPU, but they’re pretty different, and not just ’cause dedicated GPUs like the Radeon HD 4870 here can be massive. GPU stands for graphics processing unit, while CPU stands for central processing unit. Spelled out, you can already see the big differences between the two, but it takes some experts from Nvidia and AMD/ATI to get to the heart of what makes them so distinct.

Traditionally, a GPU does basically one thing: speed up the processing of the image data that you end up seeing on your screen. As AMD Stream Computing Director Patricia Harrell told me, GPUs are essentially chains of special-purpose hardware designed to accelerate each stage of the geometry pipeline, the process of matching image data or a computer model to the pixels on your screen.

GPUs have a pretty long history—you could go all the way back to the Commodore Amiga, if you wanted to—but we're going to stick to fairly recent history. That is, the last 10 years, when, according to Nvidia's Sanford Russell, GPUs started adding cores to spread the workload out. See, graphics calculations—the calculations needed to figure out which pixels to display on your screen as you snipe someone's head off in Team Fortress 2—are particularly suited to being handled in parallel.

To illustrate the difference between a traditional CPU and a GPU, Nvidia's Russell offered this example: if you were looking for a word in a book and handed the task to a CPU, it would start at page 1 and read all the way to the end, because it's a "serial" processor. It would be fast, but it would take time because it has to go in order. A GPU, which is a "parallel" processor, "would tear [the book] into a thousand pieces" and read them all at the same time. Even if each individual word is read more slowly, the book may be finished sooner, because the words are read simultaneously.

All those cores in a GPU—800 stream processors in ATI’s Radeon 4870—make it really good at performing the same calculation over and over on a whole bunch of data. (Hence a common GPU spec is flops, or floating point operations per second, measured in current hardware in terms of gigaflops and teraflops.) The general-purpose CPU is better at some stuff though, as AMD’s Harrell said: general programming, accessing memory randomly, executing steps in order, everyday stuff. It’s true, though, that CPUs are sprouting cores, looking more and more like GPUs in some respects, as retiring Intel Chairman Craig Barrett told me.
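
To make that "same calculation over a whole bunch of data" idea concrete, here's a minimal, hypothetical sketch in CUDA (one of the GPGPU frameworks covered below). The function and variable names are made up for illustration, not taken from any real product, but the pattern is the standard one: the CPU version walks an array one element at a time, while the GPU version hands each element to its own thread so thousands of them are crunched at once.

```cuda
#include <cuda_runtime.h>

// Serial CPU version: one core walks the array in order, one element at a time.
void brighten_cpu(const float* in, float* out, int n, float gain) {
    for (int i = 0; i < n; ++i)
        out[i] = in[i] * gain;   // same math, repeated n times in sequence
}

// Data-parallel GPU version: each thread handles exactly one element,
// so the same math runs across thousands of elements simultaneously.
__global__ void brighten_gpu(const float* in, float* out, int n, float gain) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element index
    if (i < n)                                      // guard against surplus threads
        out[i] = in[i] * gain;
}
```

Launching the kernel with something like brighten_gpu<<<(n + 255) / 256, 256>>>(...) spins up one thread per element, which is Russell's "tear the book into a thousand pieces" analogy expressed in code.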

Explosions Are Cool, But Where’s the General Part?
Okay, so the thing about parallel processing—using tons of cores to break stuff up and crunch it all at once—is that applications have to be programmed to take advantage of it. It’s not easy, which is why Intel at this point hires more software engineers than hardware ones. So even if the hardware’s there, you still need the software to get there, and it’s a whole different kind of programming.

Which brings us to OpenCL (Open Computing Language) and, to a lesser extent, CUDA. They’re frameworks that make it way easier to use graphics cards for kinds of computing that aren’t related to making zombie guts fly in Left 4 Dead. OpenCL is the “open standard for parallel programming of heterogeneous systems” standardized by the Khronos Group—AMD, Apple, IBM, Intel, Nvidia, Samsung and a bunch of others are involved, so it’s pretty much an industry-wide thing. In semi-English, it’s a cross-platform standard for parallel programming across different kinds of hardware—using both CPU and GPU—that anyone can use for free. CUDA is Nvidia’s own architecture for parallel programming on its graphics cards.
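
To give a feel for what that kind of programming actually looks like, here's a rough, hypothetical end-to-end sketch in CUDA (CUDA rather than OpenCL simply because it's the more compact of the two to show; nothing here comes from the article itself). The steps are the standard ones these frameworks involve: copy data into the graphics card's own memory, run a tiny kernel across every element at once, then copy the results back.

```cuda
#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>

// GPU kernel: y[i] = a * x[i] + y[i], with one array element per thread.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;                   // about a million elements
    const size_t bytes = n * sizeof(float);

    // Ordinary buffers in system RAM.
    float* x = (float*)malloc(bytes);
    float* y = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // The GPU has its own memory, so the framework makes you allocate
    // buffers there and copy the inputs across before any work happens.
    float *d_x, *d_y;
    cudaMalloc(&d_x, bytes);
    cudaMalloc(&d_y, bytes);
    cudaMemcpy(d_x, x, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_y, y, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements at once.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, d_x, d_y);

    // Copy the result back to system RAM and spot-check one value.
    cudaMemcpy(y, d_y, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f (expect 4.0)\n", y[0]);

    cudaFree(d_x); cudaFree(d_y);
    free(x); free(y);
    return 0;
}
```

OpenCL expresses the same steps with more setup boilerplate, but it runs across AMD, Nvidia, Intel and even plain CPUs, which is exactly why it matters for Snow Leopard and Windows 7.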

OpenCL is a big part of Snow Leopard. Windows 7 will use some graphics card acceleration too (though we’re really looking forward to DirectX 11). So graphics card acceleration is going to be a big part of future OSes.

So Uh, What’s It Going to Do for Me?
Parallel processing is pretty great for scientists. But what about regular people? Does it make their stuff go faster? Not everything, and to start, it's not going too far from graphics, since that's still the easiest work to parallelize. But converting, decoding and creating videos—stuff you're probably doing more of now than you did a couple of years ago—will improve dramatically soon. Say bye-bye to 20-minute renders. Ditto for image editing; there'll be less waiting for effects to propagate across giant images (Photoshop CS4 already uses GPU acceleration). In gaming, beyond straight-up graphical improvements, physics engines can get more complicated and realistic.

If you’re just Twittering or checking email, no, GPGPU computing is not going to melt your stone-cold face. But anyone with anything cool on their computer is going to feel the melt eventually.