WinFast HPVC1100 is world’s first external SpursEngine encoder

Toshiba’s Cell-based SpursEngine HD video co-processor has made plenty of appearances within monstrous gaming machines, but this marks the very first time that it has stepped out of the laptop chassis and into a portable enclosure. Granted, the language barrier is killing us here, but it seems as if the Leadtek WinFast HPVC1100 wraps a SpursEngine encoder into an on-the-go solution that can be lugged around with a standard laptop to churn through video on set, in the field or on the road. Other specs include 128MB of RAM, a PCI-Express slot and a weight of 1.54 pounds; there’s no word just yet on pricing or availability.

[Via Akihabara News]

Happy 40th Birthday AMD: 4 Ways You Beat Intel in the Glory Days

AMD, the other chip company, is 40 years old today. It’s the scrappy underdog to the Intel juggernaut. These days it’s not in great shape, but at one point it was actually beating Intel on innovation.

AMD tried to kill the megahertz myth before Intel. During the Pentium 4 days, Intel kept pushing clock speeds higher and higher before it hit a wall and abandoned the Prescott architecture. The message was clearly, “more megahertz is more better.” AMD’s competing Athlon XP chips, while clocked slower, often beat their Pentium 4 rivals. Ironically, AMD was also the first to 1GHz, as some commenters have pointed out (don’t know how I forgot that). Obviously, though, AMD’s performance lead didn’t last forever.
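
The arithmetic behind the megahertz myth is simple: delivered performance is roughly instructions-per-clock (IPC) times clock speed, so a chip that does more work per cycle can win at a lower frequency. Here is a minimal sketch with purely illustrative numbers, not measured IPC figures for either chip.

```python
# Illustrative numbers only -- not measured IPC for any real Athlon XP or Pentium 4.
def instructions_per_second(clock_ghz, ipc):
    """Rough throughput estimate: clock speed times average instructions per cycle."""
    return clock_ghz * 1e9 * ipc

athlon_xp = instructions_per_second(clock_ghz=1.8, ipc=1.3)  # lower clock, more work per cycle
pentium_4 = instructions_per_second(clock_ghz=2.4, ipc=0.9)  # deep pipeline tuned for clock speed

print(athlon_xp > pentium_4)  # True: fewer megahertz, more delivered performance
```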

AMD beat Intel to 64-bit in mainstream computers. And we’re not just talking about its Opteron and Athlon 64 processors. AMD actually designed the x86-64 specification, which Intel wound up adopting and licensing—so AMD’s spec is used in Intel’s 64-bit processors to this day.
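
That lineage is still visible from software: most operating systems report the 64-bit x86 architecture as “x86_64” or “AMD64” regardless of whether the chip underneath is from Intel or AMD. A trivial check, just as an illustration:

```python
import platform

# On a 64-bit x86 machine this typically prints "x86_64" (Linux/macOS) or "AMD64" (Windows),
# whether the CPU is made by Intel or AMD -- the ISA still carries AMD's name.
print(platform.machine())
```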

AMD was first to consider energy efficiency in processor designs. Okay, this is kind of an extension of point number one, but during Intel’s Pentium 4 ’roid rage period, AMD’s processors consistently used less power than Intel’s. Intel’s performance-per-watt revelation didn’t really arrive until the Pentium M (which was actually a throwback to the P6 architecture), and that chip set the tone for Intel’s new direction in its successor, the Core line of chips.

AMD beat Intel to having an integrated memory controller. A tech feature AMD lorded over Intel for years: AMD started integrating the memory controller into its processors years ago, reducing memory latency. Intel’s first chip to use an integrated memory controller is the Core i7—before that, the memory controller sat in a separate chipset. (Here’s why Intel says they held off.)
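
Why latency matters is easy to feel even from high-level code: once a working set blows past the caches, each cache-hostile memory access pays close to a full trip to RAM. Below is a rough, hedged microbenchmark sketch; array size and timings will vary by machine, and it illustrates the general effect rather than isolating the memory controller itself.

```python
import time
import numpy as np

data = np.zeros(20_000_000, dtype=np.int64)    # ~160 MB, far larger than any CPU cache
seq_idx = np.arange(data.size)                 # reads in order (prefetch-friendly)
rnd_idx = np.random.permutation(data.size)     # same reads, cache-hostile order

def gather_time(idx):
    start = time.perf_counter()
    data[idx].sum()                            # every element fetched once; only the order differs
    return time.perf_counter() - start

print("sequential: %.3f s" % gather_time(seq_idx))
print("random:     %.3f s" % gather_time(rnd_idx))  # typically several times slower: memory latency at work
```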

Athlon XP and Athlon 64—those were the good old days, AMD’s cutthroat competitive days. The days they were ahead of Intel. I miss them—at one point, every hand-built computer in my house ran AMD processors. I felt like a rebel—a rebel with faster, cheaper computers.

Unfortunately, I don’t run AMD chips anymore. Intel came back, and came back hard. But here’s hoping for another resurgence, and another 40 years, guys. Share your favorite AMD memories in the comments.

Apple Quietly Recruits Chip Designers for In-House Tech

Apple’s recent hiring spree of chip designers reveals the company may be about to exert even more control over the components that go into its products.

The company may go so far as manufacturing computer processors in-house, according to The Wall Street Journal, which cites only anonymous sources to bolster its claim that the internally designed chips will appear in products no sooner than 2010.

The publication also points to profiles on the professional networking site LinkedIn, which list more than 100 Apple employees with prior chip experience at companies such as Intel, Samsung and Qualcomm.

These hires, coupled with Apple’s 2008 acquisition of PA Semiconductor, serve as strong evidence that the company is moving toward chip design for its hardware, including iPhones, iPods and possibly Macs. Such a move would reduce Apple’s dependence on Intel, which manufactures processors for current Mac computers, and Samsung, which provides an ARM-based microprocessor for the iPhone.

Apple has always kept a tight rein on its suppliers, going so far as acquiring them when necessary to ensure consistent access to critical components. Apple has enough clout that it was even able to negotiate with Intel — a far bigger company — to develop a smaller version of the Core 2 Duo processor for the MacBook Air.

By acquiring in-house semiconductor talent, Apple opens several options: It could more easily customize chips and chipsets from suppliers like Intel, giving Apple hardware unique features (and perhaps raising additional, hardware-based barriers to hackintosh clones — generic PCs running OS X). It could develop its own graphics processors for the iPhone and iPod touch, giving them more serious gaming chops. It could create more compact system-on-a-chip processors that would enable future iPhones (or iPhone-like devices) to be even smaller. Or, in the most ambitious case, it could develop its own CPUs.

In November, Wired.com also speculated that Apple was moving toward in-house chip manufacturing when the company hired former IBM executive Mark Papermaster. Papermaster was a key player in developing the PowerPC chips used in previous-generation Macs.

With control over processor production, Apple will be able to design exclusive features for its gadgets and better guard its secrets from rivals.

Though in-house chip manufacturing would enable Apple to tighten control over its products, technology strategist Michael Gartenberg said it’s unlikely the corporation will produce its own processors for Mac computers. He explained the move would be risky for Apple, as it would cost billions of dollars, and it would be difficult to compete with Intel.

“People have lost fortunes competing with Intel,” Gartenberg said. “It doesn’t make sense [for Apple]. You’d have to get to a point where Intel simply wasn’t able to meet Apple’s needs in any shape or form.”

Rather than producing its own computer chips, Apple is more likely hiring chip designers to produce custom chipset variants for future products, which could offer special audio and graphics enhancements exclusive to Apple gadgets, Gartenberg speculated. He added that chip experts could also lend advice on manufacturing and design processes to create smaller, thinner and lighter gadgets.

Updated 12 p.m. PDT: Added comment from technology strategist Michael Gartenberg.

See Also:

In Major Shift, Apple Builds Its Own Team to Design Chips [WSJ]

Photo: blakie/Flickr


AMD’s $69 2.8GHz Athlon X2 7850 Black Edition CPU launched, reviewed

AMD already showed us yesterday what kind of graphical prowess could be crammed into a sub-$100 GPU, and today it’s attempting to pull the same kind of stunt on the CPU front. The Athlon X2 7850 Black Edition — a 2.8GHz chip with 2MB of L3 cache and loads of overclocking potential — has just been loosed, and with a downright stunning $69 MSRP, we’d say it’ll have budget gamers across the nation paying attention. Reviewers across the web voiced their appreciation for the low price, and while the processor didn’t burn any barns down along the way, it did garner a fair amount of praise on the bench. NeoSeeker seemed to capture the general consensus with this: “the Athlon X2 7850 is a decent processor that is able to power even the latest games.” ’Course, the performance-per-watt was a bit lacking given the 65nm manufacturing process, but it’s not like you can have your cake and eat it too.

Read – NeoSeeker (“a decent processor”)
Read – HiTechLegion (“performed very well”)
Read – Guru3D (“packs decent muscle and has reasonable overclock potential”)
Read – Bit-tech (“unsurprisingly underwhelming compared to the 7750 Black Edition”)
Read – Overclocker’s Club (“impressed with the increased performance”)
Read – Benchmark Reviews (“an incredible value”)
Read – Detailed specifications
Read – AMD press release

Demand for Intel’s Atom CPUs finally beginning to cool?

It was inevitable, really — but the incessant demand for Intel’s woefully underpowered Atom processors sure did last a lot longer than we anticipated. Originally made famous by those so-called “netbooks,” the Atom is currently facing two hurdles in remaining wildly popular: 1) slumping demand for new PCs and 2) bona fide competition. For months on end, the Atom really was the only game in town when it came to powering netbooks and nettops, but with the unveiling of NVIDIA’s Ion, the promise of a GPGPU (or cGPU) and Intel’s own CULV platform, Atom’s necessity in the market is becoming less intense. The interesting part here is that Intel is purportedly hawking its inventory to “second-tier and China-based vendors” as it looks to minimize warehouse clutter, which certainly makes us hope for lower-cost lappies to show up in the near future.

Read – Atom demand slowing
Read – Intel: PC sales hit rock bottom

NVIDIA’s GT300 specs outed — is this the cGPU we’ve been waiting for?

NVIDIA’s been dabbling in the CPU space behind closed doors for years now, but with Intel finally making a serious push into the GPU realm, it’s about time the firm got serious about bringing the goods. BSN has it that the company’s next-generation GT300 will be fundamentally different from the GT200 — in fact, it’s being hailed as the “first truly new architecture since SIMD (Single-Instruction Multiple Data) units first appeared in graphical processors.” Beyond this, the technobabble runs deep, but the long and short of it is this: NVIDIA could be right on the cusp of delivering a single chip that can handle tasks typically split between the CPU and GPU, and we needn’t tell you just how much your life could change should that become a reality. Now, if only NVIDIA would come clean and lift away some of the fog surrounding it (and the rumored GTX 380), that’d be just swell.

[Thanks, Musouka]

AMD releases another notebook roadmap, does not release Fusion chips

Well, well, a new AMD roadmap promising a superior hybrid CPU/GPU chip sometime in the distant future. That doesn’t sound like the same old vaporware refrain we’ve been hearing about Fusion since 2006 at all, does it? Yep, everyone’s favorite underdog is back in the paperwork game, and this time we’ve got a sheaf of pointy-eared details on the company’s upcoming notebook plans, all culminating in the “Sabine” platform, which is wholly dependent on Sunnyvale actually shipping a mobile variant of the delayed Fusion APU in 2011 once it finds the Leprechaun City. In the meantime, look forward to a slew of forgettable laptops getting bumped to the “Danube” platform, which supports 45nm quad-core chips, DDR3-1066 memory, and an absolutely shocking 14 USB 2.0 ports. Ugh, seriously — does anyone else think AMD should suck it up, put out a cheap Atom-class processor paired with a low-end Radeon that can do reasonable HD video output, and actually take it to Intel in the booming low-end market instead of goofing around with the expensive, underperforming Neo platform and a fantasy chip it’s been promising for three years now? Call us crazy.

[Via PC Authority; thanks Geller]

Intel Core 2 Quad S-Series shaves power consumption to 65W

In a relatively hush-hush manner, Intel recently slipped out energy-saving versions of its Core 2 Quad Q8200, Q9400 and Q9550 CPUs, all of which are suffixed with a simple “s.” Put simply, these S-Series chips are built on the same 45 nanometer process technology as the regular models, and aside from TDP, all the specifications are exactly alike. The difference comes in power consumption, as the S crew sucks down just 65 watts compared to 95 watts for the standard-issue models. Tom’s Hardware had a chance to handle, benchmark and report on these new power sippers, and lucky for you, they found performance to be equal to that of the higher-power chips. Granted, you’ll have to pony up a few extra bucks in order to treat Mother Earth (and your energy bill) better, but at least we’re working down the power ladder instead of the other way around.
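
For a rough sense of what that 30-watt TDP gap is worth, here’s a back-of-the-envelope sketch; keep in mind that TDP is a design ceiling rather than a measured draw, and the usage hours and electricity price below are assumptions.

```python
# Back-of-the-envelope only: TDP is a thermal design ceiling, not measured consumption,
# and the daily hours and electricity price here are assumptions.
tdp_gap_watts = 95 - 65        # difference between the standard and S-Series parts
hours_per_day = 8              # assumed full-load usage
usd_per_kwh = 0.12             # assumed electricity price

kwh_per_year = tdp_gap_watts * hours_per_day * 365 / 1000
print(f"~{kwh_per_year:.0f} kWh/year, roughly ${kwh_per_year * usd_per_kwh:.0f} saved")
# -> ~88 kWh/year, roughly $11 saved
```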

[Via Tom’s Hardware, thanks Jonathan]

Intel to officially refresh laptop chips next week?

We had a hunch this refresh was coming, and according to information gathered by CNET, it’s all going down on Monday. The 2.53GHz SP9600, complete with its 6MB of cache memory and $316 sticker, will reportedly be revealed alongside the 1.6GHz SU9600, which will be pegged at $289. Furthermore, we should see a single-core 1.4GHz SU3500 ($262) with a thermal envelope of only 5.5 watts, which will obviously cater to those really, really low-power applications where horsepower isn’t a concern. Interestingly, these newfangled pieces of silicon won’t be those rumored CULV chips we heard about in January, as those won’t be good and ready ’til summertime. There’s also a slight chance that we’ll hear a bit more on Intel’s reemergence in the GPU field, but we’re not holding our breath quite so much on that. Dig in below for lots more, or just be patient and wait for the 30th. Totally your call.

Read – Intel CPU details
Read – Intel GPU details

Graphene chip could hit 1,000GHz, make your Core i7 feel totally inadequate

8GHz (with the help of liquid nitrogen) not quick enough? Leave it to the folks at MIT to make sure your zaniest desires are well taken care of. As research forges ahead on graphene, carbon nanotubes and buckyballs (remember those?), gurus at the university have made a breakthrough that could eventually lead to microchips that make existing silicon-based CPUs weep. In fact, the research could lead to practical systems in the 500 to 1,000 gigahertz range. The magic all ties back to advances in a graphene chip known as a frequency multiplier, and while the nitty-gritty of all this is far too complicated for the layperson to grasp, all you really need to know is this: finally, you can rest assured that you’ll one day own a chip capable of handling Duke Nukem Forever.
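
For the curious, the basic principle isn’t as impenetrable as it sounds: a frequency multiplier pushes a signal through a nonlinear element so the output contains energy at a multiple of the input frequency, and in the MIT device the graphene transistor’s ambipolar behavior reportedly acts like a full-wave rectifier, doubling the input. Here’s a minimal NumPy sketch of that principle; it’s a toy illustration, not a model of the actual chip.

```python
import numpy as np

fs = 10_000                     # sample rate, Hz
f_in = 100                      # input tone, Hz
t = np.arange(0, 1, 1 / fs)
v_in = np.sin(2 * np.pi * f_in * t)

# A full-wave-rectifying (absolute-value) nonlinearity folds the negative half-cycles up,
# so the strongest AC component of the output lands at twice the input frequency.
v_out = np.abs(v_in)

spectrum = np.abs(np.fft.rfft(v_out - v_out.mean()))   # drop DC before looking for the peak
freqs = np.fft.rfftfreq(len(v_out), 1 / fs)
print("dominant output component: %.0f Hz" % freqs[spectrum.argmax()])   # -> 200 Hz
```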

[Via InformationWeek]
