Apple Assembles Chip Design Team, Plans Custom iPhone Guts

The WSJ reports that Apple is putting together an all-star chip design team, starting with the former CTO of AMD, to work on in-house units for mobile devices. Old habits die hard.

Apple is prone to occasional fits of vertical integration, and has never been terribly reluctant to run counter to the prevailing hardware winds, but this doesn’t sound like some Jobsian act of contrarianism. The report indicates that it’s the iPhone’s unique power and performance demands that are driving this move, at least ostensibly:

Apple could use the internally developed chips to sharply reduce the power consumption of its hit iPhone and iPod touch devices, and possibly add graphics circuitry to help its hardware play realistic game software and high-definition videos, people familiar with its plans say.

Apple already works with Samsung, the manufacturer of the ARM-based processors used in the iPhone and iPod touch, to design chips suited to its specific needs, and Apple is a large enough customer that it has no trouble coaxing tailor-made hardware out of its suppliers. But fully in-house chip design boasts the huge advantage of secrecy; removing Samsung from the equation ensures that any power-saving, graphics-boosting chip features Apple manages to conjure for its next iWhatever don't eventually find their way into units available to other industry giants like HTC or RIM.

So don’t confuse Apple’s latest move with an effort to spur innovation—from here, this looks like technology-hoarding, pure and simple; a bid to further insulate their mobile devices from competition by locking down their hardware as hard as they do their software. [WSJ]

NEC and Renesas looking to join forces against semiconductor evil

We’re always up for a good semiconductor merger, and it looks like NEC Electronics and Renesas are prepping the biggest one we’ve seen this week. The two companies have agreed to team up, creating a combined force of $13 billion in yearly sales, and the largest chip company in Japan — Renesas was already the product of a chip merger between Hitachi and Mitsubishi Electric. They’ll still be behind Intel and Samsung in the overall game, but we won’t hold that against them. Tokyo analysts believe this might lead to other “defensive” mergers by other Japanese chipmakers, but we’ll have to wait and see. NEC and Renesas hope to finish talks by July and become a single company by April of next year.

[Via Electronista]


NEC and Renesas looking to join forces against semiconductor evil originally appeared on Engadget on Tue, 28 Apr 2009 11:18:00 EST. Please see our terms for use of feeds.


Intel’s upcoming mobile chips to squeeze 3GHz out of Penryn, bring high-performance ULV to the masses

Yeah, we’ve had just about all the Atom we can handle, and it looks like Intel’s just about ready to help us back away from the difficult choice between sexy form factors at low prices and sexy form factors at exorbitant prices. Intel is working on Montevina Plus, which will push Penryn laptop chips past the 3GHz mark while simultaneously sending ULV chips into the mainstream, showing up in laptops ranging from $599 to $1,000 instead of the $1,500-plus premiums they usually command — great news for ultraportable lovers who actually want to get a few things accomplished on the road. Intel also sees 2009 as the year of the nettop, at least in emerging markets, and will naturally be pushing Nehalem all over the place — given how far in advance chip roadmaps are planned, the economic downturn won’t be messing with any planned rollouts for the time being.


Intel’s upcoming mobile chips to squeeze 3GHz out of Penryn, bring high-performance ULV to the masses originally appeared on Engadget on Sun, 08 Mar 2009 07:28:00 EST.


AMD announces GLOBALFOUNDRIES spin-off, forgets to name it something awesome

AMD’s finally dumped its fabrication facilities and technology onto a new spin-off brand, as expected. ATIC (Advanced Technology Investment Company) is on board as well, and the newly formed GLOBALFOUNDRIES starts out with $6 billion in investments and 2,800 employees. GLOBALFOUNDRIES will primarily be building chips for AMD, just like usual, but will also be open to other gigs as a third-party chip builder — its main rival TSMC just scored a deal with Intel to produce Atom chips on the cheap, a first for Intel, which usually keeps production and processes in-house. First up for the new company? Churning out a 32nm process. We like the sound of that.


AMD announces GLOBALFOUNDRIES spin-off, forgets to name it something awesome originally appeared on Engadget on Wed, 04 Mar 2009 11:13:00 EST.


Intel’s Barrett on Paranoia, the Core Craze and the End of Gigahertz

At first, Intel chairman Craig Barrett struck me as a testy old dude.

This would be fair, considering his company was about to announce a sudden 90% plunge in profits. So it’s understandable that, when I asked him about Nvidia’s recent coup of getting Apple to swap an Intel chipset for the GeForce 9400M, he said with more than a hint of disdain, “You’re obviously a Mac user.” Here’s a guy who is used to making judgments, and doing it quickly.

But when I told him I also built my desktop with an Intel Core 2 Duo Wolfdale chip, he reversed his decision. Laughing, he said, “You’re alright for a kid that wears black Keds.” This wasn’t his first reference to my sneakers—they were Adidas, actually—and it wasn’t his last either.

At 69, he is definitely one of the oldest guys running a powerhouse innovation company like Intel, and when he’s sitting there in front of you, he conveys an attitude that he’s seen it all. He hung up his lab coat for a tailored suit long ago, but talking to him, you can still tell that his degree from Stanford isn’t some MBA, but a PhD in materials science. Nerdspeak flows easily out of his mouth, and he closes his eyes while calmly making a point, like a college professor. At the same time, you get a sense of the agitation within. After all, he’ll be the first to tell you that in business, he still lives by the mantra of his Intel CEO predecessor Andy Grove: “Only the paranoid survive.”

In the end, I really liked the guy. He’s tough but fair, like an Old Testament king. Here are excerpts from our conversation, chip guru to chip fanboy, about vanquishing your competition, the limitations of clock speed, the continuing rage of the multi-core race and how to stay paranoid in your golden years.

What’s the endgame of the multi-core arms race? Is there one?
If everything works well, they continue to get Moore’s Law from a compute power standpoint. [But] you need software solutions to go hand-in-hand with the hardware solutions…There’s a whole software paradigm shift that has to happen.

How involved is Intel in the software side of making that happen?
Probably the best measure is that if you look at the people we hire each year, we still hire more software engineers than hardware engineers.

Where do you see Larrabee, Intel’s in-development, dedicated high-end GPU, taking you?
The fundamental issue is that performance has to come from something other than gigahertz… We’ve gotten to the limit we can, so you’ve got to do something else, which is multiple cores, and then it’s either just partitioning solutions between cores of the same type or partitioning solutions between heterogeneous cores on the same chip.

You see, everybody’s kind of looking at the same thing, which is, ‘How do I mix and match a CPU- and a GPU-type core, or six of these and two of those, and how do you have the software solution to go hand-in-hand?’

So what do you think of the competition coming from Nvidia lately?
At least someone is making very verbal comments about the competition anyway.

Do you see Nvidia as more of a competitor than AMD? How do you see the competitive landscape now?
We still operate under the Andy Grove scenario that only the paranoid survive, so we tend to be paranoid about where competition comes from any direction. If you look at the Intel history, our major competitor over the years has been everybody from IBM to NEC to Sun to AMD to you-name-it. So the competition continually changes, just as the flavor of technology changes.

As visualization becomes more important—and visualization is key to what you and consumers want—then is it the CPU that’s important, or the GPU, or what combination of the two and how do you get the best visualization? The competitive landscape changes daily. Nvidia is obviously more of a competitor today than they were five years ago. AMD is still a competitor.

Would you say the same competitive philosophy applies to the mobile space?
Two different areas, obviously. The netbook is really kind of a slimmed down laptop. The Atom processor takes us in that space nicely from a power/performance standpoint. Atom allows you to go down farther in this kind of fuzzy area in between netbooks, MIDs [mobile internet devices] and smartphones. The question there is, ‘What does the consumer want?’

The issue is, ‘What is the ultimate device in that space?’ …Is it gonna be an extension of the internet coming down, or is there gonna be an upgrowth of the cellphone coming up?

Are you planning on playing more directly in phones, then?
Those MIDs look more and more like smartphones to me…All they need to do is shrink down a little bit and they’re a damn good smartphone. They have the capability of being a full-internet-functionality smartphone as opposed to an ARM-based one—maybe it looks like the internet you’re used to or, maybe it doesn’t.

Intel and Microsoft “won” the PC Revolution. There’s a computer on basically every office desk in the country. What’s beyond that? Mobile, developing countries?
Well, it’s a combination. There’s an overriding trend toward mobility for convenience. We can shrink the capability down to put it in a mobile form factor, and the cost is not that much more than a desktop, point one. Point two, if you go to the emerging economies where you think that mobile might be lacking, really the only way to get good broadband connectivity in most of the emerging markets is not with wired connectivity or fixed point connectivity, it’s gonna be broadband wireless and that facilitates mobile in emerging markets as well.

So where does that take Intel going in the next five years?
It’s pushing things like broadband wireless, WiMax…It’s broadband wireless capability, that’s the connectivity part. It’s mobility with more compute power and lower energy consumption to facilitate battery life and all that good stuff. And it’s better graphics. That’s kind of Larrabee and that whole push.

AMD beat you to several CPU innovations, such as on-die memory controllers and the focus on performance per watt, but you’ve since passed it on every one. How do you plan to stay ahead?
The basic way you stay ahead is that you have to set yourself with aggressive expectations. There’s nothing in life that comes free. You’re successful when you set your expectations high enough to beat the competition. And I think the best thing that we have going for us is…the Moore’s Law deal.

As long as we basically don’t lose sight of that, and continue to push all of our roadmaps, all of our product plans and such to follow along Gordon’s law, then we have the opportunity to stay ahead. That doubling every 18 months or so is the sort of expectation level you have to set for yourself to be successful.
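For a sense of scale, that “doubling every 18 months or so” compounds ferociously over the 40-year run Barrett describes. A quick back-of-the-envelope sketch (the function name and the exact 18-month cadence are our framing, not Intel’s):

```python
# Back-of-the-envelope Moore's Law arithmetic: assume capability
# doubles once every `doubling_months` months.
def moores_law_factor(years, doubling_months=18):
    """Total growth factor after `years` at the given doubling cadence."""
    return 2 ** (years * 12 / doubling_months)

# Over ~40 years at an 18-month cadence, that's 2**26.7 — roughly a
# hundred-million-fold increase.
print(f"{moores_law_factor(40):.3g}")
```

Shift the cadence to 24 months and the same 40 years still yields about a million-fold increase, which is why Barrett treats the doubling itself, not any particular clock speed, as the expectation to manage.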

Would you consider that the guiding philosophy, the banner on the wall?
That’s the roadmap! That is the roadmap we have. If you dissect it a bit, you tend to find that the older you get, the more conservative you get typically, and you kinda start to worry about Moore’s Law not happening. But if you bring in the bright young talent and say, ‘Hey, bright young talent, we old guys made Moore’s Law happen for 40 years, don’t screw it up,’ they’re smart enough to figure it out.