Apple’s A6 CPU actually clocked at around 1.3GHz, per new Geekbench report


As the initial wave of iPhone 5 reviews hit, it looked as if Apple’s dual-core A6 processor was sporting a clock speed of around 1GHz. We saw reports (and confirmed with our own handset) ranging between 1.00 and 1.02GHz, but a new Geekbench build (v2.3.6) has today revealed a horse of a different color. According to Primate Labs’ own John Poole, the latest version of the app — which landed on the App Store today — “features a dramatically improved processor frequency detection algorithm, which consistently reports the A6’s frequency as 1.3GHz.” In speaking with us, he affirmed that “earlier versions of Geekbench had trouble determining the A6’s frequency, which led to people claiming the A6’s frequency as 1.0GHz as it was the most common value Geekbench reported.”

When we asked if he felt that the A6 was capable of dynamically overclocking itself for more demanding tasks, he added: “I don’t believe the A6 has any form of processor boost. In our testing, we found the 1.3GHz was constant regardless of whether one core or both cores were busy.” Our own in-house iPhone 5 is regularly displaying 1.29GHz, while a tipster’s screenshot (hosted after the break) clearly displays 1.30GHz. Oh, and if anyone wants to dip their iPhone 5 in a vat of liquid nitrogen while trying to push things well over the 2GHz level, we certainly wouldn’t try to dissuade those efforts.
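Geekbench’s actual detection algorithm isn’t public, but the general idea — inferring a rate by timing work of known size — can be sketched in Python. Everything below is illustrative: an interpreted loop costs many CPU cycles per iteration, so this measures loop throughput rather than the true clock.

```python
import time

def estimate_loop_rate(iterations=5_000_000):
    """Time a tight loop of known length and report iterations per second.

    A crude illustration of the idea behind benchmark frequency detection,
    not Geekbench's actual algorithm: in an interpreted language each
    iteration spans many CPU cycles, so this is loop throughput, not MHz.
    """
    start = time.perf_counter()
    count = 0
    for _ in range(iterations):
        count += 1
    elapsed = time.perf_counter() - start
    return iterations / elapsed
```

On a real device the measured rate also varies with frequency scaling from run to run, which hints at why earlier Geekbench builds could settle on the most common (but wrong) value.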

[Thanks, Bruno]



Apple’s A6 CPU actually clocked at around 1.3GHz, per new Geekbench report originally appeared on Engadget on Wed, 26 Sep 2012 19:31:00 EDT. Please see our terms for use of feeds.

Source: Primate Labs; Geekbench (App Store)

Texas Instruments sidelines phone and tablet chip business

Texas Instruments has announced it will scale back its mobile processor business, no longer targeting smartphones and tablets, but instead eyeing the embedded systems market. The surprise news follows further contractions in TI’s business, most recently seeing long-standing customer Motorola pick up Intel’s Medfield for the RAZR i, though TI says it will continue to support its existing clients, Reuters reports.

Nonetheless, that sounds like something of a stop-gap measure as TI exits the mobile chip industry. “We believe that opportunity is less attractive as we go forward,” TI senior VP for embedded processing Greg Delagi said of phone and tablet chipsets; the company will no longer invest to the same extent in its customers’ roadmaps for upcoming models.

Exactly what sort of timescale TI has in mind for that scaled-back involvement is unclear, and the market has already reacted pessimistically. Delagi conceded that the embedded systems market would take more time to develop than the hotly-contested wireless market, but insisted that the transition would “generate a more stable, profitable long-term business” for TI as a result.

TI’s embedded chips are finding their way into increasingly complex in-car systems, driving internet-connected navigation and entertainment, as well as other industries where the gigahertz-chasing of mobile isn’t such an issue. The rapid evolution of wireless chipsets, as well as companies like Apple and Samsung opting to make their own ARM-based processors rather than externally source them, means competition has grown significantly in recent months.

Nonetheless, it’s an unusual decision to have made, and one TI’s partners are likely looking at with no small degree of suspicion. Barnes & Noble’s new NOOK HD and HD+ tablets, for instance, are based on Texas Instruments’ OMAP4xxx series of chipsets, and RIM has used TI chips for its BlackBerry PlayBook tablet. Many of TI’s more recent high-profile launches have shifted away from such applications, however, including a push into the so-called “internet of things.”


Texas Instruments sidelines phone and tablet chip business is written by Chris Davies & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.


Wii U developers already complaining about underperforming CPU

Nintendo’s consoles aren’t known for raw performance, and the upcoming Wii U won’t change that. But Akihiro Suzuki, developer of the popular – and graphically intense – Dynasty Warriors series, has already started to complain about what he considers an underpowered CPU:

“One of the weaknesses of the Wii U compared to PS3 and Xbox 360 is the CPU power is a little bit less. So for games in the Warriors series, including Dynasty Warriors and Warriors Orochi, when you have a lot of enemies coming at you at once, the performance tends to be affected because of the CPU… Dealing with that is a challenge.”

Although the exact specifications of the IBM-made CPU remain a mystery to those outside of Kyoto, it is known to have three PowerPC cores and is believed to be relatively underpowered next to the Wii U’s custom AMD 7-series GPU. It’s entirely possible that the Wii U could have better graphics, but worse CPU performance. Suzuki is speaking from experience: his team has been developing a Dynasty Warriors game that will launch alongside the Wii U on November 18.

Of course, the Wii U is a new console and developers will probably require some time to get used to the system and its developer tools. Optimizations that work well on one console architecture can doom performance on another. But it’s still not a great sign for Nintendo that its third-party developers are already calling the Wii U’s processor a “challenge.”

By Ubergizmo. Related articles: Nintendo Wii U launch for Europe might be brought forward to 23rd November thanks to advertising glitch; Nintendo Wii U will reportedly be region-locked.

Intel: LTE Medfield by end of 2012 plus dualcore incoming

Intel’s first LTE-capable Medfield smartphone chips are in the pipeline, along with multicore versions of the processor, with the first examples due before the end of 2012. Intel will be “shipping some LTE products later this year and ramping into 2013,” director of product marketing Sumeet Syal told TechCrunch, and in the meantime the company is working on fettling more Android apps to suit the x86 architecture.

That software hiccup could be a headache to Intel and its manufacturer partners, with devices like the freshly announced Motorola RAZR i unable to run certain software available through Google’s Play store. Google’s own Chrome browser, for instance, currently won’t work on the RAZR i, though Motorola has confirmed it should be functional by the time the midrange smartphone actually reaches the market.

“We’re not quoting any numbers,” Syal says, “but the majority of all the apps we’ve tested work just fine.” The company’s team responsible for software has been working “constantly round the clock to make sure that all these apps work,” and the number of compatible titles increases every day.

As for multicore, initially that will mean dualcore Medfield, with Intel not ready to talk about quadcore Atom processors for phones as yet. Intel, though, is in no great rush, Syal insists, content with its Hyper-Threading system that milks two threads out of a single core.

“You have to take a look at how many instructions per clock can the architecture handle — our belief is that others are throwing cores at the issue in terms of getting more performance. We make that determination based on our architecture so we felt very comfortable coming out with a single core dual-threaded for our first product, and as we’re able to get more and more performance in the right implementation of the architecture we believe putting in dual-core would be the right thing for our next generation product” – Sumeet Syal, Intel
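Syal’s argument — that throughput is roughly instructions-per-clock times clock times cores, with an extra hardware thread adding only a fraction of a core’s worth — can be put into back-of-envelope arithmetic. The function and every figure below are hypothetical illustrations, not Intel’s numbers:

```python
def peak_throughput(ipc, clock_ghz, cores, smt_threads=1, smt_scaling=0.3):
    """Rough peak throughput in billions of instructions per second.

    smt_scaling is an assumed fraction of extra throughput gained per
    additional hardware thread sharing a core (illustrative only).
    """
    base = ipc * clock_ghz * cores
    return base * (1 + smt_scaling * (smt_threads - 1))

# One high-IPC core running two threads vs. two lower-IPC plain cores
# (all numbers made up to illustrate the trade-off):
single_smt = peak_throughput(ipc=2.0, clock_ghz=2.0, cores=1, smt_threads=2)
dual_plain = peak_throughput(ipc=1.0, clock_ghz=2.0, cores=2)
```

Under these invented figures the single dual-threaded core comes out ahead, which is the shape of the case Syal is making; with different IPC assumptions the comparison flips.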

Timelines for the dualcore Medfield versions have not been revealed yet, nor has Intel disclosed when the first Atom-based smartphones might arrive in the US.


Intel: LTE Medfield by end of 2012 plus dualcore incoming is written by Chris Davies & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.


AMD Piledriver CPU pre-order pricing leaks out


It’s always just been a matter of “when” and “how much,” but it looks as if PC gamers looking to score a powerplant upgrade can start planning on specific amounts. AMD’s impending FX Piledriver CPUs are now up for pre-order at ShopBLT, an outlet that has proven reliable in the past when it comes to nailing down processor pricing. For those in need of a refresher, these are built using the Vishera design, with the range including between four and eight CPU cores. We’re expecting ’em to best the Bulldozer family, and if all goes well, they could be available to the earliest of adopters in October. Presently, the FX-4300 ($131.62), FX-6300 ($175.77), FX-8320 ($242.05) and FX-8350 ($253.06) are listed, but CPU World seems to think launch day quotes will actually be a bit lower. Only one way to find out, right?


AMD Piledriver CPU pre-order pricing leaks out originally appeared on Engadget on Fri, 21 Sep 2012 18:41:00 EDT. Please see our terms for use of feeds.

Via: CPU World | Source: ShopBLT

Wii U’s slow CPU “a challenge” for one launch developer

These days, we have a better idea of what the Wii U is packing under the hood. While there are some aspects of the Wii U that are clearly better than the Xbox 360 or PS3, the CPU isn’t one of them. We don’t know everything about the Wii U’s CPU just yet (clock speed, for instance, is still a mystery), but we do know that it comes from IBM and features three PowerPC cores.


That underwhelming CPU is giving one Wii U launch developer some trouble. During the Tokyo Game Show, Eurogamer sat down with Warriors Orochi 3 Hyper producer Akihiro Suzuki, who says that the Wii U’s CPU tends to have some issues when there are multiple characters on screen, which is pretty much always the case when playing a Dynasty Warriors game. “One of the weaknesses of the Wii U compared to PS3 and Xbox 360 is the CPU power is a little bit less,” Suzuki said. “So for games in the Warriors series, including Dynasty Warriors and Warriors Orochi, when you have a lot of enemies coming at you at once, the performance tends to be affected because of the CPU.”

Suzuki followed up by saying that dealing with those performance issues can be “a challenge,” but did also point out that as far as sheer graphics power is concerned, the Wii U has the 360 and PS3 beat. Not only does the Wii U feature what is believed to be a custom AMD 7 series GPU, but it’s been confirmed to house 1GB of RAM that is dedicated to games, which is twice the amount the 360 and PS3 can boast. This means games which are more GPU-intensive will shine on Wii U, while those that require some significant CPU power risk falling flat.

It’s important to keep in mind that as time goes on, developers will figure out how to squeeze the most power out of the Wii U’s CPU. All you need to do is look at this generation to see that much is true – compare titles like The Last of Us or Uncharted 3 to games that launched at the beginning of the generation, and you’ll surely notice a sizable boost in overall quality. It seems safe to assume that we can expect a similar progression with games on Wii U, so this is probably just one of those launch hurdles that most developers have to deal with. Check our story timeline below for more on the Wii U!


Wii U’s slow CPU “a challenge” for one launch developer is written by Eric Abent & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.


Vellamo benchmark adds CPU and memory tests, here’s how it rates the One X and GS III


Qualcomm’s Vellamo app has been a part of the furniture in our Android benchmarking suite for a while now, providing a fun little test of browsing and networking speeds on almost any Android device. Version 2.0 adds something extra, however: a section called “Metal” that is all about putting your processor and memory through the wringer.

As a quick taster, we ran the new HTML5 and Metal tests on the HTC One X (both global and AT&T) and the Galaxy S III (global and Sprint), settling on the average of three consecutive results. Conspiracy theorists who think that Qualcomm’s app favors its own processors will only find further ammunition in the CPU results; however, the HTML5 scores actually give the Qualcomm-powered devices much less of a lead than the old Vellamo did, scoring all four handsets roughly equally. You’ll find the table overleaf, along with a publicity video that explains the update.
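Averaging consecutive runs, as we did here, is trivial to reproduce. The snippet below (with made-up scores, not our actual results) also reports the spread, so a single outlier run is easy to spot:

```python
from statistics import mean, stdev

def summarize_runs(scores):
    """Average several consecutive benchmark runs and report the spread.

    Returning the standard deviation alongside the mean makes it obvious
    when one run diverged from the others (e.g. background tasks firing).
    """
    avg = mean(scores)
    spread = stdev(scores) if len(scores) > 1 else 0.0
    return avg, spread

# Three hypothetical Vellamo scores from consecutive runs:
avg, spread = summarize_runs([1520, 1498, 1511])
```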



Vellamo benchmark adds CPU and memory tests, here’s how it rates the One X and GS III originally appeared on Engadget on Fri, 21 Sep 2012 06:31:00 EDT. Please see our terms for use of feeds.

Via: Phandroid | Source: Vellamo (Google Play)

Apple: A6 chip in iPhone 5 has 2x CPU power, 2x graphics performance, yet consumes less energy


Every new iPhone needs a new engine, and Tim Cook has just made some bold claims about Apple’s latest silicon creation: the A6 processor. He hinted at a significant shrinkage in transistor size, allowing the chip to be 22 percent smaller than the A5 and hence more energy-efficient, while at the same time — he says — doubling all-round CPU and graphics capabilities. By way of practical benefits, the Apple CEO promises the Pages app will load up 2.1x faster than before, while Keynote attachments will hit the screen 1.7x faster. At this point we’re lacking any further detail about cores or clock speeds or indeed who actually fabricated the A6 (still Samsung, after all that bitterness?), but Apple does tend to be close-lipped on such things. In the meantime, bring on the benchmarks!
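Those per-app claims are just ratios of load times. A trivial sketch of the arithmetic, with hypothetical timings chosen to match the claimed 2.1x Pages figure (Apple hasn’t published the underlying numbers):

```python
def speedup(old_seconds, new_seconds):
    """Speedup factor between two timings of the same task,
    e.g. app launch time on the old chip vs. the new one."""
    return old_seconds / new_seconds

# If Pages took 4.2 s to load on the A5 and 2.0 s on the A6 (invented
# timings for illustration), that would correspond to the quoted 2.1x:
pages_gain = speedup(4.2, 2.0)
```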

Check out all the coverage at our iPhone 2012 event hub!

Apple: A6 chip in iPhone 5 has 2x CPU power, 2x graphics performance, yet consumes less energy originally appeared on Engadget on Wed, 12 Sep 2012 13:34:00 EDT. Please see our terms for use of feeds.


AMD’s top-end Trinity desktop chip could cost just $130, the same as a budget Core i3


You can’t get a Core i3 on Newegg right now for much less than $130 — a sum that’ll put you almost at the bottom of the Ivy Bridge league with a dual-core processor, 3MB cache and HD 2500 (i.e. not HD 4000) integrated graphics. That’s why it’s interesting to see these leaked AMD Trinity prices over at retailer BLT. If they’re accurate, they indicate that the same amount of cash might afford a top-end Trinity A10 processor with four overclockable 3.8GHz cores, 4MB cache and vastly superior Radeon HD 7660D graphics. At the other end of AMD’s range, a dual-core A4-5300 APU could cost as little as $60. The only catch we can see — aside from the issue of accuracy — is that by the time these processors actually become available, rival Intel may well have seen fit to adjust its own prices. In fact, Chipzilla just launched some new Ivy Bridge processors over the weekend that brought the cost of entry down to $117 — which goes to show that nothing stands still for long. Head past the break for some official gaming benchmark claims about the A10, or see More Coverage for extras.



AMD’s top-end Trinity desktop chip could cost just $130, the same as a budget Core i3 originally appeared on Engadget on Mon, 03 Sep 2012 07:46:00 EDT. Please see our terms for use of feeds.

Via: ZDNet | Source: BLT

Your next Samsung could learn to love your smile

Heterogeneous System Architecture might not be a phrase that trips off your tongue right now, but if AMD, TI and – in a quiet addition – Samsung have their way, you could be taking advantage of it to interact with the computers of tomorrow. AMD VP Phil Rogers, president of the HSA Foundation, filled his IFA keynote with futurology, waxing lyrical about how PCs, tablets and other gadgets will react to not only touch and gestures, but body language and eye contact, among other things. Check out the concept demo after the cut.

Heterogeneous System Architecture is a catch-all for scalar CPU processing and parallel GPU processing, along with high-bandwidth memory access for boosting app performance while minimizing power consumption. In short, it’s what AMD has been pushing for with its APUs (and, elsewhere – though not involved with HSA – NVIDIA has with its CUDA cores), with the HSA seeing smartphones, desktops, laptops, consumer entertainment, cloud computing, and enterprise hardware all taking advantage of such a system.

While there were six new public additions to the Foundation, Samsung Electronics’ presence came as a surprise. The HSA was initially formed by AMD, ARM, Imagination Technologies, MediaTek, and Texas Instruments, but today’s presentation saw Samsung added to the slides and referred to as a founding member.

Samsung is no stranger to heterogeneous computing tech. Back in October 2011, the company created the Hybrid Memory Cube Consortium (along with Micron) to push a new, ultra-dense memory system that – running at 15x the speed of DDR3 and requiring 70-percent less energy per bit – would be capable of keeping up with multicore technologies. The Cubes would consist of a 3D stack of silicon layers: a logic layer at the base, with memory layers densely stacked on top.
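The quoted Hybrid Memory Cube figures are simple multipliers over a DDR3 baseline. The sketch below applies them to hypothetical baseline numbers purely to show what the claims would mean in practice:

```python
def apply_hmc_claims(ddr3_bandwidth, ddr3_energy_per_bit):
    """Scale a DDR3 baseline by the consortium's claimed factors:
    ~15x the speed and ~70 percent less energy per bit.

    The baseline inputs are placeholders; pick any units (e.g. GB/s
    and pJ/bit) as long as they are used consistently.
    """
    hmc_bandwidth = ddr3_bandwidth * 15
    hmc_energy_per_bit = ddr3_energy_per_bit * (1 - 0.70)
    return hmc_bandwidth, hmc_energy_per_bit

# A hypothetical 12.8 GB/s DDR3 channel at 50 pJ/bit would scale to
# roughly 192 GB/s at 15 pJ/bit under the claimed figures:
bw, energy = apply_hmc_claims(12.8, 50.0)
```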

As for the concept, Rogers described a system which could not only learn from a user’s routine, but react to whether they were smiling or not, where on the display they were looking, and to more mundane cues such as voice and gesture. Such a system could offer up a search result and then, if the user was seen to be smiling at it, learn from that reaction to better shape future suggestions.

Exactly when we can expect such technology to show up on our desktop (or, indeed, in laptops, phones and tablets) isn’t clear. However, Samsung has already been experimenting with devices that react to the user in basic ways; the Galaxy S III, for instance, uses eye-recognition to keep the screen active even if it’s not being touched, while its camera app includes face- and smile-recognition.


Your next Samsung could learn to love your smile is written by Chris Davies & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.