GTC 2013: We’re Here!

It’s that time of year again for all those diehard gaming and graphics fans. The annual GPU Technology Conference has just kicked off live in San Jose, in the heart of Silicon Valley. There’s plenty going on again this year, especially with NVIDIA’s new GTX Titan graphics card front and center. Read on for more details on what to expect.


Obviously this will be all about developers, gaming, and high-performance graphics, but we’re also expecting some exciting news on the NVIDIA GPU front (aka TITAN), as well as some details on the company’s recently announced mobile chipsets, the NVIDIA Tegra 4 and Tegra 4i.

From ray tracing to Crysis 3, gaming and graphics will obviously be the stars of the show. From emerging technology to emerging companies and much more, we’ll be here live with all the details.

The official GTC keynote begins shortly this morning, with NVIDIA’s own Jen-Hsun Huang taking the stage as usual to share some details. We’re expecting the focus to be on content, developers, partners, and a few nice announcements about the products mentioned above. Stay tuned for all the details live from SlashGear! Don’t forget to check out our Tegra Portal for more NVIDIA news.


GTC 2013: We’re Here! is written by Cory Gunther & originally posted on SlashGear.
© 2005 – 2012, SlashGear. All rights reserved.

Wacom Cintiq 13HD takes digital art creation mobile

Wacom may be having a ball putting digital pen technology into bestselling mobile devices like the Galaxy Note II, but that doesn’t mean the company has forgotten its artistic roots. Fresh to the catalog is the Cintiq 13HD, a 13.3-inch Full HD graphics tablet intended for digital artists on the move. Based around a 1920 x 1080 wide-angle display mounted on an adjustable, three-position detachable stand, the Cintiq 13HD promises the flexibility of its bigger Cintiq siblings, but at a more affordable cost and scaled to fit in your bag.


That’s not something you could really say about Wacom’s existing Cintiq models, the 22HD, 24HD, and 24HD touch, which kick off at a desk-bound 22 inches. In contrast, the Cintiq 13HD measures 14.75 x 9.75 x 0.5 inches, weighs 2.65 pounds, and simply hooks up to your laptop as a second display.

Once you’ve done that – via the native HDMI, or DVI, VGA, or Mini DisplayPort with the right adapter – you get a pen with 2,048 levels of pressure sensitivity, 16.7 million color support, and tilt recognition. The pen itself comes with nine different nibs – five standard, three felt, and one stroke – as well as a desk stand and a carry case. It has two side switches and an active eraser, in addition to the pressure-sensitive nib.

Around the body of the 13HD there are four user-assignable shortcut buttons, the functions of which can automatically change depending on the active app, plus a four-position customizable rocker ring and home button. The screen itself offers 250 cd/m² brightness and covers 75 percent of the Adobe RGB color gamut.


As well as detaching the stand to use the new baby Cintiq in your lap, you can prop it up at one of three angles – 22, 35, and 50 degrees – and a regular USB 2.0 data connection is included in the three-in-one cable, which also handles display signal and power.

The Wacom Cintiq 13HD will go on sale in early April, priced at $999 in the US and £749.99 in the UK.



Wacom Cintiq 13HD takes digital art creation mobile is written by Chris Davies & originally posted on SlashGear.

NVIDIA announces PhysX and APEX support for PS4

Heads up, folks. NVIDIA will be invading the PS4, as the company has announced PhysX and APEX support for the recently announced gaming console. Both PhysX and APEX are software development kits from NVIDIA that will allow game developers to design new PS4 games with stunning graphics, similar to what we saw during the PS4 reveal last month.


NVIDIA’s product manager for PhysX, Mike Skolones, says that “great physics technology is essential for delivering a better gaming experience and multiplatform support is critical for developers,” and “with PhysX and APEX support for PlayStation 4, customers can look forward to better games.” Indeed, both PhysX and APEX should make games more realistic with life-like movements and scenery.

PhysX is designed specifically to take advantage of hardware acceleration in processors and graphics cards, and the technology allows for more complex and detailed worlds in video games, including more realistic explosions, clothes that react naturally to wind and body movement, and of course, more lifelike character motion.

Both PhysX and APEX are already integrated into a handful of games. NVIDIA boasts that PhysX alone is featured in more than 150 games, and is used by over 10,000 developers. Some games that are taking advantage of NVIDIA’s technologies include Borderlands 2, the Batman Arkham series, Mirror’s Edge, and Metro 2033.


NVIDIA announces PhysX and APEX support for PS4 is written by Craig Lloyd & originally posted on SlashGear.

NVIDIA’s new Quadro cards offer workstation performance for as low as $199

When you want workstation performance and reliability out of your computer, you usually have to pony up over a grand for a high-end graphics card. However, NVIDIA has released four new Quadro graphics cards that come in many different flavors, including one option called the K600 that costs only $199.


The other three Kepler-based Quadro cards are the K4000, K2000, and the K2000D. The K4000 is the beast of the bunch, packing 3GB of on-board memory and costing a whopping $1,269. The K2000 and K2000D are similar to one another: both are priced at $599 and come with 2GB of on-board memory, but the K2000D adds native support for two dual-link DVI display connectors, which NVIDIA says is ideal for “interfacing with ultra-high-resolution medical imaging displays.”

Furthermore, the K4000 has 768 CUDA Cores, a memory bandwidth of 134 GB/s, and 1.246 teraflops of compute. The K2000 has 384 CUDA Cores, 67 GB/s of memory bandwidth, and 733 gigaflops. The budget card of the bunch, the K600, isn’t as fast, but for the money you’re paying, it’s not a bad deal: you’re looking at 192 CUDA Cores, a memory bandwidth of 29 GB/s, and 336 gigaflops.
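Those throughput figures aren’t arbitrary: Kepler-class GPUs execute two single-precision FLOPs (one fused multiply-add) per CUDA core per cycle, so a quoted GFLOPS number implies a core clock. Here’s a quick back-of-the-envelope sketch; the clocks it prints are inferred from the article’s numbers, not official specs:

```python
# Peak single-precision throughput for Kepler cards follows
# GFLOPS = cores * clock * 2 (one fused multiply-add = 2 FLOPs/cycle),
# so we can back out the core clock each quoted figure implies.
def implied_clock_mhz(cuda_cores, gflops):
    """Core clock (MHz) implied by a quoted GFLOPS figure."""
    return gflops * 1e9 / (cuda_cores * 2) / 1e6

quadros = {
    "K4000": (768, 1246),  # CUDA cores, quoted GFLOPS (1.246 TFLOPS)
    "K2000": (384, 733),
    "K600":  (192, 336),
}

for name, (cores, gflops) in quadros.items():
    print(f"{name}: ~{implied_clock_mhz(cores, gflops):.0f} MHz core clock")
```

The same formula run in reverse (cores × clock × 2) is how vendors arrive at the headline teraflops numbers in the first place.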

Currently, the Quadro K5000 is the flagship card of the series, but those who don’t need that much power out of their rig can grab cheaper versions that may be more their speed. NVIDIA says these new cards deliver twice the performance of previous-generation cards, and feature larger and faster on-board memory to keep your graphics-intensive projects going strong.


NVIDIA’s new Quadro cards offer workstation performance for as low as $199 is written by Craig Lloyd & originally posted on SlashGear.

NVIDIA shines a light on lower spec Quadro cards: K600 priced at $199, K4000 at $1,300


Despite all the energy it’s been putting into mobile and gaming, NVIDIA hasn’t fallen out of love with its professional graphics customers. In fact, it’s in the process of trying to rekindle those sparks of romance through the clever use of chocolates, shoulder rubs and fresh additions to its Kepler-based Quadro lineup. We’ve already seen (and played with) the $2,249 K5000 flagship, but those of us on lower budgets will now be able to snag the K4000, K2000 or K600 as they begin to enter the retail channel.

Working from the top down, the $1,269 Quadro K4000 has 768 CUDA Cores, 3GB of RAM and a memory bandwidth of 134GB/s, which means it’ll crank out your architectural documents and video reels at a healthy 1.246 TFLOPs. The $599 K2000 has half the CUDA cores and memory bandwidth, with 2GB of RAM, and reaches a top speed of 733 GFLOPs. Lastly, the $199 K600 has 192 CUDA Cores, 1GB RAM, a memory bandwidth of 29GB/s and a top speed of 336 GFLOPs. If you’d like more details, you know where the PR’s at.



NVIDIA GeForce 314.14 beta drivers available now

NVIDIA never skimps on offering constant driver updates for its various graphics cards, and today the company released beta drivers optimized for the many games coming out this month, including SimCity, StarCraft II: Heart of the Swarm, Resident Evil 6, and the PhysX update for Hawken, just to name a few.


Version 314.14 brings optimization for upcoming games, as well as current games that could use a boost. Specifically, the new beta drivers deliver up to a 23% boost in Sniper Elite V2 and a 9% boost in Sleeping Dogs. In SLI mode, other games also get an increase in performance, including a 9% boost in StarCraft II and a 5% boost in Call of Duty: Black Ops II.

Today’s update comes just a couple of weeks after NVIDIA unleashed version 314.07 of its GeForce drivers, which increased performance in a handful of intensive games, such as Crysis 3, Assassin’s Creed III, Civilization V, and Call of Duty: Black Ops II. Crysis 3 ended up with a 65% boost in performance, which is quite the improvement.

Other games that NVIDIA focused on for these beta drivers are Deus Ex: Human Revolution, Just Cause 2, The Elder Scrolls V: Skyrim, and Batman: Arkham City, all of which received 4% to 5% performance boosts. If you already have the GeForce Experience installed, the drivers are available for automatic downloading and installing right now.


NVIDIA GeForce 314.14 beta drivers available now is written by Craig Lloyd & originally posted on SlashGear.

Because they’re worth it: game characters get AMD to do their hair


Blocky, pixelated locks can really ruin a day of tomb-robbing, right? To put the feather back in those bangs, AMD’s just announced TressFX, software that’ll be seen in the 2013 release of Tomb Raider due on March 5th. The rendering tech offloads computation-heavy hair simulation to the graphics processor using Microsoft’s DirectCompute language, and was developed by AMD in partnership with Raider developer Crystal Dynamics — though it’ll work with any graphics card that supports DirectX 11, including those from arch-foe NVIDIA. The result is a coiffure that can move realistically in response to motion and external forces, detect collisions between strands, accurately reflect light and even allow for matting from moisture or rain. Lara may have preferred that AMD omit the latter, but anything’s better than the helmet-head look, no?



Via: Bit-Tech

Source: AMD

AVADirect is now offering the NVIDIA GTX Titan GPU

AVADirect is now offering the newly unveiled NVIDIA GTX Titan graphics processing unit in its high-quality, custom-built computers. AVADirect is a custom computer manufacturer that builds high-end machines around the latest and greatest advances in computer technology. It’s in the same league as other custom system manufacturers like Cyberpower, Digital Storm, Falcon Northwest, Geekbox, IBUYPOWER, Maingear, Origin PC, Puget Systems, V3 Gaming, and Velocity Micro.


So you can expect that when NVIDIA announced the Titan, AVADirect was all over it. The NVIDIA GTX Titan GPU includes and improves on existing NVIDIA features, like NVIDIA adaptive vertical sync and NVIDIA Surround. It offers support for 3-way SLI and up to four displays. It also supports up to 4K resolutions and the DirectX 11.1 API. The GPU will cost $999, and features 2,688 CUDA cores, 6GB of GDDR5 RAM, and 7.1 billion transistors.

The NVIDIA GTX Titan is a very powerful and very efficient GPU, one that NVIDIA claims is “the most powerful GPU on the planet”. It is designed with pro gaming in mind and will meet the needs of even the most demanding games out there, like Crysis 3. The Titan will be one of the greatest tools in any pro gamer’s arsenal.

Of course, at its $1,000 price tag, the Titan won’t be for everyone. AVADirect, however, will offer the Titan in customized computer builds that remain relatively affordable. Its goal is to get the GPU to as many consumers as possible, because while it may be a highly coveted, powerful GPU, it shouldn’t be unattainable. Check out AVADirect’s site in the coming days to see its updated system configurations, and browse the timeline below for the latest news on NVIDIA’s innovations.

[via AVADirect]


AVADirect is now offering the NVIDIA GTX Titan GPU is written by Brian Sin & originally posted on SlashGear.

NVIDIA unveils GTX Titan GPU with supercomputer performance

Remember the Titan supercomputer? Back in November, it became the world’s fastest supercomputer, and it’s powered by NVIDIA chips. Now you can get a piece of Titan in your own home because NVIDIA has announced the GTX Titan graphics card, a $1,000 GPU that sports 2,688 CUDA cores, 6GB of GDDR5 RAM, and 7.1 billion transistors.


NVIDIA says that the new GTX Titan graphics card is “powered by the fastest GPU on the planet,” which we certainly can’t refute at this point. The graphics card itself is huge, measuring in at 10.5 inches long, and it’s capable of pushing 4,500 gigaflops, which is quite impressive if we do say so ourselves.
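As a sanity check, that 4,500-gigaflop figure lines up with the card’s 2,688 CUDA cores if you assume Kepler’s two single-precision FLOPs (one fused multiply-add) per core per cycle, which lets you back out the implied core clock:

```python
# Sanity-check the quoted 4,500 GFLOPS against the Titan's core count.
# Kepler executes 2 single-precision FLOPs (one FMA) per core per cycle,
# so GFLOPS = cores * 2 * clock (GHz).
cores = 2688
gflops = 4500

clock_ghz = gflops / (cores * 2)  # implied core clock in GHz
print(f"implied core clock: {clock_ghz * 1000:.0f} MHz")
```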


However, the GTX Titan falls just a tad short of NVIDIA’s current top-tier offering, the GTX 690, as far as raw specs and computing power are concerned, but efficiency is where the Titan really shines. The GTX Titan features over a thousand more CUDA cores than each of the GTX 690’s two GPUs, yet it requires less power, generates less heat, and runs quieter overall.

As far as availability goes, NVIDIA says the Titan GPU will be available starting February 25 from various partners, including ASUS, eVGA, Gigabyte, and MSI, at a price of around $1,000. That certainly doesn’t invite an impulse purchase, but if you’re looking for supercomputer-like speeds from your gaming rig, this card may be well worth it.


NVIDIA unveils GTX Titan GPU with supercomputer performance is written by Craig Lloyd & originally posted on SlashGear.

NVIDIA GeForce GTX Titan leaks, could cost a grand


NVIDIA’s GeForce GTX 690 currently wears the world’s-fastest-graphics crown, unless you count the limited edition Ares II, by cramming two Kepler GPUs onto one mainstream board. When it comes to improving on that, some leaked European retailer listings suggest NVIDIA might not wait on a completely next-gen architecture, but may instead try to deliver similar performance through a less power-hungry single GPU design. The listings, gathered together by TechPowerUp and VideoCardz, point towards a pricey new flagship, the GeForce GTX Titan, that would be a graphics-focused adaptation of the beefy Tesla K20 computing card. It’d pack 2,688 shader units, a 384-bit memory bus and 6GB of RAM, all with one chip — for reference, the GTX 690 needs two GPUs to offer 3,072 shader units and has 4GB of RAM. There’s no confirmed unveiling date, and the primary leak on a Danish site has actually been pulled, but ASUS and EVGA are rumored to be launching their own GTX Titan variants as soon as next week, possibly in the $1,000 to $1,200 ballpark. That’s a short wait for what could deliver a serious boost to game performance, not to mention bragging rights.
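As a back-of-the-envelope check on that leaked 384-bit bus: memory bandwidth is simply bus width (in bytes) times the effective per-pin data rate. The 6 Gbps GDDR5 rate below is an assumption on our part (typical for high-end Kepler boards), since the leak doesn’t mention memory clocks:

```python
# Bandwidth implied by a 384-bit memory bus:
# bandwidth (GB/s) = (bus width in bytes) * effective data rate per pin.
# The 6 Gbps GDDR5 effective rate is an ASSUMPTION, not from the leak.
bus_bits = 384
effective_gbps = 6.0  # assumed GDDR5 effective rate per pin

bandwidth_gbs = (bus_bits / 8) * effective_gbps
print(f"{bandwidth_gbs:.0f} GB/s")
```

Under that assumption the card would land around 288 GB/s, comfortably ahead of mainstream single-GPU boards of the time.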



Via: Bright Side of News, Bit-Tech

Source: TechPowerUp, VideoCardz.com, EuroSys