NVIDIA announces Titan X, a GPU featuring a 12GB framebuffer and 8 billion transistors

Desmond

Destroy Erase Improve
Staff member
Admin

During today's Epic Games event at the Game Developers Conference 2015, NVIDIA co-founder Jen-Hsun Huang rushed the stage like a professional wrestling hero to announce the Titan X, NVIDIA's latest GPU. Huang claims it is the most powerful GPU on the planet. With a 12GB frame buffer and 8 billion transistors, it is, on paper, a significant step past NVIDIA's current hardware. The original Titan was previously NVIDIA's most powerful card.

Huang autographed the GPU's box and gifted the hardware to Epic Games' co-founder Tim Sweeney. "As a result of Tim calling me," said Huang, "and his phone call was, 'Can you save GDC' […] we decided to launch Titan X at the Epic event. This has never happened before in the history of our industry."

Huang came to the stage after Sweeney gave a lengthy speech about the convergence of photorealistic imagery, film, video games, architecture, industrial design, and virtual reality. "This GDC," said Huang, "is all about VR. And there are all these demos that are just unbelievable ... [it] needs an amazing GPU."

Huang implied NVIDIA's Titan X is a step in that direction.

Source: NVIDIA announces Titan X, the new most powerful GPU on the planet | The Verge
 
OP
Desmond

Destroy Erase Improve
Staff member
Admin
You'll be lucky if you can afford one.

- - - Updated - - -

AMD officially announces the R9 390X in response to the Nvidia Titan X

AMD Officially Confirms New Radeon Flagship - R9 390X Ultra-Enthusiast Graphics Card In All Likelihood
 

REDHOTIRON2004

Journeyman
How many watts does it need?

I expect it to be more efficient than previous Titan GPUs.
Still, I'd like to know just how efficient it actually is.
 
OP
Desmond

Destroy Erase Improve
Staff member
Admin
Any real benches released??

Just been announced. The only unit on display was the one Jen-Hsun Huang gifted to Epic's Tim Sweeney yesterday.

Benchmarks should follow in the coming days once reviewers are given their models.

- - - Updated - - -

Official news reporter of TDF [MENTION=5007]DeSmOnD dAvId[/MENTION] :D

Lol.

But someone has to do it. Very few people post tech news here.
 

warfreak

Talk to the hand!!!
Can it run Crysis?

Serious question, not trolling. The current Titan barely manages playable FPS at 4K. I mean, the point of buying this sort of GPU is to run games at 4K, right? What good is a flagship GPU if it cannot run the most demanding games at 4K?
 
OP
Desmond

Destroy Erase Improve
Staff member
Admin
A GTX 980 can run Crysis at 4K without much trouble. It should not be a problem for this card.
 

warfreak

Talk to the hand!!!
A GTX 980 in SLI can run Crysis at 4K without much trouble. It should not be a problem for this card.

FTFY :)

A single discrete 980 would be more than enough for 1080p, but at 4K you'd have to turn down a few settings in the most demanding games.

The Titan is priced such that it would be more feasible to get two 980s in SLI rather than invest in a single Titan.
 
OP
Desmond

Destroy Erase Improve
Staff member
Admin
Then it's also possible that this one is just a novelty card, beyond enthusiast grade, or perhaps not meant for the general end user at all.

A 12GB framebuffer is overkill even for 4K displays.
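
A quick back-of-the-envelope sketch in Python (assuming 32-bit color and an illustrative buffer count; none of this comes from NVIDIA) shows why the framebuffer proper is a rounding error next to 12GB:

[CODE]
# Rough VRAM math for a raw 4K framebuffer.
# Assumptions: 3840x2160 resolution, 32-bit RGBA pixels,
# triple buffering plus one depth buffer (counts are illustrative).

WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4                       # 32-bit RGBA

frame = WIDTH * HEIGHT * BYTES_PER_PIXEL  # bytes in one color buffer
print(f"One 4K frame: {frame / 2**20:.1f} MiB")                    # ~31.6 MiB

buffers = 4 * frame                       # 3 color buffers + 1 depth buffer
print(f"Four 4K buffers: {buffers / 2**20:.1f} MiB of 12288 MiB")  # ~126.6 MiB
[/CODE]

Everything past that first ~127 MiB is there for textures, geometry and compute, not the display itself.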
 

kkn13

Cyber Genius FTW
Another awesome GPU which most of us won't be able to buy!! :D
BTW, a bit off-topic, but why are NVIDIA GPUs usually priced higher than AMD's? Or am I wrong?
 

REDHOTIRON2004

Journeyman
Or it could be that AMD prices them low due to their lower market share.

I would say that NVIDIA prices them much higher. Just look at the jump in price from a GTX 970 to a GTX 980 for barely a 30% improvement.

There is no reason for that card to cost twice as much as a GTX 970 while being just 30% faster.
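
Putting that complaint into numbers, a toy sketch in Python (the prices are normalized and the 30% figure is taken straight from the claim above, not from benchmarks):

[CODE]
# Toy price/performance comparison for the GTX 970 vs GTX 980 argument.
# "price" and "perf" are normalized, hypothetical values, not measured data.

cards = {
    "GTX 970": {"price": 1.0, "perf": 1.0},   # baseline
    "GTX 980": {"price": 2.0, "perf": 1.3},   # "twice the cost, 30% faster"
}

for name, c in cards.items():
    value = c["perf"] / c["price"]
    print(f"{name}: {value:.2f} performance per unit of price")
# Under these assumptions the 980 delivers only 0.65x the value of the 970.
[/CODE]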
 

Nerevarine

Incarnate
This is still using GDDR5. AMD's next flagship is coming with HBM, which NVIDIA's next architecture will also use.
I'm inclined to believe HBM will be a game-changer, just like GDDR5 was seven years ago.
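
A rough sketch of why HBM matters, assuming the commonly quoted figures (a 384-bit GDDR5 bus at 7 Gbps per pin versus first-gen HBM's 4096-bit interface at 1 Gbps per pin; treat the numbers as illustrative):

[CODE]
# Peak memory bandwidth = bus width (bits) * per-pin data rate / 8.
# Spec numbers below are the commonly quoted ones, used for illustration.

def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and per-pin rate."""
    return bus_width_bits * gbps_per_pin / 8

print(f"GDDR5, 384-bit @ 7 Gbps/pin: {bandwidth_gb_s(384, 7.0):.0f} GB/s")   # ~336
print(f"HBM1, 4096-bit @ 1 Gbps/pin: {bandwidth_gb_s(4096, 1.0):.0f} GB/s")  # ~512
[/CODE]

The per-pin rate drops, but the massively wider bus more than makes up for it.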
 

kkn13

Cyber Genius FTW
Laptop manufacturers are still using DDR3 and you people are excited about HBM.

I swear. Plus, I've not seen my own 7730M bottleneck yet, or maybe I'm just not playing the newer games. It runs Skyrim like butter on high and temps don't cross 71°C.
Still, more is better I guess (performance, not price) :p
 

$hadow

Geek in making
I swear. Plus, I've not seen my own 7730M bottleneck yet, or maybe I'm just not playing the newer games. It runs Skyrim like butter on high and temps don't cross 71°C.
Still, more is better I guess (performance, not price) :p

Yeah, GDDR5 came out like seven years ago, and yet all I see is a handful of high-end laptops equipped with it.
 

sam_738844

Wise Old Owl
I would say that NVIDIA prices them much higher. Just look at the jump in price from a GTX 970 to a GTX 980 for barely a 30% improvement.

There is no reason for that card to cost twice as much as a GTX 970 while being just 30% faster.

A 30% performance gain is a lot in silicon performance-space. To yield that much extra performance on the same node, the engineering behind the cores and memory doesn't come cheap. You are buying a piece of electronics, not Annapurna Atta from a grocery shop, where twice the money buys twice the amount.
 

REDHOTIRON2004

Journeyman
A 30% performance gain is a lot in silicon performance-space. To yield that much extra performance on the same node, the engineering behind the cores and memory doesn't come cheap. You are buying a piece of electronics, not Annapurna Atta from a grocery shop, where twice the money buys twice the amount.

You have no idea what you are saying or how these silicon chips are developed and manufactured. Use Google and enlighten yourself before commenting like a troll.

Graphics chips and processors that belong to a specific series are all the same silicon with minute differences. The only difference is that execution units or specific features are disabled on the chips that go into the lower-end parts.

The chips that go into a higher-end processor or graphics card are simply the best-performing lot of the same batch, or ship without any features disabled. For example, an i5-4590 is the same chip used in the i5-4690 or 4690K; Intel just chose to clock it lower for a price difference. Likewise, within a series like Devil's Canyon, an i5 is the same chip as an i7 with Hyper-Threading disabled.

This certainly doesn't mean it costs Intel more to manufacture an i7 than an i5 or i3. It just means they demand a premium for the same chip.

The same goes for NVIDIA. A higher-end card in the same series doesn't carry more than twice the manufacturing cost.

Don't make a fool of yourself by believing whatever your friends or shopkeepers tell you to sell a product. Do some research before commenting.
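
To make the binning argument concrete, here is a toy simulation (my illustration, not Intel's or NVIDIA's actual process; the unit count and defect rate are made up):

[CODE]
# Toy die-binning sketch: every die starts identical, random defects
# disable units, and dies get sorted into SKUs after testing.
import random

UNITS_PER_DIE = 16    # execution units per die (hypothetical)
DEFECT_RATE = 0.03    # probability any one unit is defective (assumed)

def bin_die() -> str:
    working = sum(random.random() > DEFECT_RATE for _ in range(UNITS_PER_DIE))
    if working == UNITS_PER_DIE:
        return "flagship (fully enabled)"
    if working >= UNITS_PER_DIE - 2:
        return "cut-down (defective units fused off)"
    return "scrap"

random.seed(1)
batch = [bin_die() for _ in range(10_000)]
for sku in ("flagship (fully enabled)",
            "cut-down (defective units fused off)", "scrap"):
    print(f"{sku}: {batch.count(sku)} of 10000 dies")
[/CODE]

One wafer, one design, several price points: the premium pays for the bin, not for a different chip.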
 