GPU NEWS Channel

tkin

Back to school!!
No one posted this so here's some old news.

Nvidia Posts Picture of a Real Fermi Card

Just after ATI started beaming about the performance of its dual-GPU monster graphics card, the ATI Radeon HD 5970, Nvidia quietly decided to push out its Fermi-architecture-based GPUs. Nvidia's PR people posted an image of an Nvidia GeForce GF100 GPU-based graphics card running the Unigine Heaven DirectX 11 benchmark on Twitter and Facebook.

Nvidia unveiled its next-generation Fermi architecture, built on a 40nm process, last month. As the recent photo of the GF100-based card shows, the card uses an eight-pin PCI-Express connector on the left and a six-pin connector on the right. This means existing power supplies can be used to feed this monstrous card.

The GF100 GPU was being tested on an Asus Rampage II Extreme, an Intel LGA 1366 socket board, coupled with an Intel Core i7 CPU and DDR3 memory. There is also speculation around the web that Nvidia will demo a working GeForce Fermi GPU at the SC09 supercomputing conference in Oregon, US.

*photos-g.ak.fbcdn.net/hphotos-ak-snc3/hs098.snc3/16531_177230903252_8409118252_2883759_2290687_n.jpg

From Fudzilla:
The image was uploaded at 9:45pm PST and depicts the Geforce desktop card running Unigine's Heaven DirectX 11 benchmark on a Dell 24-inch monitor, so we can rationally assume that the benchmark resolution is 1920x1200. The core hardware configuration appears to be composed of an ASUS Rampage II Extreme LGA 1366 motherboard coupled with an undetermined Core i7 processor and DDR3 memory.

It comes as no surprise that the image was leaked around the same time that AMD posted its official Radeon HD 5970 press release. In the world of IT business marketing strategy, we can only assume that the green giant wants to assure its ardent enthusiast consumers that Fermi-based Geforce desktop cards do exist and are confirmed to be working, especially after its "Fermi mock-up" debacle at GTC 2009.

Upon close inspection in Photoshop, and with the help of others, it appears that the 40nm Fermi-based GF100 monster is using a PCI-Express 8-pin adapter on the left and a 6-pin adapter on the right, so nothing is new in terms of PSU hardware requirements for enthusiast consumers. It is important to note that this particular engineering sample GPU is using the recently taped-out A2 silicon. Our multiple internal sources have previously confirmed that the company will move to A3 silicon for its final retail products.

Two days ago, Nvidia publicly demonstrated its first working GPU samples based on the Fermi architecture during SC 2009 (Super Computer Convention). SC is the international conference for high-performance computing, networking, storage and analysis, where the company unveiled the Tesla 20-series lineup, priced between $2,499 and $18,995. As previously stated, these Fermi GPUs catering to the High Performance Computing (HPC) market segment are not expected to launch until Q2 2010, while the high-end Geforce desktop units as depicted in the image above are expected to be announced shortly after CES 2010 passes (January 7th – 10th) and will launch earlier, sometime in Q1 2010.

I believe it's fake, like the one Huang showed earlier. nVidia is surely in trouble.
-----------------------------------------
Posted again:
-----------------------------------------
Yeah, at 2560x1600. So even that is :boink: now, eh? I am happy to play Crysis @ 800x600 on my onboard HD3300, no lag. :grinsmack:
Crysis called, it wants its dignity back. :fc_bat:
 

asingh

Aspiring Novelist
Yeah, much better, but I think HD5850 CF can handle anything thrown at it, and at 17k x 2 = 34k, HD5850 CF is way more VFM than HD5870 CF at 27k x 2 = 54k.

Yea true. But the kick of CF is -- take the two fastest single core boards, and join them together. What a feeling...! :razz::razz::razz::razz:
 

tkin

Back to school!!
Yea true. But the kick of CF is -- take the two fastest single core boards, and join them together. What a feeling...! :razz::razz::razz::razz:
An HD5870 can play Crysis at 1920x1080 maxed out, no lag. Why the extra power?? Unless you go 2560x1600.

*i303.photobucket.com/albums/nn134/Frodcord/Crysis/Crysis8x.png
 

tkin

Back to school!!
^^ That average is 28.65, if I read it correctly. 60 FPS is the holy grail...!
There is no GPU that can push Crysis Warhead to 60 FPS at Enthusiast settings with 8xAA and 16xAF, not even HD5970 quad CF. And unlike most games, Crysis is very much playable at 25-30 FPS; they use some sort of motion compensation technology to smooth out the feeling.
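For what it's worth, here's a toy sketch of the idea, assuming it's plain per-pixel motion blur (my own illustration, not Crytek's actual shader; the velocity buffer and kernel names are made up): each pixel gets smeared along its screen-space motion vector, which hides the discrete jumps between frames at low FPS.

```cuda
// Hypothetical sketch of per-pixel motion blur (CUDA C++), NOT Crytek's code.
// Assumes the engine already wrote a velocity buffer: one screen-space
// motion vector per pixel, telling us how far that pixel moved last frame.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void motionBlur(const float3* color, const float2* velocity,
                           float3* out, int w, int h, int taps)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    int idx = y * w + x;
    float2 v = velocity[idx];
    float3 acc = make_float3(0.f, 0.f, 0.f);

    // Average several samples along the motion vector; the resulting streak
    // is what makes 25-30 FPS feel smoother than it really is.
    for (int i = 0; i < taps; ++i) {
        float t = (float)i / (taps - 1) - 0.5f;          // -0.5 .. +0.5
        int sx = min(max(x + (int)(v.x * t), 0), w - 1); // clamp to screen
        int sy = min(max(y + (int)(v.y * t), 0), h - 1);
        float3 c = color[sy * w + sx];
        acc.x += c.x; acc.y += c.y; acc.z += c.z;
    }
    out[idx] = make_float3(acc.x / taps, acc.y / taps, acc.z / taps);
}

int main()
{
    const int w = 64, h = 64, taps = 8;
    float3 *color, *out; float2 *vel;
    cudaMallocManaged(&color, w * h * sizeof(float3));
    cudaMallocManaged(&vel,   w * h * sizeof(float2));
    cudaMallocManaged(&out,   w * h * sizeof(float3));
    for (int i = 0; i < w * h; ++i) {
        color[i] = make_float3(0.5f, 0.5f, 0.5f);  // dummy grey frame
        vel[i]   = make_float2(4.f, 0.f);          // everything moving 4 px right
    }
    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    motionBlur<<<grid, block>>>(color, vel, out, w, h, taps);
    cudaDeviceSynchronize();
    printf("blurred pixel (0,0): %.2f\n", out[0].x);
    return 0;
}
```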
 

asingh

Aspiring Novelist
^^
MCT is not being used in games as of now. It needs a predictive model to work correctly, and running one costs frames: instead of an FPS gain, the system takes a loss.

The game is much smoother when it runs at the same rate as the monitor's refresh rate.
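As a rough illustration of why (a minimal host-side sketch of my own, not from any game engine): cap the frame rate to the refresh rate so every frame is displayed for the same 16.67 ms, which is what makes the motion feel even.

```cpp
// Minimal frame limiter sketch (plain C++): pace frames to a 60Hz budget
// the way vsync would, so each frame is presented at a constant cadence.
#include <chrono>
#include <thread>
#include <cstdio>

int main()
{
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::microseconds(16667); // 1/60 s per frame

    for (int frame = 0; frame < 5; ++frame) {
        auto start = clock::now();

        // ... render the frame here ...

        // Sleep off whatever is left of the 16.67 ms budget.
        std::this_thread::sleep_until(start + budget);

        double ms = std::chrono::duration<double, std::milli>(clock::now() - start).count();
        printf("frame %d displayed for %.2f ms\n", frame, ms);
    }
    return 0;
}
```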

Here is someone with 'nice' Crysis FPS.
 

asingh

Aspiring Novelist
Also, here is some news on the upcoming Fermi. God only knows when it will hit the stores; nevertheless, it's something to look forward to.

Link to Fermi:
*www.nvidia.com/object/fermi_architecture.html

White Paper Download and Architectural Design of Fermi:
*www.nvidia.com/content/PDF/fermi_white_papers/NVIDIA_Fermi_Compute_Architecture_Whitepaper.pdf

Basic comparison to the G80 (8xxx series) and GT200 series:
*img149.imageshack.us/img149/1517/fermi.jpg

Note that it looks "quite" similar to the design of a CPU: it has two levels of internal cache, configurable shared memory, and multi-language support -- even "C".
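To make that concrete, here's a small CUDA C sketch (my own example; the kernel and numbers are made up, not from the whitepaper). Fermi gives each SM 64KB that can be split between software-managed shared memory and hardware L1 cache, and you pick the split per kernel:

```cuda
// Hypothetical CUDA C example of Fermi's configurable shared memory / L1.
// atomicAdd on floats needs compute capability 2.0, i.e. Fermi itself.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void sumKernel(const float* in, float* out, int n)
{
    __shared__ float tile[256];  // lives in the SM's shared memory
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    tile[threadIdx.x] = (i < n) ? in[i] : 0.f;
    __syncthreads();

    // Tree reduction within the block, entirely in shared memory.
    for (int s = blockDim.x / 2; s > 0; s >>= 1) {
        if (threadIdx.x < s) tile[threadIdx.x] += tile[threadIdx.x + s];
        __syncthreads();
    }
    if (threadIdx.x == 0) atomicAdd(out, tile[0]);
}

int main()
{
    // The Fermi-era knob: ask for the 48KB shared / 16KB L1 split for
    // this kernel, since it leans on shared memory rather than cache.
    cudaFuncSetCacheConfig(sumKernel, cudaFuncCachePreferShared);

    const int n = 1024;
    float *in, *out;
    cudaMallocManaged(&in, n * sizeof(float));
    cudaMallocManaged(&out, sizeof(float));
    for (int i = 0; i < n; ++i) in[i] = 1.f;
    *out = 0.f;

    sumKernel<<<n / 256, 256>>>(in, out, n);
    cudaDeviceSynchronize();
    printf("sum = %.0f (expect %d)\n", *out, n);
    return 0;
}
```

A cache-heavy kernel would ask for cudaFuncCachePreferL1 instead; that per-kernel choice is exactly the CPU-like flexibility the comparison chart is getting at.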

Looks good. Hope to see this soon.
 

tkin

Back to school!!
Here's The Latest News:

NVIDIA kicks off GeForce 300-series range with GeForce 310

NVIDIA has quietly updated its product pages to include the first graphics card from the GeForce 300-series line.

Don't get too excited, though, as the first 300-series product is a low-end solution that's currently available to OEMs only. Dubbed the GeForce 310, the card isn't based on the upcoming Fermi architecture and is instead a basic GT200-series card with a GPU clocked at 589MHz and 16 stream processors clocked at 1,402MHz.

*img.hexus.net/v2/news/nvidia/geforce-310.jpg


Look familiar? It should do, as it's little more than a rebranded GeForce 210. As far as we can tell, there's no physical change, despite the massive jump in model number.

Seems as though NVIDIA might be padding out the low-end GeForce 300-series range with rebranded 200-series cards. Meanwhile, the company's first DirectX 11 cards, based on the all-new Fermi architecture and expected to fill the mid-range and high-end segments, are rumoured to have been delayed until the first quarter of fiscal 2011.

This isn't Fermi; Fermi launches March-April 2010 (hard launch). And this card doesn't play games, it's a budget HTPC part.
-----------------------------------------
Posted again:
-----------------------------------------
The G51J 3D – first GeForce 3D Vision notebook from ASUS

ASUS has officially unveiled the first notebook to use NVIDIA's GeForce 3D Vision. The G51J 3D notebook uses a 120Hz LCD and NVIDIA's special glasses to add spatial depth to games that support it, like Borderlands or Arkham Asylum. A 1.6GHz Intel Core i7 CPU and GeForce GTX 260M graphics drive the 15.6-inch notebook at its native resolution of 1,366×768 pixels. As usual, ASUS provides a range of possible options instead of concrete specifications or pricing information: memory can reach 4GB of RAM, the hard disk drive can vary between 250GB and 500GB, and optical drive options are DVD and Blu-ray.

The G51J 3D notebook will ship together with the necessary transmitter and the special glasses; the expected date is early December. Other companies, such as MSI, as well as contractors like Clevo, are also planning to have 3D notebooks available next year.

*www.htlounge.net/data/3/asusg51j3d.jpg

*www.blogcdn.com/www.engadget.com/media/2009/11/acer-g51j-3d-1.jpg


However, Acer launched a 3D notebook earlier, albeit without nVidia 3D Vision.
*www.pcmag.com/article2/0,2817,2354515,00.asp
 

tkin

Back to school!!
Goddammit, what's wrong with Nvidia? The first GT3xx is a renamed GT2xx card??? This is really, really weird.
Well, let me confuse you a bit more: did you know the first Fermi GPU is called GF100?? Not GF300, but back to 100. They have lost it and are panicking. If this goes on, then next year I'm getting an HD5850.

*techreport.com/r.x/fermi-gpu/gf100_full_tr.png
 

ssk_the_gr8

Make Way the LORD is Here
Goddammit, what's wrong with Nvidia? The first GT3xx is a renamed GT2xx card??? This is really, really weird.

Nvidia don't have new cards, they are panicking. I've heard that they have missed the clocks they wanted with Fermi by as much as 20%.
 

Krow

Crowman
I would not usually go by what Charlie @ SemiAccurate says. But lately, he seems to have been right a good number of times. Hope nvidia comes back though, as /me likes price wars.
 

NVIDIAGeek

Long Live Gojira!
Well, let me confuse you a bit more: did you know the first Fermi GPU is called GF100?? Not GF300, but back to 100. They have lost it and are panicking. If this goes on, then next year I'm getting an HD5850.
*techreport.com/r.x/fermi-gpu/gf100_full_tr.png

Same here. If Fermi doesn't come in Q1 '10, I'll be goin' for the HD5850. But darn, PhysX! NVIDIA, come on!
 

tkin

Back to school!!
I would not usually go by what Charlie @ SemiAccurate says. But lately, he seems to have been right a good number of times. Hope nvidia comes back though, as /me likes price wars.
The only thing Charlie was ever correct about was the mock-up Fermi board that Huang showed at GPU Tech; the rest is just speculation. Before the GT200 launch, Charlie said the same things about missed clocks and how the 4870 would be miles ahead of the GT200; well, we all know how that turned out. And even with the missed clocks, Fermi's DP output is faster than the HD5870's. The only issue plaguing nVidia (and ATI too) is TSMC. If there were another competent fab out there, or if Intel opened one of its fabs to outsiders, TSMC would have gone utterly bankrupt in no time. Thanks to TSMC, the HD58xx is a paper launch, so nVidia's not pushing it much, I guess.
 

asingh

Aspiring Novelist
It will be severely bottlenecked by the P45 northbridge and the Exxx CPU.

Hopefully by mid-2010 I should be able to manage an i7/X58/6GB/2x5870/HAF 932/TX750.

Will sell off my system + (salary bonus, where art thou...!)

And Crysis 2 will be out too. Then I will benchmark to 60 FPS on a 60Hz monitor.

:)
 

tkin

Back to school!!
It will be severely bottlenecked by the P45 northbridge and the Exxx CPU.

Hopefully by mid-2010 I should be able to manage an i7/X58/6GB/2x5870/HAF 932/TX750.

Will sell off my system + (salary bonus, where art thou...!)

And Crysis 2 will be out too. Then I will benchmark to 60 FPS on a 60Hz monitor.

:)
Let me get a mop to wipe away my drool :eeek: Anyway, don't get your hopes up for Crysis 2; we all know how console ports turn out.

And off-topic: how will you sell your rig?? I'm interested in selling my CPU and GPU a few months from now; let me know if you have any sources.
 

NVIDIAGeek

Long Live Gojira!
^Share that with me :D. I'm hopin' Crytek gives more attention to the PC, 'cause with DX11 it's more powerful than consoles, I guess. It should be PC to console, rather than console to PC. Crytek, I have high hopes for ye. Cevat!
 