Technically yes, practically no. Check out the scores people posted in the "The GRAPHICS CARDS List" thread for a good example of how badly the GeForce FX cards perform when put under extreme floating-point math stress.
Watch your tongue, buddy.
The high setting in Doom 3 only uses uncompressed color maps. It doesn't introduce any new shaders. Anyway, you should only run the high setting on upper-mainstream cards like the 9500 Pro / 5600 Ultra and above. You'd be a fool to run the game at high settings on...
You can play the game with all those cool effects at a lower resolution just as fast as on the Ti. With the Ti you can't even see those effects, even though you're playing at a higher resolution.
In case you forgot, the old Half-Life 2 benchmarks revealed that the GeForce FX 5900 Ultra barely caught up with the Radeon 9600 PRO even when using partial precision shaders and got completely owned at full precision. What's your point?
*tech-report.com/etc/2003q3/hl2bench/index.x?pg=2
Firstly, there is no such thing as optimising for the Radeon cards, because:
1. I've used DirectX (unlike your stinky OpenGL, it's a standard spec) and all the shaders are written to the Shader Model 2.0 spec.
2. Radeon cards run all shaders at 24-bit FP precision, unlike the FX cards, which slyly...
The 5200 can run fragment shaders that the Ti can only dream of. Special effects like heat haze need a DX9-level card. So what if the FX 5200 can run the game decently only at 800x600? The Ti ain't gonna come close to the FX 5200 visually anyway!
I suggest staying away from DX8.1 cards, be it the GeForce 4 Ti or the Radeon 9200. The GeForce FX 5200 is a good choice for someone with a low budget, but the X300 is even better.
Oh sure, would you kindly head over to the "The GRAPHICS CARDS List" thread and see how every nVIDIA card is getting owned! I'm an ATI fan only because they're better. If nVIDIA truly gets ahead of them anytime in the future and stops cheating, I'd obviously support them. That's because I'm a...
Hmm... yeah, sure... In case you didn't know, nVIDIA's "new" architecture is inherited from the GeForce FX series. Plus, nVIDIA is governed by PR, whereas ATI is guided by a true willingness to improve.
Result: nVIDIA = Intel. ATI = AMD.
A low-k fabrication process and higher clock speeds. But that only helps at low resolutions. At high resolutions (above 1024x768), the 128-bit bus becomes a bandwidth bottleneck and the 9600 XT starts falling behind rapidly.
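Some rough back-of-the-envelope numbers, assuming the usual ~600 MHz effective memory clock on the 9600 XT and ~680 MHz on a 9800 Pro: 128 bits / 8 = 16 bytes per transfer, so roughly 16 x 600 M ≈ 9.6 GB/s, versus about 32 x 680 M ≈ 21.8 GB/s on a 256-bit card. Crank up the resolution and switch on AA/AF and that gap is exactly what bites.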
Mayur, if you don't like the forums, get the "fish" out of here.
Hmm... now that we've gotten that out of the way...
Actually, the Radeon 9600 outperforms even the GeForce FX 5900 Ultra when taking DX9 into consideration.
Oh yes. They're absolutely optimized for every game out there! Most...
HeHeHe... yeah, gotta chill out... :lol:! It's just that this chap denies the evidence that ATI is currently doing better than nVIDIA, even after it's been tattooed onto his eyeballs :lol:!
The back buffer is an off-screen surface to which the scene is rendered. The back and front buffers are then swapped to display the finished frame on the monitor.
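If it helps, here's a minimal C++ sketch of the idea. It isn't any real graphics API; the Framebuffer struct and the fill loop are just stand-ins for "render the scene", but the swap at the end is the part that matters.

// Minimal double-buffering sketch: draw into the back buffer, then swap it
// with the front buffer so the finished frame becomes the visible one.
#include <cstdint>
#include <utility>
#include <vector>

struct Framebuffer {
    std::vector<std::uint32_t> pixels;                 // one 32-bit colour per pixel
    Framebuffer(int w, int h) : pixels(std::size_t(w) * h, 0) {}
};

int main() {
    Framebuffer front(640, 480);                       // what the monitor shows
    Framebuffer back(640, 480);                        // off-screen render target

    for (int frame = 0; frame < 3; ++frame) {
        for (std::uint32_t &p : back.pixels)           // "render" the scene off-screen
            p = 0xFF000000u | std::uint32_t(frame);
        std::swap(front.pixels, back.pixels);          // present: finished frame goes visible
    }
}

Rendering off-screen and then swapping is what stops you from ever seeing a half-drawn frame.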
'Cause nVIDIA generally shifts to partial precision (16-bit) to get decent framerates, whereas ATI runs all shading calculations at 24-bit precision. That, along with shader replacements, leads to the overall degradation in quality. Not to mention ATI's superior anti-aliasing and anisotropic...
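Here's a toy C++ illustration of what dropping mantissa bits does to a value. The texcoord value and the bit-chopping helper are made up for illustration, and real fp16 rounding is a bit different, but it shows why fewer bits means coarser steps.

// fp32 carries 23 mantissa bits, fp16 only 10; chopping the low bits off a
// value shows how much detail "partial precision" throws away.
#include <cstdint>
#include <cstdio>
#include <cstring>

float keep_mantissa_bits(float v, int bits) {          // bits: 23 for fp32, 10 for fp16
    std::uint32_t u;
    std::memcpy(&u, &v, sizeof u);
    u &= ~((1u << (23 - bits)) - 1u);                  // clear the low mantissa bits
    std::memcpy(&v, &u, sizeof u);
    return v;
}

int main() {
    float texcoord = 0.1234567f;                       // e.g. a dependent texture coordinate
    std::printf("23-bit mantissa (fp32):  %.7f\n", keep_mantissa_bits(texcoord, 23));
    std::printf("10-bit mantissa (~fp16): %.7f\n", keep_mantissa_bits(texcoord, 10));
}

The error looks tiny per pixel, but pile up a long shader's worth of maths and it turns into the banding and blockiness people keep screenshotting.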
Or perhaps nVIDIA needs too much time to incorporate all their shader replacements... one thousand shaders per game, and 100 such games on the market... that means the nVIDIA boys have to write 100,000 replacement shaders... and that takes a lot of time :lol:!