Search results

  1.

    gforce ti

    Technically yes, practically no. Check out the scores people posted in the "The GRAPHICS CARDS List" thread for a good example of how badly the GeForce FX cards perform when put under extreme floating-point math stress.
  2.

    gforce ti

    Watch your tongue, buddy. The high setting in Doom 3 only uses uncompressed color maps; it doesn't introduce any new shaders. Anyway, you should only run the high setting on top mainstream cards like the 9500 Pro / 5600 Ultra and above. You'd be a fool to run the game at high settings on...
  3.

    Is any fault in my graphics card?

    Please, please, please benchmark your 6800 with the app supplied in the "The GRAPHICS CARDS List" thread. Thanks.
  4.

    gforce ti

    You can play the game with all those cool effects at a lower resolution just as fast as on the Ti. On the Ti you can't see those effects at all, even though you're playing at a higher resolution.
  5.

    fanATIcs vs NVidiots

    In case you forgot, the old Half-Life 2 benchmarks revealed that the GeForce FX 5900 Ultra barely caught up with the Radeon 9600 PRO even when using partial precision shaders and got completely owned at full precision. What's your point? *tech-report.com/etc/2003q3/hl2bench/index.x?pg=2
  6.

    For sale two ASUS Gforce4Ti4200 and one MSI FX5200

    Ti4200 for 7500 bucks? You gotta be kidding me!
  7.

    fanATIcs vs NVidiots

    Firstly, there is no such thing as optimising for the Radeon cards because: 1. I've used DirectX (unlike your stinky OpenGL, it's a standard spec) and all shaders are written to the Shader Model 2.0 spec. 2. Radeon cards run all shaders at 24-bit FP precision, unlike the FX cards that slyly...
  8.

    fanATIcs vs NVidiots

    Thanks for your support Cru :).
  9.

    gforce ti

    The 5200 can run fragment shaders that the Ti can only dream of. All of the special effects like heat haze need a DX9 level card. So what if the FX 5200 can run the game decently only at 800x600? The Ti ain't gonna be able to come close to the FX 5200 visually anyway!
  10.

    gforce ti

    I suggest staying away from DX8.1 cards, be it the GeForce 4 Ti or the Radeon 9200. The GeForce FX 5200 is a good choice for someone with a low budget, but the X300 is even better.
  11.

    fanATIcs vs NVidiots

    Oh sure, would you kindly get to the thread "The GRAPHICS CARDS List" and see how every nVIDIA card is getting owned! I'm an ATI fan only because they're better. If nVIDIA truly gets ahead of them anytime in the future and stops cheating, I'd obviously support them. That's because I'm a...
  12.

    Review of Hercules 3D prophet 9600XT

    FX 53? I thought you owned the FX 51!
  13.

    fanATIcs vs NVidiots

    Hmm... yeah sure... In case you didn't know, nVIDIA's "new" architecture has been inherited from the GeForce FX series. Plus, nVIDIA is governed by PR whereas ATI is guided by true willingness to improve. Result: nVIDIA = Intel. ATI = AMD.
  14.

    fanATIcs vs NVidiots

    That's dumb. That's like claiming your Pentium 2 is better than someone else's Athlon FX 53 :lol:!
  15.

    Review of Hercules 3D prophet 9600XT

    The low-k dielectric process and higher clock speeds. But this only holds at low resolutions. At higher resolutions (> 1024x768), the 128-bit memory bus becomes a bandwidth bottleneck and the 9600 XT starts falling behind rapidly (see the bandwidth sketch after this list).
  16.

    fanATIcs vs NVidiots

    Mayur, if you don't like the forums, get the "fish" out of here. Hmm... now that we've gotten that out of the way... Actually, the Radeon 9600 outperforms even the GeForce FX 5900 Ultra when taking DX9 into consideration. Oh yes. They're absolutely optimized for every game out there! Most...
  17.

    fanATIcs vs NVidiots

    HeHeHe... yeah, gotta chill out... :lol:! It's just that this chap denies the evidence that ATI is currently doing better than nVIDIA, even after it's been tattooed onto his eyeballs :lol:!
  18.

    The Big Question ?????

    The back buffer is an off-screen surface to which the scene is rendered. The contents of the back and front buffers are then swapped to display the finished frame on the monitor (a minimal double-buffering sketch follows after this list).
  19.

    fanATIcs vs NVidiots

    'Cause nVIDIA generally drops to partial precision (16-bit) to get decent framerates, whereas ATI runs all shading calculations at 24-bit precision. That, along with shader replacements, leads to the overall degradation in image quality (see the precision sketch after this list). Not to mention ATI's superior anti-aliasing and anisotropic...
  20.

    fanATIcs vs NVidiots

    Or perhaps nVIDIA needs too much time to incorporate all their shader replacements... one thousand shaders per game, and 100 such games on the market... that means the nVIDIA boys have to write 100,000 replacement shaders... and that takes a lot of time :lol:!
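
A small worked example for the bandwidth point in result 15. This is a sketch, not a benchmark: the bus widths and effective (DDR) memory clocks below are approximate reference figures for the Radeon 9600 XT and 9800 Pro, and the bandwidth_gbs helper is made up purely for illustration.

    #include <cstdio>

    // Peak memory bandwidth in GB/s from bus width (bits) and effective
    // (DDR) memory clock in MHz: bytes per transfer times transfers per second.
    double bandwidth_gbs(int bus_bits, int effective_mhz) {
        return (bus_bits / 8.0) * effective_mhz * 1e6 / 1e9;
    }

    int main() {
        // Approximate reference clocks, used here only for illustration.
        std::printf("9600 XT  (128-bit @ 600 MHz eff.): %.1f GB/s\n",
                    bandwidth_gbs(128, 600));
        std::printf("9800 Pro (256-bit @ 680 MHz eff.): %.1f GB/s\n",
                    bandwidth_gbs(256, 680));
        return 0;
    }

Roughly 9.6 GB/s versus about 21.8 GB/s, which is why the 128-bit bus becomes the limiting factor once the resolution, and with it the pixel and texture traffic, goes up.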
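
The back-buffer description in result 18, as a minimal sketch. This is not real graphics-API code: the Framebuffer type and the loop below merely simulate rendering off-screen and then swapping the two surfaces, which is the idea behind double buffering.

    #include <array>
    #include <cstdio>
    #include <utility>

    // Hypothetical 16-pixel grayscale "framebuffer", purely for illustration.
    using Framebuffer = std::array<unsigned char, 16>;

    int main() {
        Framebuffer front{};  // surface the "monitor" is currently displaying
        Framebuffer back{};   // off-screen surface the scene is rendered into

        for (int frame = 0; frame < 3; ++frame) {
            // 1. Render the new frame into the back buffer (off-screen).
            back.fill(static_cast<unsigned char>(frame * 80));

            // 2. Present: swap back and front so the finished frame becomes visible.
            std::swap(front, back);

            // 3. The front buffer is what the display scans out.
            std::printf("frame %d: front buffer pixel value = %u\n",
                        frame, static_cast<unsigned>(front[0]));
        }
        return 0;
    }

In a real API such as Direct3D or OpenGL the swap happens inside the Present/SwapBuffers call, but the principle is the same: you never draw directly onto the surface the monitor is currently showing.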
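
Result 19's precision argument, illustrated with a rough model. The quantize helper and the sample value below are invented for this sketch; it simply rounds a number to different significand widths (11 bits roughly FP16 partial precision, 17 bits roughly the R300's FP24, 24 bits roughly FP32) to show how much accuracy the 16-bit path gives away. It ignores exponent-range differences, so it is only an approximation.

    #include <cmath>
    #include <cstdio>

    // Round x to 'bits' significant bits of mantissa. This is a crude model of
    // reduced floating-point precision; exponent range is not modelled.
    double quantize(double x, int bits) {
        int exp;
        double mant = std::frexp(x, &exp);        // x = mant * 2^exp, mant in [0.5, 1)
        double scale = std::ldexp(1.0, bits);     // 2^bits
        mant = std::round(mant * scale) / scale;  // keep 'bits' bits of the mantissa
        return std::ldexp(mant, exp);
    }

    int main() {
        // A made-up per-pixel quantity, e.g. a texture coordinate in texel units.
        double value = 2047.3371;

        const int widths[] = {11, 17, 24};        // ~FP16, ~FP24, ~FP32 significands
        for (int bits : widths) {
            double q = quantize(value, bits);
            std::printf("%2d-bit significand: %.6f (error %+.6f)\n",
                        bits, q, q - value);
        }
        return 0;
    }

At 11 bits the sample value above loses roughly a third of a texel, the kind of error that can show up as banding or shimmering once a shader chains several such operations; at 17 and 24 bits the error is orders of magnitude smaller.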