fanATIcs vs NVidiots

Status
Not open for further replies.

crusader77

Broken In
Forget it, ANIDEX, it seems you are an ATI die-hard fan.

dude, "READ" the entire thread,, he's not just a fan, there are enough reasons and explanations and proofs alrite... he knows what he is talkin about .. ok?
Also, i think we should end this here, i think everything under this topic has been covered; )
 

gxsaurav

Guest
I don't have the source code of your app, so I can't say it is not optimised for Radeons.
 

plasmafire

Journeyman
OK, you can have the source code... I think.

Well, aren't you the guy who photoshopped the FPS and the name of your card? Ooh, I remember you now...

@mods: well, I believe the warning and the editing were needed, but banning? LOL, looks like POTA to me... and to your PAYING customers too??

Well, I'm getting myself a 9800 Pro from next month's salary.

NVIDIA = the poor man's choice, because it runs games playably, but at 16-bit precision, so more FPS. About 7-8k.
ATI = runs games fully, with all 24 bits of glory. Buy it if you've got the money for a really fast card. About 10k.
 

it_waaznt_me

Coming back to life ..
Raaabo said:
NO PERSONAL ATTACKS | NO BAD LANGUAGE

Batty: Don't edit anything, let them do it themselves or be banned! They will learn to correct their mistakes themselves!

OK, I got it. But can I issue warnings?
 

gamefreak14

Journeyman
crackshot said:
nVIDIA wins the battle everywhere, be it motherboard chipsets or GFX cards.
ATI's are also good, but 6800- or 6600-based cards will easily beat X800- or X700-based cards.

The reason: better support in both hardware and software. nVIDIA puts updates online quickly. It's a different matter that broadband Internet is still a dream for India, whereas in other countries it's already old news.
It's something like AMD (= nVIDIA) winning over Intel (= ATI). But don't forget, Intel was once the ruler of the desktop in performance and numbers as well; the same is the case with ATI. When it newly launched the Radeon, despite the tough competition from the GeForce series, it churned out not only higher FPS but also better quality.

It's just a matter of time.

Good... finally! I hope we've wrapped up this discussion... I mean, argument. :D
 

gxsaurav

Guest
Go here: now anidex is claiming that NVIDIA & FM are cheating. How the hell? Look at the links I gave below; X-bit Labs has proved it, and ATI has publicly stated that they are also doing optimisations.

Now he will say that MS, NVIDIA & FM are all working together against the fanATIcs.

The same old saying:

"The grapes are sour."
 

anidex

Broken In
I don't have the source code of your app, so I can't say it is not optimised for Radeons.
Firstly, there is no such thing as optimising for Radeon cards, because:

1. I've used DirectX (unlike your stinky OpenGL, it's a standard spec) and all shaders are written to the Shader Model 2.0 spec.
2. Radeon cards run all shaders at 24-bit FP precision, unlike the FX cards that slyly shift to 16-bit precision.

Secondly, it is kind of open source. The shader file is included along with the app.

Thirdly, everyone knows that you reduced the geometry count of the scene and perhaps changed all the float variables to half variables to improve performance and further edited your dismal scores with Photoshop to show something decent. So, you really can't point a finger.
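
For illustration, here is a minimal Shader Model 2.0 sketch of the kind of float-to-half edit being alleged. This is a hypothetical shader, not the one shipped with the app:

Code:
// Minimal SM 2.0 pixel shader sketch -- illustrative only, not the shader
// included with the app. Written with float, the maths carries no partial-
// precision hint, so a Radeon runs it through its fixed 24-bit FP pipeline
// and an FX card is expected to run it at 32-bit.
sampler2D baseMap : register(s0);

float4 PS_Full(float2 uv : TEXCOORD0, float3 light : TEXCOORD1) : COLOR
{
    float4 albedo = tex2D(baseMap, uv);
    float  ndotl  = saturate(dot(normalize(light), float3(0.0f, 0.0f, 1.0f)));
    return albedo * ndotl;
}

// The alleged edit: switching float to half marks the arithmetic as partial
// precision, which GeForce FX hardware may run at 16-bit FP for extra speed.
// Radeon hardware still runs it at 24-bit, so only one vendor gains frame
// rate from the change.
half4 PS_Half(float2 uv : TEXCOORD0, half3 light : TEXCOORD1) : COLOR
{
    half4 albedo = tex2D(baseMap, uv);
    half  ndotl  = saturate(dot(normalize(light), half3(0.0, 0.0, 1.0)));
    return albedo * ndotl;
}

The rendered image is usually indistinguishable for simple lighting like this, which is exactly why a score comparison alone can't settle the argument.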
 

gxsaurav

Guest
If FX cards revert to 16-bit precision, then I should have got better performance than the Radeon.
 

anidex

Broken In
If FX cards revert to 16-bit precision, then I should have got better performance than the Radeon.
In case you forgot, the old Half-Life 2 benchmarks revealed that the GeForce FX 5900 Ultra barely caught up with the Radeon 9600 PRO even when using partial precision shaders and got completely owned at full precision. What's your point?

*tech-report.com/etc/2003q3/hl2bench/index.x?pg=2
 

plasmafire

Journeyman
I get it... before the Detonator drivers are released for a game, NVIDIA's performance sucks in that game. Then suddenly, with the release of newer drivers, there is a performance boost in that game.

And the performance boost comes from reducing the quality of the game...

Hey, I'm enlightened. BTW, I have an FX 5700 Pure... will sell it after 3rd Oct. Any takers?
 

plasmafire

Journeyman
Thanks for the link, GX.

Quote: anidex Sun Sep 26, 2004 4:48 pm

In our testing, all identified detection mechanisms stopped working when we altered the
benchmark code just trivially and without changing any of the actual benchmark workload. With
this altered benchmark, NVIDIA’s certain products had a performance drop of as much as
24.1% while competition’s products performance drop stayed within the margin of error of 3%.

Aren’t These Cheats Just Optimizations That Also Benefit General Game Play Performance?
No. There are two reasons.
Firstly, these driver cheats increase benchmark performance at the expense of image quality.
Only the user and the game developer should decide how a game is meant to be experienced,
and not the hardware developer. An act by hardware developer to force a different experience
than the developer or the user intended, is an act that may mislead consumers, the OEMs and the
media who look to our benchmark to help them make purchase decisions.
Secondly, in well-designed benchmarks like 3DMark03, all cards are instructed to do the same
amount of work. Artificially reducing one card’s workload, for example, by using pre-set clip planes
or using a lower precision shader against the program’s instructions, is only aimed to artificially
manipulate the benchmark test result. Please note, that the cheating described here is totally
different from optimization. Optimizing the driver code to increase efficiency is a technique often
used to enhance game performance and carries greater legitimacy, since the rendered image is
exactly what the developer intended.

What Are The Identified Cheats?

1. The loading screen of the 3DMark03 test is detected by the driver. This is used by the driver
to disregard the back buffer clear command that 3DMark03 gives. This incorrectly reduces the
workload. However, if the loading screen is rendered in a different manner, the driver seems
to fail to detect 3DMark03, and performs the back buffer clear command as instructed.
2. A vertex shader used in game test 2 (P_Pointsprite.vsh) is detected by the driver. In this case
the driver uses instructions contained in the driver to determine when to obey the back buffer
clear command and when not to. If the back buffer would not be cleared at all in game test 2,
the stars in the view of outer space in some cameras would appear smeared as have been
reported in the articles mentioned earlier. Back buffer clearing is turned off and on again so
that the back buffer is cleared only when the default benchmark cameras show outer space.
In free camera mode one can keep the camera outside the spaceship through the entire test,
and see how the sky smearing is turned on and off.
3. A vertex shader used in game test 4 (M_HDRsky.vsh) is detected. In this case the driver adds
two static clipping planes to reduce the workload. The clipping planes are placed so that the
sky is cut out just beyond what is visible in the default camera angles. Again, using the free
camera one can look at the sky to see it abruptly cut off. Screenshot of this view was also
reported in the ExtremeTech and Beyond3D articles. This cheat was introduced in the 43.51
drivers as far as we know.
4. In game test 4, the water pixel shader (M_Water.psh) is detected. The driver uses this
detection to artificially achieve a large performance boost - more than doubling the early
frame rate on some systems. In our inspection we noticed a difference in the rendering when
compared either to the DirectX reference rasterizer or to those of other hardware. It appears
the water shader is being totally discarded and replaced with an alternative more efficient
shader implemented in the drivers themselves. The drivers produce a similar looking
rendering, but not an identical one.
5. In game test 4 there is detection of a pixel shader (m_HDRSky.psh). Again it appears the
shader is being totally discarded and replaced with an alternative more efficient shader in a
similar fashion to the water pixel shader above. The rendering looks similar, but it is not
identical.
6. A vertex shader (G_MetalCubeLit.vsh) is detected in game test 1. Preventing this detection
proved to reduce the frame rate with these drivers, but we have not yet determined the cause.
7. A vertex shader in game test 3 (G_PaintBaked.vsh) is detected, and preventing this detection
drops the scores with these drivers. This cheat causes the back buffer clearing to be
disregarded; we are not yet aware of any other cheats.
8. The vertex and pixel shaders used in the 3DMark03 feature tests are also detected by the
driver. When we prevented this detection, the performance dropped by more than a factor of
two in the 2.0 pixel shader test.

What Is the Performance Difference Due to These Cheats?
A test system with GeForceFX 5900 Ultra and the 44.03 drivers gets 5806 3DMarks with
3DMark03 build 320.
The new build 330 of 3DMark03 in which 44.03 drivers cannot identify 3DMark03 or the tests in
that build gets 4679 3DMarks – a 24.1% drop.

What Happens Now?
When 3DMark03 is altered slightly, NVIDIA drivers do not recognize 3DMark03 anymore, and the
performance drops. The same slightly altered 3DMark03 version can be run on other hardware
and the results remains the same.
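
To make the "shader replacement" cheats (points 4 and 5 above) concrete, here is a hypothetical HLSL sketch of the general technique the report describes. It is not 3DMark03's actual M_Water.psh, and it is not code from any driver:

Code:
// Illustrative only. The audit above says the driver recognises a known pixel
// shader and silently swaps in a cheaper one that looks similar but is not
// identical to what the application submitted.
sampler2D reflectionMap : register(s0);
sampler2D refractionMap : register(s1);
sampler2D normalMap     : register(s2);

// Roughly what an application might submit: a per-pixel normal fetch perturbs
// both the reflection and refraction lookups.
float4 PS_WaterAsSubmitted(float2 uv : TEXCOORD0) : COLOR
{
    float3 n    = tex2D(normalMap, uv).xyz * 2.0f - 1.0f;
    float4 refl = tex2D(reflectionMap, uv + n.xy * 0.05f);
    float4 refr = tex2D(refractionMap, uv + n.xy * 0.03f);
    return lerp(refr, refl, 0.5f);
}

// Roughly what a driver-side replacement could look like: the perturbation and
// one texture fetch are dropped. Fewer instructions means a higher frame rate,
// and the output is similar but not identical -- which is why comparing frames
// against the DirectX reference rasterizer exposes the substitution.
float4 PS_WaterReplaced(float2 uv : TEXCOORD0) : COLOR
{
    float4 refl = tex2D(reflectionMap, uv);
    float4 refr = tex2D(refractionMap, uv);
    return lerp(refr, refl, 0.5f);
}

This also shows why a trivial edit to the benchmark defeats the cheat: the driver keys on the exact shader it expects, so any change to the submitted code breaks the match and the original, heavier shader runs again.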
 

theraven

Technomancer
it_waaznt_me said:
Raaabo said:
NO PERSONAL ATTACKS | NO BAD LANGUAGE

Batty: Don't edit anything, let them do it themselves or be banned! They will learn to correct their mistakes themselves!

OK, I got it. But can I issue warnings?
And I quote Raaabo when I say this:
remember guys... a warning from a member is also considered a warning.
Though this is really getting out of hand and spreading to other topics as well.
 

AlphaOmega

Journeyman
Man, I really don’t get people who swear by certain companies and brands no matter what, be it Intel vs. AMD or (as in this case), nVIDIA vs. ATI.

What you guys (the ones fighting) don’t realize is that any two companies in competition will invariably gain and lose the top spot. We have seen it happen over and over again. It is more apparent in the hardware industry because of their incredibly small product cycles.

Take the example of Intel and AMD. Intel was unchallenged till the time of the P3. The P3 was a good product; however, it was beaten by the Athlon. In the next generation, the P4 (I am not considering the Willamette, only the Northwood) proved to be superior (at least in most benchmarks) to the Athlon XP. Then again, in the current generation, the AMD Athlon 64 FX kicks the Prescott’s @$$.

Similarly, in the graphics industry, nVIDIA was totally unchallenged up till the time of the GeForce4 Ti (despite some competition from the Radeon 8500). Then it got complacent, and ATI was able to steal its thunder with the Radeon 9700. Due to nVIDIA resting on its laurels, its FX chip was unable to beat the Radeon 9xxx series (except the FX 5200 vs. the 9200). Now the 6800 chip beats the X800 chip in almost all tests (as proven by multiple sites). In fact, the X600 is nothing more than an updated version of the 9600 chip.

My point is that no company can dish out market leading products for long, because the other guys are not fools. The necessity of survival dictates that they will try their best to make their next product better than the other company's. A man fights best when he is backed into a corner.

If you consider pure performance, I think it would be foolish to deny that the Radeon 9xxx series is superior to the FX series in every market segment, especially where pixel shader performance is concerned. nVIDIA cards were and are better at OpenGL than ATI cards. While nVIDIA does offer some innovative AA techniques (like Quincunx), ATI’s renderer does not take as big a hit as nVIDIA’s when AA is turned on.

All this has changed with the release of NV40. This core is far superior to anything in the market. While the X800 is a really powerful card, it can’t touch the 6800, with regards to both, raw power and feature set. If you want numbers go over to *www.thetechlounge.com/review.php?directory=xfx_geforce_6800_gt_256&page=6 for Doom3 card comparison or to *www.thetechlounge.com/review.php?directory=xfx_geforce_6800_gt_256&page=7 for HL2 (Counter-Strike: Source) card comparison.

Anidex, while the 9600 is a good card, it is not comparable to a 5900. It is the best card in the mid-range segment, beating the 5600 and 5700. Saying that PS 3.0 is not relevant because no games are coming for it is wrong. When buying anything new, it is wise to look at both today and tomorrow. Anyway, there are games available which support PS 3.0 today, like Far Cry (with patch 1.2). Also, nVIDIA’s drivers are superior to ATI’s. OpenGL is not cr@p. In fact, the almighty Carmack endorses it! DX and OpenGL both have their advantages and disadvantages.

ATI does not have as strong a market presence as nVIDIA, especially in India. Ditto with AMD. People will buy a brand with higher visibility, regardless of performance and/or price. That’s why you will find more FX cards than Radeon 9xxx (despite the latter being superior).

nVIDIA has been cheating in the FX drivers (53 to 56), like in 3DMark03. The guys at Futuremark not only confirmed it, they even released a new build to counter this problem. nVIDIA also cheated with Far Cry: the FX cards got a significant FPS drop if you renamed the FarCry.exe file to something else. This was wrong of nVIDIA. However, ATI also cheated with Quake III benchmarks, dropping the quality slightly to get an FPS boost. I don’t fully remember what happened, but it had to do with the Radeon 8500 (I think).
NOTE: there have been no reports of either nVIDIA or ATI cheating with their new cards.

Moreover, almost the entire industry agrees that the 6800 is faster and better than the X800 (excluding certain cases e.g. Battlefield: Vietnam).
 

tarey_g

Hanging, since 2004..
what the ...........**** is going on!!!!!!!!!



*www.nvnews.net/vbulletin/images/smilies/locked.gif



****=hell :wink:
 

gxsaurav

Guest
Look for yourself: what AlphaOmega is saying is true. No company can dominate the market for long; it was ATI, now it's NVIDIA, but the fanATIcs won't admit it, while I admitted that FX cards don't have performance comparable to the Radeons.

What I hate about ANIDEX is that he says OpenGL is dead for gaming, that it is useless and bad, and that NVIDIA is the only one doing cheating and shader optimisation. As I said earlier, it doesn't matter whether it's shader replacement or optimisation; as long as I get good quality, I will enable them. I don't mind optimisations for my GFX card and core.
 

gxsaurav

Guest
Buddy, you are a coder; not everybody is. You know the difference between real trilinear filtering and bilinear, between 16-bit and 32-bit; not everyone does.

The NVIDIA FX series supports both 32-bit and 16-bit precision, while ATI only supports 24-bit. NVIDIA does 32-bit everywhere, but drops to 16-bit instead of 32-bit where 32-bit is not really required.
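
As a hypothetical sketch of that mixed-precision idea (not taken from any shipped game), the rule of thumb at the time was: half for colour maths that stays in [0, 1], float for texture coordinates and anything position-like:

Code:
// Illustrative only. On GeForce FX hardware the half maths may run at 16-bit
// FP; texture coordinates are kept at float to avoid visible artefacts.
// Radeon hardware ignores the distinction and runs everything at 24-bit.
sampler2D diffuseMap : register(s0);

float4 PS_Mixed(float2 uv : TEXCOORD0,       // coordinates: full precision
                half3 lightColor : COLOR0)   // colour input: half is enough
                : COLOR
{
    half4 albedo = tex2D(diffuseMap, uv);    // colour data stays in [0, 1]
    half3 lit    = albedo.rgb * lightColor;  // colour maths tolerates 16-bit
    return float4(lit, albedo.a);
}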
 

gxsaurav

Guest
Where is ANIDEX now? Underground, I think. It's hard for him to believe that ATI is also doing it. Well, he will soon come back and call me @~$3#S@ and all things like that...

Oh, and NVIDIOT too.
 