X1800XT vs. 7800 GTX


venkat1605

Broken In
Hey Icecoolz, I am really sorry for what happened between us. I personally wanted to ask you a question. My brother is coming back from Canada and I have asked him to get a graphics card. He personally recommended the X1800XT (512 MB), which is to be released shortly, but I also have the 7800 GTX in mind. Which one should I go for? If it's the 7800 GTX, I would use it in SLI, and if it's the X1800XT, I would use it in CrossFire. Hope you reply soon. This is open to others as well.
 

enoonmai

Cyborg Agent
The X1800XT is better than the 7800 GTX, whether in CrossFire mode or even as a single card, but things aren't really crystal clear. The ATI cards are better in D3D games, while the Nvidia cards are better in OpenGL games, although the performance difference between them otherwise isn't all that great. The X1800's 512-bit internal memory architecture is, of course, vastly superior. You've got to get yourself a CrossFire system, and a top-of-the-line CPU like an FX-57 or a dual-core 4400+, to push these babies to their max.

Just wondering, casually of course: is it possible to use two dual-GPU 7800 GT cards in SLI mode, kind of like dual SLI? Just wondering, and a bit too lazy to Google.
 

sahil_blues

In the zone
@venkat: you could probably remove the "Attn Icecoolz" part and let the title be just "X1800XT vs. 7800 GTX". I think it'll help you get a lot more replies. I hope you don't mind; it is only a suggestion.

@enoonmai: did you check out my link?
 

enoonmai

Cyborg Agent
@Sahil: Yes, I am well aware of the benchmarks that thrash the 7800 GTXs. However, it's undeniable that Nvidia has an edge over ATI in OpenGL games, and ATI has an edge over Nvidia in D3D games. Most new game developers and publishers are "aligning" with one of these companies and optimizing their code to run better on specific cards. For example, no matter what you try, you cannot run Doom 3 as well on an X800 as you can on a 6800, and vice versa for HL2. There will always be a subtle, but definitely noticeable, difference. In the end, benchmarks are nothing but "show off" numbers. True, they are indicative of performance to a degree, but a higher score does not necessarily translate to better performance, especially when it comes to "hardware optimized" games like Doom 3 and HL2 that align themselves with a particular vendor.

In the end, it really doesn't matter how much the X1800 scores over the 7800s, because Nvidia has already won this round. Apart from a magnificent launch, both on paper and in actual availability, the 7800 series cards are very widely available everywhere, unlike the phantom-series ATI cards, which are available only in limited quantities. A customer might just be tempted to go in for a 7800 GTX SLI setup rather than wait for an X1800XT CrossFire setup and sit and watch while his friends play at glorious resolutions.
 

gxsaurav

Guest
Haven't you read all the benchmarks posted all over the internet?

The 7800GTX has won this round already: it's available today, at a price far lower than even the X1800XL. The X1600XT was launched to beat the 6600GT, but it costs as much as a 6800 non-Ultra ($250), which beats it by a fair margin.

F.E.A.R. is out now, which is DirectX 9.0c based, but since there are no X1800XTs to be found, people have no choice but to buy a 7800GTX to play it at max settings.

Quake 4 is out now too, which is OpenGL based and plays best on the 7800GTX.

One thing to note: none of the benchmarks are completely fair. They pit a 512 MB X1800XT against a 256 MB 7800GTX, because of which the X1800XT beats it by about 10 frames at most. At the most common playing settings of 1024x768 with AA and aniso, the 7800GTX beats the X1800XT easily, despite having less RAM.

@enoonmai:

The Matrox Parhelia already had a 512-bit internal interface a long time back, which wasn't used properly at that time, so it came out as a failure. I guess this might happen with ATI too.
 

Nemesis

Wise Old Owl
From these two synthetic and four "real world" game engines you can see that ATI has taken the 16 pixel shader and 8 vertex shader pipelines from the X850 generation and massively overhauled them into a highly efficient system. In many of the tests at 10x7 and 16x12 resolutions, the 24 pixel and 8 vertex pipelines of the NVIDIA GeForce 7800 GTX could not keep up with the ATI X1800XT.

One of the most drastic leaps was in 3DMark 2005, where the 16x12 4xAA 8xAF score led the NVIDIA 7800 GTX by over 1,100 marks.

Source: THG

While the ATi card has more memory, you seem to conveniently ignore the fact that it uses only 16 pipelines as compared with 24 for the 7800GTX. Besides, when you can afford such high-end cards, a price difference of $50-100 makes no difference if you can get better performance.
 

blade_runner

Cyborg Agent
Between the X1800XT and 7800GTX, I would suggest going for the X1800XT series. It's clearly faster, with amazing IQ, angle-independent AF, HDR+AA and Avivo. Plus, remember the fact that it beats the 7800GTX series without even having properly optimised drivers. Yes, the drivers haven't been optimised as yet, since it is a fairly new architecture; once the drivers mature you should see a significant increase in performance. I'll ask the same question once the Catalyst 6 drivers are out ;) we'll see who's faster then.

Nemesis pointed out this thread to me and I couldn't resist. :D

EDIT: Also, last week a small registry tweak gave a 30% increase to X1800 cards in Doom 3 with AA at higher resolutions. This is just the tip of the iceberg here. ;)
 

deathvirus_me

Wise Old Owl
7800GTX any day: much better architecture, and the extra pipes surely help when you turn on AA and AF.

Moreover, having 512 MB of RAM over a 256-bit memory interface doesn't give a big improvement; if it did, the 512 MB 6800 Ultra would have been better than the 7800GT.
 

AlphaOmega

Journeyman
Who told you guys that the X1800 'beats' the 7800? Everywhere I found a comparison, people are saying that the new card from ATi is a disappointment (compared to the 7800). While it finally brings ATi up to par with nVIDIA in terms of feature set, the performance does not make it a clear-cut winner.
Below are some numbers from AnandTech:
*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9341.png
Here the 7800 (both) is a clear winner, though Doom3/OpenGL has always been kinder to nVIDIA.

*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9342.png
This one is interesting. The 7800 actually takes the lead in DoD (albeit an insignificant one), but methinks an nVIDIA card one-upping an ATi one in a game based on Source is still impressive. A driver update might change that.

*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9343.png
Far Cry is still one tough engine, though, as far as I know, there aren't any games coming out based on it. ATi wins here.

*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9344.png
SC:CT is the newest game here and supports every new feature there is. ATi fairly whoops nVIDIA's @ss. Though the situation changes when you push higher, as AnandTech did later. Read on...

All tests below are to check the future validity of these cards. The kid gloves are off:
*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9345.png
nVIDIA is still the king here. No surprise.

*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9346.png
This I found intriguing: nVIDIA actually pulls away when the settings go through the stratosphere! In Source! Did Valve have a falling out with ATi?

*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9347.png
ATi again wins, but by a minuscule difference of 0.2 FPS. With the increase in resolution, nVIDIA catches up. The difference is so insignificant that it can't really be called a difference. And the thing is, since ATi had a sizable lead at lower resolutions, losing ground at higher resolutions is a major drawback.


*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9348.png
Same as with Far Cry: ATi lets nVIDIA catch up at this resolution. Not good for ATi.

*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9349.png
nVIDIA wins here, but it probably could have gone either way.

So, going by AnandTech, the X1800 is not quite the 7800 killer ATi has made it out to be. Especially, losing ground at higher settings is a big blow to ATi, since it raises questions of future-proofing.

Please note: the X1800 wins at 3DMark05, with a lead of approximately 700-1000 marks. However, going by synthetic benchmarks is very risky. My rig (though it barely qualifies as a 'rig' :lol: ), a P4 2.4C (with HT), i865GBF, Kingston 2x256MB PC-3200 in dual channel, XFX nVIDIA 6600GT 128MB DDR3 and a Seagate 80GB 7200rpm, gets around 3400-3450 in 3DMark05. This is really strange, since a system with an AMD Athlon 64 3400+, 1GB of Corsair RAM and an XFX 6600GT 128MB DDR3 gets approx. 3171 marks. So does that make my system better? Of course not, as in real-world tests that system hangs mine out to dry.

enoonmai: I don't recall reading anywhere that the X1800 is a 512-bit card, but if you say so... While a wider memory interface is never a bad thing, it does not guarantee a better card. Case in point: I was reading a comparison between the 6600GT 128MB (with a 128-bit interface) and the X800 GTO 256MB (with a 256-bit interface; some vendors simply call it the X800GT). The 6600GT beat that card in most tests and came out the winner, with the exception of 3DMark05, where it lagged behind a bit. The core should have enough throughput to actually make use of the bandwidth provided by a 512-bit interface. And the X1800 features 'only' 16 pixel pipes, compared to the 24 and 20 the 7800GTX and 7800GT have, respectively. The higher the number of pixel pipes (among other things), the higher the memory bandwidth requirements.
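To put some rough numbers on that, here's a back-of-the-envelope sketch. Peak memory bandwidth is just bus width times the effective memory clock; the clock figures below are the commonly quoted reference specs of the day (my assumption, not taken from this thread), so treat the output as illustrative only:

[CODE]
# Rough sketch: peak memory bandwidth = (bus width in bits / 8) * effective clock.
# Clock figures are approximate reference specs, assumed for illustration.

def peak_bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

cards = {
    "6600GT (128-bit, ~1000 MHz effective)": (128, 1000),
    "X800 GTO (256-bit, ~980 MHz effective)": (256, 980),
    "7800 GTX (256-bit, ~1200 MHz effective)": (256, 1200),
    "X1800XT (256-bit, ~1500 MHz effective)": (256, 1500),
}

for name, (width, clock) in cards.items():
    print(f"{name}: {peak_bandwidth_gbs(width, clock):.1f} GB/s")
[/CODE]

The point being: the X800 GTO has roughly double the 6600GT's raw bandwidth (~31 GB/s vs ~16 GB/s) yet still loses most tests, because its core can't turn that bandwidth into pixels.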

ATi's CrossFire technology is nowhere near SLI. You can't just pick up two ATi cards and put them together to get a CrossFire system, like you can with nVIDIA's SLI. You need a special CrossFire Edition master card that acts as the compositing engine and output device for the system. This card will not only cost more, but you will have a hard time finding it. Moreover, with CrossFire in place you get limited to a 1600x1200 resolution at 60Hz. If you are putting two extremely expensive cards together, you wouldn't want to be limited in any way, especially to a tear-my-eyes-out 60Hz refresh.
The initial versions of nVIDIA's SLI supported either a single card running at x16 PCIe or a two-card SLI setup with both cards running at x8 PCIe. This is what CrossFire offers even now, when nVIDIA has already updated SLI so that both PCIe slots run at a full x16. This will be a drawback for CrossFire, since there is no way an x8 PCIe link can meet the full data appetite of a card like the 7800, let alone a 512-bit card. Also, as far as I know, CrossFire-enabled mobos are quite hard to find, but I could be wrong.
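For perspective, here are the raw slot numbers, a minimal sketch assuming PCIe 1.x (the standard of the day). Note that slot bandwidth is a different thing from the card's local memory bandwidth, which is an order of magnitude higher:

[CODE]
# PCIe 1.x: each lane runs at 2.5 GT/s with 8b/10b encoding,
# giving about 250 MB/s of usable bandwidth per lane, per direction.
MB_PER_LANE = 250

for lanes in (8, 16):
    gbs = lanes * MB_PER_LANE / 1000
    print(f"PCIe x{lanes}: {gbs:.1f} GB/s per direction")
# PCIe x8:  2.0 GB/s per direction
# PCIe x16: 4.0 GB/s per direction
[/CODE]

So a dual-x8 setup halves the slot bandwidth available to each card compared with dual-x16; how much that matters in practice depends on how much data actually crosses the bus rather than staying in local video memory.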

So, in my opinion, the 7800GTX (single or SLI) is the better choice for today and tomorrow, despite the X1800 being a very impressive card. The future might be favourable to the 7800 cards, with their higher pixel pipe count.

I wish ATi could have given us the same chip they developed for the XBOX 360, with its pixel and vertex pipelines that can be reassigned on demand. Now THAT would have given nVIDIA nightmares.
 

goobimama

 Macboy
The thing about the 7800 is that it's available. Nvidia has made sure there is proper distribution of its new card. The X1800, God alone knows when India will be able to get it...
 

AlphaOmega

Journeyman
goobimama said:
The thing about the 7800 is that it's available. Nvidia has made sure there is proper distribution of its new card. The X1800, God alone knows when India will be able to get it...

The availability of the 7800 is a plus point for nVIDIA. But if Venkat chooses the X1800 over the 7800 and his brother can get it for him, then availability becomes a moot point, at least for this discussion.
 

Major-Minor

Broken In
AlphaOmega said:
enoonmai: I don't recall reading anywhere that the X1800 is a 512-bit card, but if you say so...


The X1800 has a 512-bit internal memory ring bus; externally it still has a 256-bit memory interface.

Oh, and if anyone is interested: I just found out yesterday that BigByte has stopped stocking XFX cards. They will now be selling only their own brand of cards, the BIG brand, in case you were wondering.
I also got the rates for both the brands (inclusive of taxes) -
XFX 7800GTX - 31k
XFX 7800GT - 26k
(Rates from Rashi Peripherals)

BIG 7800GTX - 29k
BIG 7800GT - 24k
(Rates from BigByte)

I was also told by Mr. Vikas at BigByte that the BIG 7600 should probably be available in Dec.
 

funkymonkey

Journeyman
First let nVidia announce the 7600, and then we will see ;)
About the 7800 series:
it has its own set of issues. Both cards are more than powerful enough to run your games at max settings. I own a 7800GT, and the same alpha-texture problem that existed in the GF6 series is there with the GF7 series.
What does that mean?
Well, where a game uses alpha textures to render shadows, there will be huge problems with the 7800GTX or GT.
The shadows don't get rendered correctly and appear as corrupted, blocky textures.
The same things are rendered beautifully on any ATI card from the 9700 to the X1800 series.
That's disappointing to see. IQ-wise, ATI has the upper hand at this moment. It's your choice what to pick.
Given the choice between the 7800GTX and the X1800XT, I would pick the X1800XT any given day.
 

gxsaurav

Guest
Even I am waiting for the 7600 series, since it supports OpenGL 2.0, which I need. At last, a viable upgrade from my 5900XT.
 

AlphaOmega

Journeyman
gxsaurav said:
Even I am waiting for the 7600 series, since it supports OpenGL 2.0, which I need. At last, a viable upgrade from my 5900XT.


Doesn't the GF 6x00 series support OpenGL 2.0? I think it does, as my 6600GT box says so, and so do the XFX, eVGA and BioStar sites.
*www.biostar.com.tw/products/vga/GeForce_6600_Series/index.php3
*www.xpcgear.com/evga6600tx.html
*www.xfxforce.com/web/product/listConfigurationDetails.jspa?productConfigurationId=1084


funkymonkey said:
Well, where a game uses alpha textures to render shadows, there will be huge problems
Well, the 7800 is the first, and currently the only, card to support AA on alpha textures. Traditionally, cards can only remove the jaggies from the edges of polygons; the new transparency AA can remove them from inside textures, where they are transparent.
The blocky shadow problem was also affecting ATi cards, at least in UT2004, as far as I know. That problem was removed from BF2 by using ForceWare 77 or higher. ATi ruled the roost in IQ during the time of the Radeon 9x00 series; even the lowly 9200 was visibly better than comparable nVIDIA cards (in Doom3, which I have seen on a 9200 and a 5200/5600). But now, with the 6 and 7 series, ATi and nVIDIA are more or less on par. Also, pushing quality settings, AA and AF will be easier on the 7800, due to its 8 extra pixel pipes.
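To make the idea concrete, here's a tiny software sketch of my own (purely illustrative, nothing like the actual hardware implementation, and the "texture" is a made-up circular cutout). A classic alpha test makes one binary keep/discard decision per pixel, which is exactly what produces jaggies inside a cutout texture; transparency AA instead evaluates that test at several sub-pixel positions and blends by coverage:

[CODE]
# Toy model of alpha-test aliasing vs. transparency AA.
# The "texture" is a hypothetical cutout: alpha 1 inside a disc, 0 outside.

def alpha(x: float, y: float) -> float:
    return 1.0 if (x - 0.5) ** 2 + (y - 0.5) ** 2 < 0.16 else 0.0

def alpha_test(x: float, y: float, threshold: float = 0.5) -> float:
    # Classic alpha test: one sample, hard keep/discard -> jaggies.
    return 1.0 if alpha(x, y) > threshold else 0.0

def transparency_aa(x0: float, y: float, px: float, grid: int = 4) -> float:
    # Run the same test at grid*grid sub-pixel offsets inside the pixel
    # and average, so edge pixels get fractional coverage instead of a step.
    hits = sum(
        alpha_test(x0 + (i + 0.5) * px / grid, y + (j + 0.5) * px / grid)
        for i in range(grid) for j in range(grid)
    )
    return hits / (grid * grid)

px = 1.0 / 16  # pixel size; render one 16-pixel row across the cutout's edge
for i in range(16):
    x0 = i * px
    single = alpha_test(x0 + px / 2, 0.5)  # one sample at the pixel centre
    smooth = transparency_aa(x0, 0.5, px)  # averaged sub-pixel coverage
    print(f"pixel {i:2d}: alpha test {single:.0f}  transparency AA {smooth:.2f}")
[/CODE]

The first column flips abruptly from 0 to 1 at the cutout boundary, while the second eases through fractional coverage values; that easing is the smoothing you see on fences and foliage with transparency AA enabled.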
 

asdf1223

Journeyman
Comparing AnandTech's and Tom's benchmarks, the 7800GTX is the fastest (1600x1200, no AA), but it seems the X1800XT takes the lead with AA/AF turned all the way up. But ATI's problems are the availability of X1800XTs and master cards, power consumption, and the fact that nVidia can beat them any day with an Ultra version.
 

AlphaOmega

Journeyman
asdf1223 said:
Comparing AnandTech's and Tom's benchmarks, the 7800GTX is the fastest (1600x1200, no AA), but it seems the X1800XT takes the lead with AA/AF turned all the way up. But ATI's problems are the availability of X1800XTs and master cards, power consumption, and the fact that nVidia can beat them any day with an Ultra version.

ATi has really created a super-efficient engine, something nVIDIA should try to emulate. But I wonder how far an engine that is merely efficient can go against an engine that is much more powerful, with 50% more pixel pipes, especially when games are getting more and more shader-intensive, like the upcoming Unreal Engine 3.
This can be seen when the resolution is really cranked up, which is even more stressful than enabling AA: when AnandTech sets it to 2048x1536, the 7800 either increases its lead or catches up to the X1800.

Maybe, in the following generation, ATi will marry its efficiency to a powerful engine, and then we will have another case like AMD and Intel, where AMD pulled ahead on sheer efficiency during the Athlon XP era but could not keep up with the amped-up Northwood later. Then AMD made a core that was not only efficient but also super powerful, the Athlon 64. Intel is still reeling in the aftermath...
IMO, ATi already has such a core, the XBOX 360 chip. I am betting that the only thing holding ATi back from releasing it now is some kind of agreement with Microsoft. The next-gen ATi card for PCs will most likely carry that architecture, but that is still 6 months away :(
 

Nemesis

Wise Old Owl
Correct me if I'm wrong, but aren't ATi already working on the R580? If that's the case, then we can expect to see a terrific card from ATi that builds on the R520. As far as I know, the 360 is just using a modified R520 - even Nintendo will be using a custom R520 chip. I doubt ATi will be allowed to release these custom chips for the PC market.
 