Who told you guys that the X1800 'beats' the 7800? Everywhere I found a comparison, people are saying the new card from ATi is a disappointment (compared to the 7800). While it finally brings ATi up to par with nVIDIA in terms of feature set, the performance doesn't make it a clear-cut winner.
Below are some numbers from AnandTech:
*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9341.png
Here the 7800 (both the GTX and GT) is a clear winner, though Doom 3/OpenGL has always been kinder to nVIDIA.
*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9342.png
This one is interesting. The 7800 actually takes the lead in DoD (albeit an insignificant one). But methinks a nVIDIA card one-upping an ATi one in a game based on Source is still impressive. A driver update might change that.
*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9343.png
Far Cry is still one tough engine, though as far as I know there aren't any upcoming games based on it. ATi wins here.
*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9344.png
SC: CT is the newest game here and supports every new feature there is. ATi fairly whoops nVIDIA's @ss. Though the situation changes when you push the settings higher, as AnandTech did later. Read on...
All tests below check how future-proof these cards are. The kid gloves are off:
*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9345.png
nVIDIA is still the king here. No surprise.
*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9346.png
This I found intriguing: nVIDIA actually pulls away when the settings go through the stratosphere! In Source! Did Valve have a falling-out with ATi?
*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9347.png
ATi wins again, but by a minuscule 0.2 FPS, and with the increase in resolution nVIDIA catches up. The gap is so insignificant it can hardly be called a gap. And since ATi had a sizable lead at lower resolutions, losing ground at higher ones is a major drawback.
*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9348.png
Same as with Far Cry: ATi lets nVIDIA catch up at this resolution. Not good for ATi.
*images.anandtech.com/graphs/ati%20radeon%20x1000%20launch_10050581026/9349.png
nVIDIA wins here, but it probably could have gone either way.
So, going by AnandTech, the X1800 is not quite the 7800 killer ATi has made it out to be. Losing ground at higher settings, in particular, is a big blow to ATi, since it raises questions about future-proofing.
Note: the X1800 does win at 3DMark05, with a lead of approx. 700-1000 marks. However, going by synthetic benchmarks is very risky. My rig (though it barely qualifies as a 'rig'): P4 2.4C (with HT), i865GBF, Kingston 2x256MB PC-3200 in dual channel, XFX nVIDIA 6600GT 128MB DDR3, Seagate 80GB 7200rpm. It gets around 3400-3450 in 3DMark05. That's really strange, since a system with an AMD 64 3400+, 1GB of Corsair RAM and an XFX 6600GT 128MB DDR3 gets approx. 3171 marks. So does that make my system better? Of course not; in real-world tests that system hangs mine out to dry.
enoonmai: I don't recall reading anywhere that the X1800 is a 512-bit card, but if you say so... While a wider memory interface is never a bad thing, it does not guarantee a better card. Case in point: I was reading a comparison between the 6600GT 128MB (with a 128-bit interface) and the X800 GTO 256MB (with a 256-bit interface; some vendors simply call it the X800 GT). The 6600GT beat that card in most tests and came out the winner, with the exception of 3DMark05, where it lagged behind a bit. The core needs enough throughput to actually make use of the bandwidth a 512-bit interface provides. And the X1800 features 'only' 16 pixel pipes, compared to the 24 and 20 of the 7800GTX and 7800GT, respectively. The more pixel pipes (among other things), the higher the memory bandwidth requirements.
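To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. The clocks and bus widths are the commonly quoted launch specs as I remember them, so treat them as assumptions rather than gospel:

    # Peak memory bandwidth = bus width (in bytes) x effective memory clock.
    def peak_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
        return (bus_width_bits / 8) * effective_clock_mhz / 1000  # GB/s

    print(peak_bandwidth_gb_s(256, 1500))  # X1800 XT (256-bit, 1.5GHz effective): 48.0
    print(peak_bandwidth_gb_s(256, 1200))  # 7800 GTX (256-bit, 1.2GHz effective): 38.4
    print(peak_bandwidth_gb_s(512, 1500))  # hypothetical 512-bit card: 96.0

Doubling the bus width doubles the peak number, but if the core can't actually consume 96GB/s it's wasted silicon, which is exactly how a 128-bit 6600GT can beat a 256-bit X800 GTO.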
ATi's CrossFire technology is nowhere near SLI. You can't just pick up two ATi cards and put them together to get a CrossFire system, like you can with nVIDIA's SLI. You need a special CrossFire Edition master card, which acts as the compositing engine and output device for the system. That card will not only cost more, you will also have a hard time finding it. Moreover, with CrossFire in place you are limited to 1600x1200 at 60Hz. If you are putting two extremely expensive cards together, you don't want to be limited in any way, especially to a tear-my-eyes-out 60Hz refresh.
The initial version of nVIDIA's SLI supported either a single card running at PCIe x16 or a two-card SLI setup with both running at x8. That is all CrossFire offers even now, when nVIDIA has already updated SLI so that both PCIe slots run at a full x16. This is a drawback for CrossFire, since there is no way an x8 link can meet the full bandwidth requirements of a 256-bit 7800-class card, let alone a 512-bit one. Also, as far as I know, CrossFire-enabled MoBos are quite hard to find, but I could be wrong.
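For reference, the raw link numbers (another quick sketch; PCIe 1.x runs 2.5GT/s per lane, which after 8b/10b encoding works out to 250MB/s per lane per direction):

    # PCIe 1.x: 2.5 GT/s per lane, 8b/10b encoding -> 250 MB/s per lane, per direction.
    def pcie_bw_gb_s(lanes):
        return lanes * 0.25  # GB/s, one direction

    print(pcie_bw_gb_s(8))   # x8:  2.0 GB/s
    print(pcie_bw_gb_s(16))  # x16: 4.0 GB/s

So an x8 slot halves the already modest bus bandwidth to 2GB/s per direction, against the tens of GB/s these cards chew through on their local memory bus.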
So, in my opinion, the 7800GTX (single or SLI) is the better choice for today and tomorrow, despite the X1800 being a very impressive card. The future might favour the 7800 cards, with their higher pixel pipe count.
I wish ATi had given us the same card they developed for the XBOX 360, with its pixel and vertex pipelines interchangeable on demand (a unified shader design). Now THAT would have given nVIDIA nightmares.