
Extreme Gamer

I am Gundam!
Vendor
The amount of bandwidth available to your cards will be less than optimal.

This will result in up to a 10% performance drop in SLI (compared to what you would get in x16/x16 mode) and a 5% drop with a single GPU.

Today's best GPUs cannot fill x16 bandwidth.
They can't even fill an x8's bandwidth, but when the number of available lanes decreases, the card has to send more data through each lane, which makes the transfer more error-prone; correcting those errors increases processing time by a small amount.
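
For reference, the raw numbers can be sketched out like this (a back-of-the-envelope estimate, assuming PCIe 2.0 as used by these cards: 5 GT/s per lane with 8b/10b encoding, i.e. 500 MB/s per lane per direction):

    # Theoretical PCIe 2.0 bandwidth per direction for common link widths.
    # 5 GT/s per lane with 8b/10b line coding -> 4 Gbit/s = 500 MB/s per lane.
    transfers_per_sec = 5.0e9            # PCIe 2.0 signalling rate per lane
    encoding = 8 / 10                    # 8b/10b coding efficiency
    bytes_per_lane = transfers_per_sec * encoding / 8   # 500 MB/s

    for lanes in (16, 8, 4):
        print(f"x{lanes}: {lanes * bytes_per_lane / 1e9:.1f} GB/s per direction")
    # x16: 8.0 GB/s, x8: 4.0 GB/s, x4: 2.0 GB/s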
 

MegaMind

Human Spambot
The amount of bandwidth available to your cards will be less than optimal.

This will result in up to a 10% performance drop in SLI (compared to what you would get in x16/x16 mode) and a 5% drop with a single GPU.

Nope... at 2560x1600 or below, an x8/x8 SLI or CFX configuration will perform the same as an x16/x8 or x16/x16 configuration.
 

Extreme Gamer

I am Gundam!
Vendor
@megamind: do you run CF/SLI?

I do. The performance difference is there, but you can't really notice the 5-10% difference without recording the framerates.

Old cards like the 8 series, 9 series, and GT200 cannot even saturate x8. Cards like the GTX 560 Ti, 570, 580, 590, GTX 470, and 480, which do manage to saturate x8 lanes in heavy scenes, do show the difference.
 

MegaMind

Human Spambot
The performance difference is there, but you can't really notice the 5-10% difference without recording the framerates.
It seems that at 2560x1600, even with 4X AA, there was absolutely no difference between x16/x16 and x8/x8. This is good news if you game at x8/x8 on a single display configuration at 2560x1600 and below. You simply are not missing anything, and moving up to x16/x8 or x16/x16 will yield no performance improvements or gameplay differences, even on the fastest GTX 480 SLI.

SOURCE
 

Skud

Super Moderator
Staff member
Some more:

InsideHW - CrossFire x8/x8 or x16/x4: The Ongoing Dilemma

Again, at full HD res, it hardly matters.
 

topgear

Super Moderator
Staff member
Guys, I think you'd better take a look here ;-)

*media.bestofmicro.com/6/J/284059/original/image021.png

*media.bestofmicro.com/6/K/284060/original/image022.png

*media.bestofmicro.com/6/L/284061/original/image023.png

Pics courtesy of Tom's Hardware.

Source and The Article
 

Extreme Gamer

I am Gundam!
Vendor
Well, it cannot be noticed even when recording framerates... :???:
The average isn't decided by (max+min)/2, but by the total number of frames rendered divided by the total time.

So isn't there a performance loss if you sit at the minimum framerate for a longer period of time?
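
To put numbers on that (a minimal sketch with made-up one-second FPS samples, purely to illustrate the arithmetic):

    # Average FPS = total frames rendered / total time, so long stretches at the
    # minimum framerate pull the average down even when max and min are identical.
    fps_samples_brief_dip = [120, 120, 120, 30]  # hypothetical: one second at 30 fps
    fps_samples_long_dip = [120, 30, 30, 30]     # hypothetical: three seconds at 30 fps

    for name, samples in (("brief dip", fps_samples_brief_dip),
                          ("long dip", fps_samples_long_dip)):
        true_avg = sum(samples) / len(samples)   # frames rendered / seconds elapsed
        naive = (max(samples) + min(samples)) / 2
        print(f"{name}: average {true_avg:.1f} fps, (max+min)/2 says {naive:.1f} fps")

Both runs have the same max and min, but the true averages are 97.5 vs 52.5 FPS, while (max+min)/2 claims 75 FPS for both.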

It's a 2-4% loss for the 570/6950. Expect a higher loss with the 580 due to more bandwidth usage.

Enthusiasts on lower budgets will find a general performance loss of 2% to 4% when switching from an x16 to an x8 slot. That's not altogether bad. Some of that difference will accumulate in SLI and CrossFire configurations, and we will address that topic in a later article.

Maybe the GTX 400 series doesn't show it, but it is there in the GTX 500 series. I was assuming that the similar architecture meant similar bandwidth usage. Clearly that is not the case, according to the HardOCP article.

Skud, you linked us to an article using old cards. Nowhere have I said that the 5870 can saturate x8 lanes :?
 

Skud

Super Moderator
Staff member
As per the Tom's article link tg posted: the better card (570) takes less of a performance hit when moving from x16 to x8 to x4, particularly with increasing resolution. By the same logic, a 580 should take an even smaller hit.

And in the 2nd scenario, from whatever online articles I have seen, in a dual-card config it hardly matters whether it's x16/x16, x16/x8, x8/x8 or even x16/x4 - the FPS loss is there, but at 1080p resolution it's not significant enough to matter.
 

MegaMind

Human Spambot
The average isn't decided by (max+min)/2, but by the total number of frames rendered divided by the total time.

So isn't there a performance loss if you sit at the minimum framerate for a longer period of time?

It's a 2-4% loss for the 570/6950. Expect a higher loss with the 580 due to more bandwidth usage.

But the min. framerate is also the same at/below 2560x1600 (considering the SLI scenario):

*www.hardocp.com/images/articles/1282534990Cnhf3iYXfv_1_5.gif

*www.hardocp.com/images/articles/1282534990Cnhf3iYXfv_1_4.gif

*www.hardocp.com/images/articles/1282534990Cnhf3iYXfv_1_3.gif

The link given by topgear is about single-GPU scaling at different bandwidths... and the talk here is about SLI...

@megamind: do you run CF/SLI?

I do. The performance difference is there, but you can't really notice the 5-10% difference without recording the framerates.

@Extreme Gamer, can you provide any source/link which says x16/x16 vs x8/x8 has a considerable performance (framerate) difference?
 

Skud

Super Moderator
Staff member
I couldn't get one point: do people buy cards to play games or to record framerates? If there's no difference in gameplay experience, do we really need to be bothered about x16/x8/x4 or anything else for that matter?

As long as the gameplay is fluid at the highest settings, nothing else matters.
 

Extreme Gamer

I am Gundam!
Vendor
@Megamind: read my whole post:
Maybe the GTX 400 series doesn't show it, but it is there in the GTX 500 series. I was assuming that the similar architecture meant similar bandwidth usage. Clearly that is not the case, according to the HardOCP article.

I said that the performance decrease is there, but I never said it is considerable. 10% isn't very noticeable. 480s in SLI spit out a lot of FPS, so at the upper levels, beyond ~120 FPS, you can't really notice it.

There was a post by a guy named Discordia on the NVIDIA forums who runs 580s. I don't remember the topic name, so I have been unsuccessful in finding it so far :(
I assure you that I did search for it, but the post was made many months ago.

@Skud: I was referring to the %age performance drop. Obviously it will be faster than the 570.
 

Skud

Super Moderator
Staff member
@EG:

I was also referring to the %age performance drop. Check the pics that topgear posted: the 570 has a drop of 1-2% from x16 to x8, whereas the 6950 has a drop of 4-6%. So the faster card is obviously less affected by it. Guess the 580 would be hit even less.
 