The amount of bandwidth available to your cards will be less than optimal.
This can result in up to a 10% performance drop in SLI (compared to what you would get in x16/x16 mode) and around 5% with a single GPU.
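For reference, a rough back-of-the-envelope sketch of the theoretical numbers behind that claim, assuming PCIe 2.0 slots (which is what these GTX 400/500-era boards use); the 5 GT/s rate and 8b/10b encoding are the standard spec figures, the rest is just arithmetic:

```python
# Theoretical per-direction bandwidth of a PCIe 2.0 link at different widths.
# PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, i.e. 8 data bits
# for every 10 bits on the wire, which works out to ~500 MB/s per lane.
GT_PER_SEC_PER_LANE = 5.0   # gigatransfers/s, one bit per transfer
ENCODING_EFFICIENCY = 0.8   # 8b/10b line encoding

def pcie2_bandwidth_gb_s(lanes: int) -> float:
    """Approximate usable bandwidth in GB/s for a PCIe 2.0 link `lanes` wide."""
    return lanes * GT_PER_SEC_PER_LANE * ENCODING_EFFICIENCY / 8  # bits -> bytes

for lanes in (16, 8):
    print(f"x{lanes}: ~{pcie2_bandwidth_gb_s(lanes):.1f} GB/s per direction")

# x16 -> ~8.0 GB/s, x8 -> ~4.0 GB/s: the link is halved, but real-world FPS
# drops far less because games rarely saturate the bus.
```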
Negligible difference performance-wise; x8/x8 is enough for a two-GPU SLI/CFX setup.
@Cilus, @Vickybat and others: what is the difference between x16/x16 SLI/CFX and x16/x8 SLI/CFX?
The performance difference is there, but you can't really notice a 5-10% gap without recording the framerates.
It seems that at 2560x1600, even with 4X AA, there was absolutely no difference between x16/x16 and x8/x8. This is good news if you game at x8/x8 on a single display configuration at 2560x1600 and below. You simply are not missing anything, and moving up to x16/x8 or x16/x16 will yield no performance improvements or gameplay differences, even on the fastest GTX 480 SLI.
You can't really notice it without recording framerates.
Well, it cannot be noticed even with recording framerates...
Enthusiasts on lower budgets will find a general performance loss of 2% to 4% when switching from an x16 to an x8 slot. That's not altogether bad. Some of that difference will accumulate in SLI and CrossFire configurations, and we will address that topic in a later article.
The average isn't (max + min)/2; it's the sum of all recorded framerates divided by the number of samples, i.e. total frames rendered over total time.
So isn't there a performance loss if you sit at the minimum framerate for a longer period of time?
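To put numbers on that, here's a minimal sketch (the FPS samples are made up for illustration, not taken from any of the reviews quoted above) comparing (max+min)/2 with the real average for two runs that share the same max and min but spend different amounts of time at the minimum:

```python
# Two hypothetical recorded runs (per-second FPS samples) with the same max and
# min, differing only in how long they sit at the minimum. Made-up numbers.
brief_dip   = [60, 59, 58, 60, 30, 61, 60, 59, 60, 60]
long_at_min = [60, 30, 30, 30, 30, 30, 30, 30, 61, 60]

def naive_midpoint(samples):
    """The misleading (max + min) / 2 figure."""
    return (max(samples) + min(samples)) / 2

def true_average(samples):
    """Average FPS: sum of the samples over the number of samples."""
    return sum(samples) / len(samples)

for name, run in [("brief dip", brief_dip), ("long at minimum", long_at_min)]:
    print(f"{name:15s} (max+min)/2 = {naive_midpoint(run):.1f}, "
          f"average = {true_average(run):.1f}")

# Both runs give the same (max+min)/2 of 45.5 fps, but the run that spends
# longer at its minimum averages ~39 fps versus ~57 fps -- that drop is the
# performance loss being asked about.
```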
It's a 2-4% loss for the GTX 570/HD 6950; expect a higher loss with the GTX 580 due to higher bandwidth usage.
@megamind: do you run CF/SLI?
I do. The performance difference is there, but you can't really notice a 5-10% gap without recording the framerates.
Maybe the GTX 400 series doesn't show it, but it is there on the GTX 500 series. I was assuming that the similar architecture meant similar bandwidth usage; clearly that is not the case, going by the HardOCP article.