Hardware price list/spec sheet

MegaMind

Human Spambot
I said that the performance decrease is there, but I never said it is considerable. 10% isn't very noticeable. 480s in SLI put out a lot of FPS, so at the upper levels, beyond ~120 FPS, you can't really notice it.

Still a 10% diff.?? I don't think so.. maybe a 0.1-1% diff.. (at/below 2560x1600)

I was also referring to the percentage performance drop only. Check the pics that topgear posted: the 570 has a drop of 1-2% from x16 to x8, whereas the 6950 has a drop of 4-6%. So the faster card is obviously less affected by it. I guess the 580 would be hit even less.

Buddy, if I'm right, almost all mobos today with a single GPU installed operate at x16 bandwidth...

Maybe the GTX 400 series doesn't show it, but it is there in the GTX 500 series. I was assuming that the similar architecture allowed similar bandwidth. Clearly that is not the case, according to the HardOCP article.

Any proof?
 

Cilus

laborare est orare
Guys, initially I was running CrossFire in X8-X8 config on my MSI 890GXM-G65, and yesterday, after assembling my rig with the Sabertooth, I am now running CrossFire in X16-X16 mode. I have tested just one game so far, BFBC2, and there is no performance difference at 1080p resolution. I have also run the Unigine benchmark and compared it with the previous value....the performance difference was around 1.7%.
Normally in Full HD there is a maximum of 2 to 3% performance drop between X16-X16 and X8-X8 mode. Even with the X16-X4 setting, the performance drop is less than 7% in 99% of cases in Full HD. Moving upwards can hit performance a little more, but X8-X8 is still more than enough, compared to the performance advantage of X16-X16.
I have an X4 slot...will test and post the result once I get time.
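The 1.7% figure above is just the relative difference between the two benchmark runs. A minimal sketch of that calculation (the scores below are hypothetical, chosen only to reproduce a ~1.7% gap; the formula is the point):

```python
def pct_drop(x16_score: float, x8_score: float) -> float:
    """Percentage drop going from the x16-x16 score to the x8-x8 score."""
    return (x16_score - x8_score) / x16_score * 100

# Hypothetical Unigine scores producing a ~1.7% difference like the one reported:
print(round(pct_drop(1000.0, 983.0), 1))  # 1.7
```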
 

Skud

Super Moderator
Staff member
Buddy, if I'm right, almost all mobos today with a single GPU installed operate at x16 bandwidth...


You are right for sure. :) The point is, if someone actually plugs the card into an x8 slot (for whatever reason, ignorance etc.), he is not going to miss anything. Which is actually good to know.
 

Extreme Gamer

僕はガンダム!
Vendor
@megamind: I said I was "assuming", and admitted that the HardOCP article showed otherwise (so I was wrong).
But the GTX 580 is a different story.
Also, 6870 CF won't show it because it cannot saturate an x8 bus.

And yes, with a single GPU all current mobos run PCIe 2.0 x16.
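For context on why a single card rarely saturates x8 (figures are the editor's, not from the thread): PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, which works out to roughly 500 MB/s per lane in each direction. A quick sketch of the x16 vs x8 numbers:

```python
# Rough PCIe 2.0 bandwidth figures (illustration only).
# 5 GT/s per lane, 8b/10b encoding -> ~4 Gbit/s usable -> ~500 MB/s per lane, per direction.
PCIE2_MB_PER_LANE = 500

def pcie2_bandwidth_gb(lanes: int) -> float:
    """Approximate one-direction bandwidth in GB/s for a PCIe 2.0 link."""
    return lanes * PCIE2_MB_PER_LANE / 1000

print(pcie2_bandwidth_gb(16))  # x16 -> 8.0 GB/s
print(pcie2_bandwidth_gb(8))   # x8  -> 4.0 GB/s
```

So an x8 slot still offers about 4 GB/s each way, which is why the drop only shows up on cards fast enough to push past that.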
 

MegaMind

Human Spambot
@megamind: I said I was "assuming", and admitted that the HardOCP article showed otherwise (so I was wrong).
But the GTX 580 is a different story.

The 580 might be an exception.... just wanna make sure what I understood was right :toast:

You are right for sure. :) The point is, if someone actually plugs the card into an x8 slot (for whatever reason, ignorance etc.), he is not going to miss anything. Which is actually good to know.

Yep, good to know.. :)
 

Extreme Gamer

僕はガンダム!
Vendor
:toast:

The performance difference is seen on cards that can saturate an x8 bus. :thumbup:

It's like how people said you can't use more than 1.5GB, and I kept saying that I use 1.6GB in GTA4 and 1.8GB in Crysis 2 :D
 

Extreme Gamer

僕はガンダム!
Vendor
Yeah. I set norestrictions and nomemrestrict.

I have so much volatile memory (12GB RAM + 3GB VRAM (x2, but SLI mirrors the buffer)).
 

Cilus

laborare est orare
Maybe the HD6870 can't saturate the x8 bus, but in the HardOCP test and the other tests, as far as I can remember, a GTX 480 is used. Are you sure that even a GTX 480 can't saturate the x8 bus?
 

vickybat

I am the night...I am...
Yeah. I set norestrictions and nomemrestrict.

I have so much volatile memory (12GB RAM + 3GB VRAM (x2, but SLI mirrors the buffer)).

Could you please elaborate on how and where you set the values? How did you monitor the VRAM usage?

I don't think the GPU needs more VRAM to render at 1680x1080 resolution. A 580 1.5GB SLI setup will be absolutely equal to a 580 3GB SLI setup at that resolution, even when running in x8-x8 mode.
 

Extreme Gamer

僕はガンダム!
Vendor
Performance, maybe, but memory usage differs in my case. I have stated previously that for most games I set monstrous AA levels.

I used MSI Afterburner to check.

You have to create a shortcut for launchGTAIV.exe (or launchEFLC.exe) and set the command line parameters in its target properties.
You can find the whole list in the readme.
And Cilus, I know what you meant.
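To illustrate the shortcut approach described above (the switch names are the ones mentioned in this thread; the install path is a hypothetical example), the shortcut's Target field would look something like:

```
"C:\Games\GTAIV\launchGTAIV.exe" -norestrictions -nomemrestrict
```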
 

SFC10

Broken In
I have a budget of 20k.
I am upgrading my desktop; I already have an LCD.

Please suggest a config for basic gaming:

CPU:
MOTHERBOARD:
CABINET:
PSU:
RAM:
HDD:

Optional Graphics Card Under 5k:
 

topgear

Super Moderator
Staff member
OK guys, the 3 images I posted earlier were for single-GPU scaling, to give some proper idea. Now here's the real deal for CF and SLI ;-)

CF :

*media.bestofmicro.com/G/8/287000/original/image045.png

*media.bestofmicro.com/G/9/287001/original/image046.png

*media.bestofmicro.com/G/A/287002/original/image047.png

and SLI :

*media.bestofmicro.com/F/L/286977/original/image021.png

*media.bestofmicro.com/F/M/286978/original/image022.png

*media.bestofmicro.com/F/N/286979/original/image023.png

all pics are courtesy of Tom's HW and here's the source ;-)
P67, X58, And NF200: The Best Platform For CrossFire And SLI : Force Versus Finesse

I think this will clear up the confusion for users who are planning a multi-GPU setup and help them decide what to buy ;-)

I have a budget of 20k.
I am upgrading my desktop; I already have an LCD.

Please suggest a config for basic gaming:

CPU:
MOTHERBOARD:
CABINET:
PSU:
RAM:
HDD:

Optional Graphics Card Under 5k:

Create a separate thread and fill this up ;-)

For a ~5k gfx card, get the MSI HD6670 at around 5.5k.
 