Convert an XFX 8600GT to an XFX 8600GTS


wizrulz

GUNNING DOWN TEAMS
assasin said:
^^^ The 8600GT is a good card. You can play games at medium to high settings depending on the game. Your processor will be a bottleneck; overclock it and that should solve the problem to some extent.

Is there any other solution to overcome the processor bottleneck besides overclocking the processor?
I can't overclock mine, as my motherboard doesn't support it :(
 

p_d5010

Journeyman
Hey, would you suggest buying an 8600GT now, or waiting another month to see if a better mainstream card comes out by then? And what should I expect from the card if I run Lost Planet: Extreme Condition (DX10) at high details and 1280 resolution?
 
OP
assasin

Banned
Choto Cheeta said:
What's the reliability?

As we've seen with overclocking a processor like a C2D or your X2, it's quite reliable and you won't burn it :lol:

So what's up with this one? Is it reliable?

And nice post, yaar (what's your real name, by the way?). Will try it for sure, on the 8600GT that a friend of mine is getting this week :p

As far as reliability is concerned, I've never faced any problems in games: no graphical artifacts, no lag.
Idle temps are 55-57°C and full-load temps are 61-68°C on stock cooling, so I don't think there's any chance of burning the GPU, since the 8600GTS uses the same GPU, just at higher clock speeds.

*img2.freeimagehosting.net/uploads/f0dfbd821e.jpg

My real name is Shuvadeep.

@wizrulz: to overcome the bottleneck, either change the processor or get a motherboard with overclocking capabilities. Otherwise, buy the card now, play, and upgrade the motherboard or processor later.

@p_d5010: even if you wait another month, you won't get a better DX10 card in that price range. The G92 will be released sometime in Q4, so prices of the 8800GTS 320MB won't fall by much, and it's better to buy an 8600GT. You'll be able to play Lost Planet at 1280, because I play at 1440.
 

p_d5010

Journeyman
assasin said:
@p_d5010: even if you wait another month, you won't get a better DX10 card in that price range. The G92 will be released sometime in Q4, so prices of the 8800GTS 320MB won't fall by much, and it's better to buy an 8600GT. You'll be able to play Lost Planet at 1280, because I play at 1440.


Hey, can you please tell me what detail settings you play Lost Planet at? And you use an 8600GT, right? Also, could you post a screenshot of the game with the FPS counter on, at those settings in 1280 resolution? I'm asking because I've seen many reviews saying the 8600GT is a very bad card, even compared to the 7600GT!
 

Harvik780

ToTheBeatOfUrHeart
If you mod an 8800GTX into an 8800 Ultra, the temperatures will soar to unwanted levels. I think the same is the case with the 8600GT-to-GTS mod, but it's riskier because the GT version of the 8600 misses out on the power connector front. :)
 

freshseasons

King of my own Castle
p_d5010 said:
Hey, can you please tell me what detail settings you play Lost Planet at? And you use an 8600GT, right? Also, could you post a screenshot of the game with the FPS counter on, at those settings in 1280 resolution? I'm asking because I've seen many reviews saying the 8600GT is a very bad card, even compared to the 7600GT!

For value for money there is no better card than the 8600GT series!
Go ahead, it's really good, and if you can overclock it there's nothing like it: up to a 30% increase with stock cooling.

Harvik780 said:
If you mod an 8800GTX into an 8800 Ultra, the temperatures will soar to unwanted levels. I think the same is the case with the 8600GT-to-GTS mod, but it's riskier because the GT version of the 8600 misses out on the power connector front.

The 8600GT doesn't have a 6-pin or any other power connector; it draws everything from the PCIe slot.
So overclocking takes a hit because of that. If you want an insanely overclocked card, this is not the one, and higher volt mods are really out of the question. But you're right, this is the safest card to clock.
Still, it's a great deal for an 8600GT. Later, when you're done, we can even run SLI with it.
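
To put some rough numbers on that slot-power point, here is a back-of-the-envelope sketch. It assumes the commonly quoted figures of roughly 75 W available from a PCIe x16 slot and roughly 43 W board power for a stock 8600GT, and a crude model where dynamic power scales about linearly with core clock at fixed voltage; every number here is an assumption for illustration, not a measurement from this card.

```python
# Back-of-the-envelope power headroom estimate for an 8600GT run at GTS core clocks.
# All figures below are commonly quoted values, treated here as assumptions.

SLOT_BUDGET_W = 75.0   # rough power a PCIe x16 slot is specified to deliver (assumed)
GT_TDP_W = 43.0        # rough stock 8600GT board power (assumed)
GT_CORE_MHZ = 540.0    # commonly quoted stock 8600GT core clock (assumed)
GTS_CORE_MHZ = 675.0   # commonly quoted reference 8600GTS core clock (assumed)

# Crude model: dynamic power scales ~linearly with frequency at constant voltage.
scaled_power_w = GT_TDP_W * (GTS_CORE_MHZ / GT_CORE_MHZ)
headroom_w = SLOT_BUDGET_W - scaled_power_w

print(f"Estimated draw at GTS core clock: {scaled_power_w:.1f} W")
print(f"Headroom left in the ~75 W slot budget: {headroom_w:.1f} W")
```

Even with this pessimistic scaling the card stays well inside the slot budget, which is consistent with a clock bump at stock voltage being tolerable; a volt mod is a different story, which is exactly why it's out of the question on a connector-less card.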
 
OP
assasin

Banned
Harvik780 said:
If you mod an 8800GTX into an 8800 Ultra, the temperatures will soar to unwanted levels. I think the same is the case with the 8600GT-to-GTS mod, but it's riskier because the GT version of the 8600 misses out on the power connector front. :)

I can't say about the 8800GTX and Ultra, but I can say that overclocking an 8600GT is not risky, because temps only increase by 2-5°C, which I think is acceptable.
So far I haven't faced any problems due to the PCIe power connector not being there.
 

Pathik

Google Bot
Hey, this is very similar to overclocking... @assasin, did you try running any torture tests? Are there any frequent crashes or anything?
 

p_d5010

Journeyman
Hey, has anyone except assasin tried overclocking using this method?

And assasin, can you please tell me your XFX product code?

And please, please post a screenshot of the Lost Planet FPS with the details set to high at 1280 resolution. Please do this, as it would help me decide whether to buy this card.
Thanks in advance.

What do you suggest between the ATI HD 2600XT and the NVIDIA 8600GT? The HD 2600XT has a higher clock speed...
 

Who

Guess Who's Back
I tried this method, and like I said it gave me 7 more FPS in Oblivion. Also, the 8600GT is the better card: even though the HD 2600XT has a faster clock, the 8600GT beats it in every way.
 

Rollercoaster

-The BlacKCoaT Operative-
Don't you obviously need a much bigger heatsink/fan assembly? If you turn the clocks up to match the GTS, your card will either automatically turn down the clocks as it gets hot, or burn out. And judging by the FPS increase mentioned, I would guess your card is downclocking automatically, because a real GTS would give a much bigger FPS difference and generate much more heat. Check out performance reviews on sites like tomshardware.com, see the percentage difference, and compare it with yours.

The best you can do with the stock heatsink is to push the clocks as high as the 8600GT XXX version, since it comes with the same heatsink. Which is cool too.

Going all the way to GTS clocks is obviously too high a risk on the GT heatsink, and I am not sure the architecture is the same on the GT and GTS cards.

The clocks you should use are exactly the ones from the "XFX GeForce 8600GT 256MB DDR3 DUAL DVI XXX":
Clock rate: 620 MHz
Memory clock: 1.6 GHz
Shader clock: 1355 MHz
Voltage: you will have to find out.
This assumes you have an "XFX GeForce 8600GT 256MB DDR3 DUAL DVI" and that the architecture is exactly the same in both versions.

Note that I am only making educated guesses.
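
For a quick sanity check on how big each jump actually is, here is a small sketch comparing commonly quoted 8600GT reference clocks against the XXX clocks listed above and the 8600GTS reference clocks. The stock GT and GTS figures are from memory and should be treated as assumptions.

```python
# Percentage overclock needed to reach the XXX and GTS targets from a stock 8600GT.
# The stock GT and GTS reference clocks are commonly quoted figures (assumptions);
# the XXX clocks are the ones listed above.

stock = {"core": 540, "shader": 1190, "memory_effective": 1400}  # 8600GT reference (assumed)
xxx   = {"core": 620, "shader": 1355, "memory_effective": 1600}  # XFX 8600GT XXX (listed above)
gts   = {"core": 675, "shader": 1450, "memory_effective": 2000}  # 8600GTS reference (assumed)

def overclock_pct(target: dict, base: dict) -> dict:
    """Percentage increase of each clock in `target` over `base`."""
    return {k: 100.0 * (target[k] - base[k]) / base[k] for k in base}

for name, target in (("XXX", xxx), ("GTS", gts)):
    pct = overclock_pct(target, stock)
    print(name + ": " + ", ".join(f"{k} +{v:.0f}%" for k, v in pct.items()))
```

If those stock figures are right, the XXX profile is roughly a 14-15% bump across the board, while true GTS speeds need about 25% more core clock and over 40% more memory clock, which is why the XXX clocks are the safer target on the stock heatsink.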
 

p_d5010

Journeyman
So should I overclock the card through the BIOS, or use RivaTuner to overclock it? I think that would be the safest method, but I can't imagine what performance boost it can give...
 

Who

Guess Who's Back
You can't do a shader overclock or a voltage modification in RivaTuner, so you have to use the BIOS for those, and both are worth it. @Roller: when I'm playing heavy games (Oblivion, Company of Heroes) my card's temperature stays around 63°C, which is safe.
 

Rollercoaster

-The BlacKCoaT Operative-
I would advise monitoring the core clock as well as the temperature to check stability in this specific case, if possible.
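
One way to do that monitoring is to log the core clock and temperature together and flag any sample where the clock drops below the target, since that usually means thermal throttling. The sketch below is only an illustration: it uses nvidia-smi query fields that exist on modern drivers, whereas cards of this generation would need something like RivaTuner's hardware monitoring log instead, and the 620 MHz target is just the XXX core clock discussed above.

```python
# Illustrative clock/temperature watcher: prints each sample and flags any
# reading where the core clock falls below the target, a hint of throttling.
# Assumes a driver/GPU new enough to expose these nvidia-smi query fields;
# on hardware of this era you would log the same data with RivaTuner instead.
import subprocess
import time

TARGET_CORE_MHZ = 620   # target core clock, taken from the XXX profile above
INTERVAL_S = 2          # seconds between samples

def sample() -> tuple[int, int]:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=clocks.gr,temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    core_mhz, temp_c = (int(v) for v in out.strip().split(", "))
    return core_mhz, temp_c

if __name__ == "__main__":
    while True:
        core, temp = sample()
        note = "  <-- below target, possible throttling" if core < TARGET_CORE_MHZ else ""
        print(f"core {core} MHz, {temp} C{note}")
        time.sleep(INTERVAL_S)
```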
 

Rollercoaster

-The BlacKCoaT Operative-
p_d5010 said:
So should I overclock the card through the BIOS, or use RivaTuner to overclock it? I think that would be the safest method, but I can't imagine what performance boost it can give...

Theoretically, if you OC through software you can't change the voltages, so you can't reach the same level of OC as you can by modding the BIOS.
 

p_d5010

Journeyman
OK... but according to smit and assasin, it safely OCs to 8600GTS speeds and stays within the temperature range. But just in case the card can't handle the OC'd speeds, is there any chance of going back to the original BIOS before the card gets damaged? Or would it just stop responding if it can't handle the overclock?
 

Rollercoaster

-The BlacKCoaT Operative-
I have no idea what happens if the OC is in the BIOS and it makes the card unstable. A software OC is applied after the system boots into Windows, so if there is a problem you can go into Safe Mode or something; but with a bad BIOS OC the system might not be able to boot at all. For that exact purpose motherboards have a reset jumper, but a graphics card doesn't.

Maybe others can make more heads or tails of this.

Hey, just noticed: a Google search for "xfx 8600gts" gets this thread on the first page.
 