Info/Discussion : How to configure a graphics card dedicated to PhysX (an aux card alongside the baseline card)

Status
Not open for further replies.

sam_738844

Wise Old Owl
:oops: May sound stupid, but this thread needs information, as the title suggests. I have seen members on this forum with gaming rigs where a high-end GPU acts as the baseline (primary) card along with another GPU assigned only to PhysX. There are even scenarios where the baseline card is a top-notch AMD beast (say an HD 7970) and the other card is an Nvidia one, not so dazzling (say a GTX 260), but enough to serve the PhysX purpose. So the questions are:

1. How does the system know that the 2nd card is for PhysX while the first drives the main rendering?
2. What drivers, software, tweaks or tuning are needed for the above? The two drivers are obviously different, so how does it work?
3. Installation and hardware requirements. If it's similar to SLI/CrossFire, what are the motherboard and cooling requirements?
4. Is the above worth investing in?

Please enlighten me.
 

Myth

Cyborg Agent
What is your system config ?
What do you currently own and what do you plan to purchase ?

If the primary card is Nvidia, setup can be done in the Nvidia Control Panel.
If the primary GPU is AMD -> Hybrid PhysX mod v1.03 / v1.05ff
 

ico

Super Moderator
Staff member
4. Is the above worth investing in?
No.

How many games use "nVidia PhysX" anyway, and are they worth playing?

Both the Batman games, Borderlands 2 and Mafia II. I wouldn't spend on an additional nVidia card only to play these games with some extra effects.

If you're thinking that the second nVidia card will help you out in every game, then you are mistaken. Only 4-5 select games. That's it.

Buy a card which performs better in every game.
 
OP
sam_738844

Wise Old Owl
^^ Well, it's true that few games actually use it. The reason I had it in mind is that I found AMD, with its 12.11 driver, ripping games apart with the HD 7950 and above, while nVidia trails with the GTX 6xx series. But AMD is least bothered with PhysX, and nVidia has this feather in their cap... so why not use the best of both within a convenient budget, if possible? As PhysX won't be a "priority", as you highlighted, I thought about a cheaper card to serve the purpose. But if PhysX and its visible significance turn out to be very feeble in the future, I just might have to think twice.

I don't have a rig right now but am planning to buy one next year. There are so many useful threads out there, hence the comparative study :) Budget can vary from 50K to even 100K.

What is your system config ?
What do you currently own and what do you plan to purchase ?

If the primary card is Nvidia, setup can be done in the Nvidia Control Panel.
If the primary GPU is AMD -> Hybrid PhysX mod v1.03 / v1.05ff

Thanks for the info :)
 

anirbandd

Conversation Architect
No.

How many games use "nVidia PhysX" anyway, and are they worth playing?

Both the Batman games, Borderlands 2 and Mafia II. I wouldn't spend on an additional nVidia card only to play these games with some extra effects.

If you're thinking that the second nVidia card will help you out in every game, then you are mistaken. Only 4-5 select games. That's it.

+1

add to it the power consumption of the nvidia card even when it's not serving any purpose...
 

ico

Super Moderator
Staff member
^^ Well, it's true that few games actually use it. The reason I had it in mind is that I found AMD, with its 12.11 driver, ripping games apart with the HD 7950 and above, while nVidia trails with the GTX 6xx series. But AMD is least bothered with PhysX, and nVidia has this feather in their cap... so why not use the best of both within a convenient budget, if possible? As PhysX won't be a "priority", as you highlighted, I thought about a cheaper card to serve the purpose. But if PhysX and its visible significance turn out to be very feeble in the future, I just might have to think twice.
See, first of all.

Every game has some sort of physics engine. Almost every developer chooses to run it off the CPU, the reason being they want a neutral experience for everyone. It's not like AMD is not bothered; it's just that they don't believe in gimmicks and fragmentation. The same is the case with 99.5% of PC game developers.

In "nVidia PhysX's" case, some developers may have got into exclusive deals with nVidia to implement "nVidia only" effects. That's the case.
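As a rough illustration of the kind of work a CPU-bound physics engine does every frame, here is a minimal sketch in Python. This is purely illustrative (the names and numbers are made up here, not from any real engine); real engines like Havok, Bullet or PhysX also handle collision detection, constraint solving and much more:

```python
# Minimal sketch of per-frame work in a CPU-bound physics engine:
# integrate velocity, then position, for each body (semi-implicit Euler).

GRAVITY = -9.81  # m/s^2

def step(bodies, dt):
    """Advance every body by one timestep of length dt (seconds)."""
    for body in bodies:
        body["vy"] += GRAVITY * dt    # update velocity first...
        body["y"] += body["vy"] * dt  # ...then position (semi-implicit)
        if body["y"] < 0.0:           # crude ground collision
            body["y"] = 0.0
            body["vy"] = 0.0

# Example: a crate dropped from 10 m, simulated at 60 updates per second.
crate = {"y": 10.0, "vy": 0.0}
for _ in range(60):  # one second of simulated time
    step([crate], 1.0 / 60.0)
```

Running thousands of such updates per frame is easy for a modern CPU using SSE, which is part of why most developers keep their physics CPU-side and vendor-neutral.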

One more thing. Earlier, nVidia was claiming that their "PhysX" is faster than physics calculations running off the CPU. This was pure FUD. They used to run outdated x87 instructions on the CPU instead of SSE.

I'd advise you not to waste money on gimmicks. But if you want to, an AMD + nVidia card combo is better for this. If you get a single nVidia card and turn on "PhysX" for those select few games, performance drops to around 60% of what it would otherwise be.

Obviously, had the developers gone the neutral way of implementing these effects, the performance drop wouldn't have been this large. In any case, a typical customer is fooled by marketing and wastes his money, imho.
 
OP
sam_738844

Wise Old Owl
See, first of all.

Every game has some sort of physics engine. Almost every developer chooses to run it off the CPU, the reason being they want a neutral experience for everyone. It's not like AMD is not bothered; it's just that they don't believe in gimmicks and fragmentation. The same is the case with 99.5% of PC game developers.

In "nVidia PhysX's" case, some developers may have got into exclusive deals with nVidia to implement "nVidia only" effects. That's the case.

One more thing. Earlier, nVidia was claiming that their "PhysX" is faster than physics calculations running off the CPU. This was pure FUD. They used to run outdated x87 instructions on the CPU instead of SSE.

I'd advise you not to waste money on gimmicks. But if you want to, an AMD + nVidia card combo is better for this. If you get a single nVidia card and turn on "PhysX" for those select few games, performance drops to around 60% of what it would otherwise be.

Obviously, had the developers gone the neutral way of implementing these effects, the performance drop wouldn't have been this large. In any case, a typical customer is fooled by marketing and wastes his money, imho.


All confusion neutralized. Thanks a ton, ico :thumbs: :wow: superb explanation.

Also, another question based on the above: if I have a system with an i7 processor and one low/mid-range graphics card, say a notebook card like the GT 435M in my laptop, can I assume that my CPU will do the calculations better if I let the i7 handle the PhysX part instead of the GPU? There is an option in the nVidia Control Panel to select the PhysX processor.
 

Cilus

laborare est orare
Yes, Nvidia PhysX was highlighted as ushering in a new era of GPU computing, showing off the power of Nvidia's CUDA architecture. In reality, most in-game physics engines that run on the CPU, like Havok, Bullet and the in-house implementations in different games (the original Crysis used a Havok-based physics engine), can equal or better PhysX in quality. It is like making the game look bad deliberately without PhysX and then bringing it back to a normal level by using PhysX; it's not, as people think, that the game is already good and PhysX makes it better.
Since I have a dedicated PhysX card, I have observed different games with and without PhysX enabled and found that only a handful of them actually look visually enhanced with PhysX on. Mafia II and the Batman series are examples, whereas most games with a PhysX implementation just don't show any real improvement. Also, the physics effects created by PhysX at their best are not at all superior to other engines; the original Crysis, the Half-Life 2 series and BF3 offer superior physics quality with CPU-bound physics engines.
 
OP
sam_738844

Wise Old Owl
:yuck: Seriously now! I began to think what glitter can be put on falsehood, such as these manufacturers have done over the last many years via marketing! Now I think back to years past, when Ageia released a card named the "PhysX card", which made us wonder "wow... look at that card, it says it will handle the physics alone", and we cried "a stand-alone card just for physics! We can't afford that!" and thought our gaming would be screwed just because we could not install that card in our old-school mobos, and we didn't know all that stuff back then... now I see how unfailingly lame it was.
 

Cilus

laborare est orare
Ageia's design was a neutral physics processing unit that could work with both Nvidia and AMD cards. Also, the architecture of Ageia's design was not similar to Nvidia's CUDA design, and if it had been implemented properly, we would have seen CPU-bound physics engines running on it, resulting in better quality and faster processing. But Nvidia bought Ageia and changed the programming API in such a way that it works only with the CUDA architecture, irrespective of the capability of the underlying hardware.
 
OP
sam_738844

Wise Old Owl
^^ Learned a lot. Can you also tell me how AA in a game depends on the bus width mentioned in the graphics card specs? Recently I have read many discussions about the same.
 

vickybat

I am the night...I am...
^^ Read this buddy. It will clear all your doubts.

PhysX: x87 and SSE | PhysXInfo.com - PhysX News

The problem is that since PhysX uses x87, which is an older instruction set, the code isn't optimized to run properly on the CPU, because modern CPUs use SSE. The PhysX code itself has several sections which have to run on the CPU, but due to the presence of x87, they take a performance hit.

Nvidia has said that it will fully support SSE in future SDKs, and if that happens, implementation in games would increase. But it hasn't seen the light of day yet.
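To picture the x87-vs-SSE difference described above: x87 is a scalar FPU that processes one floating-point value per instruction, while SSE is SIMD and packs four single-precision floats into a 128-bit register, so one `addps` does four additions at once. Here is a conceptual model in Python (the function names are made up and this is only an analogy for the data flow, not real machine code; both functions return identical results, the difference is the per-"instruction" throughput):

```python
def x87_style_add(a, b):
    # scalar FPU: one addition per "instruction"
    return [a[i] + b[i] for i in range(len(a))]

def sse_style_add(a, b):
    # SIMD: four packed single-precision lanes per "instruction" (addps)
    out = []
    for i in range(0, len(a), 4):
        lanes_a, lanes_b = a[i:i + 4], b[i:i + 4]  # one 128-bit register each
        out.extend(x + y for x, y in zip(lanes_a, lanes_b))
    return out

positions = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
deltas = [0.5] * 8

# Same result either way; the SIMD version issues roughly 1/4 the "instructions".
assert x87_style_add(positions, deltas) == sse_style_add(positions, deltas)
```

This is why leaving PhysX's CPU path on x87 made the CPU look artificially slow compared to the GPU.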
 

ico

Super Moderator
Staff member
:yuck: Seriously now! I began to think what glitter can be put on falsehood, such as these manufacturers have done over the last many years via marketing! Now I think back to years past, when Ageia released a card named the "PhysX card", which made us wonder "wow... look at that card, it says it will handle the physics alone", and we cried "a stand-alone card just for physics! We can't afford that!" and thought our gaming would be screwed just because we could not install that card in our old-school mobos, and we didn't know all that stuff back then... now I see how unfailingly lame it was.
This sums it up well.

PhysX is dead. It was already dead to begin with. However, marketing was not.
 

Cilus

laborare est orare
Have a read here:
Bullet Physics – The Future of GPU-Accelerated Physics? | bit-tech.net
Popular Physics Engines comparison: PhysX, Havok and ODE | PhysXInfo.com - PhysX Articles

Currently, Bullet is popular in many senses. Some portions of it already have OpenCL acceleration, AMD is working with the team to implement a GPU-based Bullet engine, and the popular 3D benchmark 3DMark 11 already uses it.
 