I have said this 100 times: PhysX is nothing more than a gimmick by Nvidia to make you buy their cards. It just enables a few cosmetic effects in 4-5 games, with no effect on gameplay whatsoever.
Enabling PhysX on an Nvidia card also takes a heavy toll on your fps; expect a drop of around 50% unless you have a really high-end card like a GTX 570 or GTX 580.
Another point: game developers could very well have implemented those effects without using this proprietary technology from Nvidia.
Lastly, your friend has no idea what he is talking about. There are various other reasons for Nvidia fans to prefer an Nvidia card; sadly, PhysX is not one of them. As of now it is gimmicky, it hasn't taken off, and it never will. No game dev is ever going to implement it in the core of a game, because they want their game to run equally well on both AMD and Nvidia cards. But then you do find some devs who get into deals and implement a few cosmetic features/effects.
I can only name 4-5 games which utilize PhysX: Mafia II, Metro 2033, Mirror's Edge, Batman: Arkham Asylum. All it does in these games is enable some random debris and gunshot effects which have no effect on the *gameplay*; just good-looking cosmetic effects with a frame-rate toll that isn't worth it. The only upcoming game utilizing PhysX this year that I can name is Batman: Arkham City.
I seriously beg to differ. How many PhysX-enabled games have you actually played, buddy? The ones you've named above, did you really play them with PhysX on? If you had, I don't think you'd be saying all that. GPU-accelerated physics is the future rather than the older CPU-based methods: a GPU's massively parallel architecture can step thousands of independent simulation elements (debris particles, cloth points, fluid cells) at once, something a general-purpose CPU handles far more slowly.
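To make that concrete, here is a minimal sketch of the idea written as a CUDA kernel. It is illustrative only; the struct and function names are made up and this is not code from the PhysX SDK or from any game.

// Integrate N independent debris particles in parallel, one GPU thread per particle.
#include <cuda_runtime.h>

struct Particle { float3 pos; float3 vel; };   // illustrative type, not a PhysX type

__global__ void integrate(Particle* p, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    p[i].vel.y -= 9.81f * dt;          // gravity
    p[i].pos.x += p[i].vel.x * dt;     // explicit Euler position update
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;
}

// Host side: launch one thread per particle, e.g. for 65,536 particles:
// integrate<<<(n + 255) / 256, 256>>>(d_particles, n, 1.0f / 60.0f);

A CPU has to walk that loop over a handful of cores; the GPU launches one lightweight thread per particle, which is why effects with thousands of glass shards are practical there.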
Yes, PhysX is proprietary, and there's absolutely nothing wrong with that. Check the following screenshot:
*img219.imageshack.us/img219/2/220pxmafiaphysx.jpg
It's from Mafia II. The top shot was taken with APEX PhysX on, which handles particle, clothing, vegetation, destruction and turbulence effects. Compare that to the bottom one with the feature turned off: most of the glass debris has disappeared, which gives a quite unrealistic feel.
What you call simple cosmetic effects are actually shaping up to provide a realistic, real-world feel to the overall experience. In other words, it's part of the gameplay.
Imagine turning off all post-processing effects in a shader-heavy game and playing it. You will get 100+ fps and absolutely no change in gameplay. But tell me, was the game really developed to be played like that?
The answer is a big no. We want realism; it's not everybody's cup of tea, but people who can pay extra for those effects should get them.
You don't need 100+ fps in every game, so buy your graphics hardware accordingly. The game should be playable, and people should enjoy it the way it's meant to be played.
You say PhysX will never take off, but the way I see it, it's already taking off. PhysX 3.0 is on the horizon, and alongside the desktop platform it will be featured on tablets and smartphones, courtesy of Nvidia's Tegra 2-based SoCs and their derivatives.
You don't like PhysX, that's fine. But don't say the games look stupid, or that there's not much difference in the gameplay aspect, and all that. It was never made to change gameplay, but to change the look and feel of the game and pave the way for realism.
Open-source physics engines like Bullet, Open Dynamics Engine etc. are also great and will no doubt be embraced by the community.
Joker said:
CUDA and ATI Stream are now equally well supported by most applications, so this point is moot. Comparing the video transcoding performance of CUDA and ATI Stream in Anandtech's Sandy Bridge review, ATI Stream greatly outperformed CUDA in terms of video quality, although it was a bit slower, only a bit. Nvidia CUDA helps greatly if you are into Folding@Home (Google it), but those folks are soon going to release an updated client which makes good use of ATI Stream. But I guess this doesn't matter to you as you are not into folding. Adobe's and Autodesk's applications utilize both CUDA and Stream, so it's an even contest.
This part is not worth discussing from a gaming point of view. But CUDA is a more mature platform than Stream and has much more application support currently; the folding example you gave is one of them. Nvidia is actually one of the pioneers of GPGPU computing. Sure, Stream's support is growing in leaps and bounds, but it still has some catching up to do.
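For anyone wondering what the GPGPU work in those transcoding and folding apps boils down to: the GPU is handed huge batches of independent per-element maths. Here is a hedged, illustrative CUDA sketch (made-up names, not the code any of those applications actually use) of the kind of per-pixel step a video encoder offloads, namely converting RGB pixels to luma:

// One GPU thread per pixel; BT.601 weights for the luma (Y) component.
__global__ void rgb_to_luma(const uchar3* rgb, unsigned char* luma, int numPixels)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numPixels) return;
    uchar3 c = rgb[i];
    luma[i] = (unsigned char)(0.299f * c.x + 0.587f * c.y + 0.114f * c.z);
}

ATI Stream exposes a similar data-parallel model through its own toolchain; what differs between the two is mostly the maturity of the tools and applications built on top.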
Joker said:
Next comes multi-monitor setups. ATI Eyefinity is better than Nvidia Surround here: it supports more monitors per GPU and is more mature.
Next is "3D gaming", in which you use a 3D monitor along with 3D glasses. In Nvidia's case, you enable "3D Vision" in the driver, then buy an "Nvidia 3D Vision" certified monitor and the "Nvidia 3D Vision kit", which again costs 10k or ~$150. In AMD's "HD3D" case, you buy third-party software worth about $20 which applies the stereoscopic effect, and you can use any 3D monitor/television supporting HDMI 1.4 with any cheap 3D glasses. Your wish.
You've got to admit that AMD is way behind Nvidia currently as far as 3D gaming is concerned. AMD's current support is paltry and requires specific hardware which is not easily available. In case you haven't followed, Nvidia has announced cheap wired 3D glasses for the 3D Vision kit; that's going to cut costs down. Besides, you don't need an Nvidia-certified monitor for 3D Vision: any 3D-capable monitor with a 120 Hz refresh rate works fine with it. The same applies to all the 3DTVs available in the market; they are all compatible.
AFAIK, AMD HD3D needs specific monitors for 3D to work, and those are scarcely available. But Blu-ray 3D playback is supported on all 3DTVs using AMD's HD3D.
Nvidia is currently the only platform to offer Eyefinity-style multi-monitor gaming plus 3D, which they call 3D Surround; this is not possible with AMD yet, I guess. That said, HD3D has immense potential, and I expect it to be nearly equal to 3D Vision in times to come.
Joker said:
Summing everything up:
My choice here would be the HD 6950 2GB. Why? 2GB of VRAM will be better in the long run: it will let you play at higher resolutions with high anti-aliasing, compared to the 1GB of VRAM on the GTX 560 Ti.
And now if anyone asks these useless questions, just link them to this post of mine.
My summing-up is pretty similar to ICO's. Flip a coin and decide between the 6950 and the GTX 560 Ti. 2GB of VRAM is never useful at full HD, so that is not at all a USP of the 6950.
But it becomes useful when you use two of them in tandem in CrossFire and play at higher resolutions; more VRAM is required in those scenarios.
At full HD, the 560 Ti gives similar performance to a 6950 2GB. It is in a multi-GPU setup that the latter has an advantage, due to the larger frame buffer.
So if you're playing at resolutions like 2560x1600, or in an Eyefinity setup, or planning to CrossFire and play on multiple monitors, a 6950 2GB will be worthwhile over a GTX 560 Ti (see the rough numbers below).
Otherwise, it's again a toss of a coin.
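Some rough back-of-the-envelope numbers on the render targets alone (textures come on top of this, and I'm ignoring compression and driver overhead), just to show why the frame buffer matters at those resolutions:

1920x1080 x 4 bytes (32-bit colour) is roughly 8 MB per buffer; even with 4x MSAA colour plus depth you are only in the tens of MB, which is why 1GB is fine at full HD.
2560x1600 x 4 bytes is roughly 16 MB; with 8x MSAA colour plus depth that single target alone is already around 250-300 MB, and an Eyefinity surface like 5760x1080 is bigger still.

That is the scenario where the extra 1GB on the 6950 2GB starts paying off.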
@vickybat: Dude, look at the costs also naa... I have to buy each and every game for the PS3, yet the same game costs about 1/4 as much on PC. Jailbreaking the PS3 is also an option, but it has several cons, like bricking it...
I think I'll go with the GFX card... it's value for money. I'll save money for a PS3... or maybe a PS4... who knows???
Buddy, it's totally up to you. If you are an avid gamer, then the PS3 exclusives I mentioned are worth playing.
Sorry, we can't discuss anything regarding piracy or hacking the PS3 to play backups on this forum; it's against the rules.
It's totally up to you and your usage.