A New Graphic Card for my Gear


rchi84

In the zone
@Dark knight Dude, I will give you the honest answer. Today, AMD and Nvidia are well matched in most price segments, so choosing any card from either of them will not hurt you at all. It comes down to which company you prefer, really.

On the lower mid-range, you have the Radeon 6870 and GeForce 560. In the mid-range, you have cards like the 560 Ti, 6950 and their OCed counterparts. Then in the upper mid-range, you have the GeForce 570 and Radeon 6970. Beyond that, there is only one true high-end card, the GeForce 580, which is way beyond budget and practicality. Then we get into the crazy range with the 6990 and 590s.

As for the other issues mentioned, 3D is a gimmick which gives you a headache after 30 minutes (sometimes less) and also requires an expensive monitor/TV. Unless they find a way to make it more comfortable, it is pointless to build a 3D gaming rig for a serious gamer. Who wants to spend 70-80K on a gaming setup and only play for 1 hour at a time?

As for PhysX, there are a few AAA titles like Mirror's Edge, Batman AA, Mafia 2, Metro 2033 and Alice: Madness Returns which use the effects well. The rest of the titles, imho, don't make effective use of PhysX and it doesn't add much to the experience.

But developers know PhysX usage is not widespread, so the PhysX effects are not included as main gameplay elements. It's more like the salad dressing, not the vegetables.

There are people who like PhysX and those who don't. For me, if Battlefield Bad Company 2 and BF3 can have destructible environments on an awesome scale, all through CPU physics, I don't see the point of GPU PhysX beyond some debris, dust and cloth effects.

Driver issues are nonsense for the 99% of users who don't face any trouble on either side. If you are not into CrossFire, SLI or multi-monitor setups, then both companies' drivers are rock solid.

AMD drivers used to be below average on Linux in the past, but I can tell you from experience that both Nvidia and AMD make great drivers for Linux now. Of course, there's no point in using Linux for gaming, which makes driver comparisons pointless on Linux at least.

So choose any card which sticks to your budget and rest assured that you will have a decent gaming experience for a long while. There are people here gaming comfortably on medium-high details at 720p with a 4-year-old system.

Looking forward to the pics in the new purchase thread :)

flamers, fire away :)
 

varunb

Working in an IT company
Re: A New Graphic Card for my Gear

Duhhhh... for those who still don't know: if you don't have an Nvidia card, it doesn't mean you won't see the PhysX effects (and you won't regret it even if you don't install the PhysX software). Lack of an Nvidia card means that the game will force the CPU to do the PhysX processing. So people can chill & buy AMD cards in peace.

As for the OP, you can go & buy any 6950 2GB card that is easily available near your area or which you won't have trouble ordering online. It's not like you will get sky-high performance out of a particular brand's 6950 card. It all depends on the user's config. So MSI Twin Frozr III, Sapphire, etc.: any will do fine.

If you still want to reassure yourself, there are various benchmarks/reviews scattered across the internet which can be easily found.
 

mithun_mrg

Cyborg Agent
Re: A New Graphic Card for my Gear

Duhhhh... for those who still don't know: if you don't have an Nvidia card, it doesn't mean you won't see the PhysX effects (and you won't regret it even if you don't install the PhysX software). Lack of an Nvidia card means that the game will force the CPU to do the PhysX processing. So people can chill & buy AMD cards in peace.


Read this article:
NVIDIA Sheds Light On Lack Of PhysX CPU Optimizations - HotHardware
 

varunb

Working in an IT company
I am fully aware of that article, but what exactly are you trying to suggest to the OP with that? That article, in short, clearly indicated to me that PhysX is just a gimmick that Nvidia is using to fool customers into buying their products. Nothing else. Also, with the exception of Mafia 2, every game featuring PhysX can be played smoothly without an Nvidia card.

Anyway guys, let's not take this topic somewhere else. Rchi84 has pretty much summed up the main points. The OP asked about the price & the card he can get for that price.
 

mithun_mrg

Cyborg Agent
I am fully aware of that article, but what exactly are you trying to suggest to the OP with that? That article, in short, clearly indicated to me that PhysX is just a gimmick that Nvidia is using to fool customers into buying their products. Nothing else. Also, with the exception of Mafia 2, every game featuring PhysX can be played smoothly without an Nvidia card.

Anyway guys, let's not take this topic somewhere else. Rchi84 has pretty much summed up the main points. The OP asked about the price & the card he can get for that price.

Again, I am saying PhysX is not a gimmick, it's a technology, & you read the article wrong: it meant that the PhysX code is optimised to run smoothly on Nvidia GPUs only.

Graphics is all about eye candy & PhysX just adds to that without any extra cost. Try playing Metro 2033, Dark Void, Cryostasis, Darkest of Days or Mafia 2 with PhysX on & then off and you will notice the difference. And customers nowadays are not fools who buy a GPU just taking PhysX into consideration.

Also, the OP has already made his choice, so there is no point suggesting anything to him now. This thread can cool off for now.
 

varunb

Working in an IT company
Again, I am saying PhysX is not a gimmick, it's a technology, & you read the article wrong: it meant that the PhysX code is optimised to run smoothly on Nvidia GPUs only.

Graphics is all about eye candy & PhysX just adds to that without any extra cost. Try playing Metro 2033, Dark Void, Cryostasis, Darkest of Days or Mafia 2 with PhysX on & then off and you will notice the difference. And customers nowadays are not fools who buy a GPU just taking PhysX into consideration.

Also, the OP has already made his choice, so there is no point suggesting anything to him now. This thread can cool off for now.

It's a gimmickkkkkkkkkkk alright. It appears you misunderstood me & you are not looking beyond the PhysX effects. Sure, the games look good with PhysX on. I am not denying it. By gimmick, I meant the trick of attracting customers by saying "Oh, look at the graphics, particle effects, etc. we have got to offer... blah blah blah..." I suggest you do some research on why many of the devs have not yet adopted it, and why Nvidia never meant the code to run as efficiently on the CPU, as many websites have stated.

Nvidia is competing with DX11 DirectCompute & OpenCL by pushing CUDA & PhysX to draw customers, saying "only we can give you hardware PhysX, but you gotta own our card". If there is no Nvidia card installed, then no hardware PhysX for the gamer. NV is being an ass about this by not sharing PhysX or making it open source. It is always about the competition, buddy, never about the graphics. Do not forget this fact. If it's not a gimmick, or should I say competition, then why are they not removing the GPU check for the PPU?

The games you mentioned can be played with an ATI card with PhysX on. The only exception, which I mentioned earlier, is Mafia 2. If the OP wants to get an Nvidia card for PhysX, then I won't discourage him at all. Anyway, the OP has already decided, so I am tapping out now.
 

Skud

Super Moderator
Staff member
PhysX is a gimmick because you cannot simply use it in each and every game, unlike MLAA/FXAA/Eyefinity etc., even on an nVIDIA card. You need to code specifically for PhysX, and only a handful of developers actually take the pains to implement it, because at the end of the day, that means catering to only a handful of customers.

That's not to say PhysX is bad, far from it, but nVIDIA's monopoly has almost killed it.
 

mithun_mrg

Cyborg Agent
Why blame Nvidia? Don't you think that even if ATI had acquired Ageia, they would have done the same thing? Moreover, thank Nvidia that they provided it for free and did not charge anything extra for hardware or software. Lastly, I also feel that if they want PhysX to succeed as a technology, they should, or rather have to, make it platform independent.

Look at the current situation: the difference in price & performance ratio between the two manufacturers is minimal, but those value-added features count for the end users.
 

Skud

Super Moderator
Staff member
I would have thanked nVIDIA if PhysX worked when I paired an nVIDIA card with an AMD (main) card without any hacks etc. That PhysX doesn't work in this kind of setup is good enough reason to blame nVIDIA.
 

Cilus

laborare est orare
AMD always emphasizes technologies based on open architectures, not proprietary designs which only support specific hardware. That's why their APP or Accelerated Parallel Processing technology, the competitor to Nvidia's CUDA, is based on the DirectCompute architecture, which can be used with any video card with DirectCompute support, irrespective of the card manufacturer. I have plenty of experience with it.

Consider the example of the CoreAVC video decoder, the fastest H.264/AVC decoder available. AMD has provided support for DXVA GPU acceleration in it, and it can be used with Nvidia cards too. On the other hand, enabling CUDA in CoreAVC requires an Nvidia card only.
The approach Nvidia is taking basically slows down the development of advanced software because of its proprietary nature. On the other hand, any open software design grows far faster due to the number of developers who can work on it, the easily available and highly customizable software libraries to work with, and the large amount of supported hardware.

Yes, I cannot deny that games with a highly optimized PhysX design look better, but it does very little to improve the gameplay experience. It cannot be considered a deciding factor while purchasing a card; one should look at the FPS counter, AF and AA performance etc. while making a graphics card decision. I have a dedicated PhysX card (enabled through the PhysX mod to work with AMD cards) and believe me, only a few games look better with PhysX enabled.
Homefront, Bulletstorm, Batman: Arkham Asylum and Metro 2033 show very little effect with PhysX switched on. The only game which looks significantly better is Mafia II. The other thing is that PhysX can run on the CPU very smoothly if the code is optimized to use the CPU's SSE (1, 2, 3 or 4) instruction sets, but Nvidia has used unoptimized x87 code for CPU PhysX execution to deliberately hold back CPU PhysX performance and glorify their cards. It is not only my word; all the major review sites like Guru3D, Tom's Hardware and Anandtech have said the same over and over.
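To give a rough idea of the difference (a toy sketch of my own, not actual PhysX code): the same velocity update written once as plain scalar code, which a compiler without SSE enabled lowers to x87-style one-value-at-a-time math, and once with SSE intrinsics that process four floats per instruction.

```cpp
// Toy illustration only: scalar vs. SSE versions of the same update.
#include <xmmintrin.h>   // SSE intrinsics
#include <cstdio>

void update_scalar(float* vy, int n, float dt) {
    for (int i = 0; i < n; ++i)
        vy[i] += -9.8f * dt;                      // one float at a time
}

void update_sse(float* vy, int n, float dt) {
    __m128 dv = _mm_set1_ps(-9.8f * dt);          // same delta in all four lanes
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 v = _mm_loadu_ps(vy + i);          // load four velocities at once
        _mm_storeu_ps(vy + i, _mm_add_ps(v, dv)); // update and store all four
    }
    for (; i < n; ++i)                            // scalar tail for leftovers
        vy[i] += -9.8f * dt;
}

int main() {
    float a[8] = {0}, b[8] = {0};
    update_scalar(a, 8, 0.016f);
    update_sse(b, 8, 0.016f);
    printf("%f %f\n", a[0], b[0]);                // both print the same value
    return 0;
}
```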

AMD is currently helping the advancement of the Havok and Bullet physics engines for GPU acceleration. Again, they are focusing on an open architecture so that those engines can be accelerated by any gfx card.
 

vickybat

I am the night...I am...
I would have thanked nVIDIA if PhysX worked when I paired an nVIDIA card with an AMD (main) card without any hacks etc. That PhysX doesn't work in this kind of setup is good enough reason to blame nVIDIA.

There's no one to blame. Why in the hell would Nvidia support AMD? It's their proprietary code and it should work on their cards only. It's a very common business strategy.

About PhysX, I would say it's a first step toward enabling physics code to run on a GPU.
GPU physics should not be CPU optimized because, for that, the floating-point math has to be converted to fixed-point logic, since the CPU is more comfortable with the latter.

For example, 1.68 in floating point would be represented as the integer 1680 (using a scale factor of 1000).

Fixed-point math operations take away a level of precision when applying physics logic to code. AMD will follow Nvidia's path, but like Cilus said, it will cater to open standards rather than something proprietary, which is good imo.
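A minimal sketch of that fixed-point idea (my own illustration, not anything from PhysX): store 1.68 as the integer 1680 with a scale factor of 1000, and see how the fractional part gets truncated when two scaled values are multiplied.

```cpp
// Toy fixed-point representation with a scale factor of 1000.
#include <cstdint>
#include <cstdio>

constexpr int32_t SCALE = 1000;                       // keeps 3 decimal digits

int32_t to_fixed(double x)  { return static_cast<int32_t>(x * SCALE); }
double  to_float(int32_t q) { return static_cast<double>(q) / SCALE; }

// Multiplying two fixed-point values scales the result by SCALE twice,
// so we divide once; anything below 1/SCALE is truncated away.
int32_t fixed_mul(int32_t a, int32_t b) {
    return static_cast<int32_t>((static_cast<int64_t>(a) * b) / SCALE);
}

int main() {
    int32_t v  = to_fixed(1.68);                      // stored as 1680
    int32_t dt = to_fixed(0.016);                     // a 16 ms timestep -> 16
    printf("1.68 stored as %d\n", v);
    printf("fixed-point 1.68 * 0.016 = %f\n", to_float(fixed_mul(v, dt)));  // 0.026
    printf("floating    1.68 * 0.016 = %f\n", 1.68 * 0.016);                // 0.02688
    return 0;
}
```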

Future games will see a rise in GPU physics and will bring some incredible effects to the table.
 

Skud

Super Moderator
Staff member
There's no one to blame. Why in the hell would Nvidia support AMD? It's their proprietary code and it should work on their cards only. It's a very common business strategy.

About PhysX, I would say it's a first step toward enabling physics code to run on a GPU.
GPU physics should not be CPU optimized because, for that, the floating-point math has to be converted to fixed-point logic, since the CPU is more comfortable with the latter.

For example, 1.68 in floating point would be represented as the integer 1680 (using a scale factor of 1000).

Fixed-point math operations take away a level of precision when applying physics logic to code. AMD will follow Nvidia's path, but like Cilus said, it will cater to open standards rather than something proprietary, which is good imo.

Future games will see a rise in GPU physics and will bring some incredible effects to the table.


It's not about supporting AMD, it's about supporting games. Why doesn't each and every game have support for PhysX effects, even with nVIDIA cards? Not even all "the way it's meant to be played" games come out with PhysX support. And regarding future games, even Ageia predicted the same, and we are yet to see that future. And by the time that future comes, will these cards be able to handle those games, let alone with full PhysX effects?

At this point, just like 3D, PhysX is a moot point (read: gimmick). It cannot be used as a reference or a deal breaker/maker unless someone specifically asks for it, as there are very few games that actually take advantage of it. As rchi84 rightly said, it's more like the salad dressing than the actual vegetables.
 

vickybat

I am the night...I am...
It's not about supporting AMD, it's about supporting games. Why doesn't each and every game have support for PhysX effects, even with nVIDIA cards? Not even all "the way it's meant to be played" games come out with PhysX support. And regarding future games, even Ageia predicted the same, and we are yet to see that future. And by the time that future comes, will these cards be able to handle those games, let alone with full PhysX effects?

At this point, just like 3D, PhysX is a moot point (read: gimmick). It cannot be used as a reference or a deal breaker/maker unless someone specifically asks for it, as there are very few games that actually take advantage of it. As rchi84 rightly said, it's more like the salad dressing than the actual vegetables.

Who in the blue hell told you that 3D is a gimmick? It's one of the most talked-about features and a next step towards virtual reality. AMD is desperately trying to catch up with Nvidia in 3D.

Read this.

And talking about PhysX, not all games are supported because of a lack of GPU horsepower. In shader-heavy "Nvidia titles" like Crysis 2 or even the upcoming Battlefield 3, the games are so demanding that implementing PhysX would take a toll on the overall performance. So CPU physics is used as a compensation. Unreal Engine 3 is best suited for GPU physics currently because it's light and not resource heavy, so the GPU can compute physics code along with rendering simultaneously. You would want to mention Metro as it's a shader-heavy title as well, but enabling PhysX takes a toll on performance, and besides, it's not used extensively as the game has few open environments to show scattering, except a few. Metro: Last Light will also incorporate PhysX. Check this & this.

As GPUs become more powerful, we will see more and more games supporting GPU physics, and hopefully AMD will also join the bandwagon.


Contradicting rchi84's statement (he never saw the big picture), even CPU physics is salad dressing. After all, garnishing and dressing enhance the looks of any food and also give it higher nutritional value. Physics is the same, whether CPU or GPU physics.

So before calling anything simply a gimmick, think twice.
 

skeletor

Chosen of the Omnissiah
There again we see pseudo-gamedevs, who don't work in the field nor intend to, commenting on whether PhysX is a gimmick or not.

The excessive tessellation myth in Crysis 2 was busted not long ago when Crytek tessellated a plain concrete slab unnecessarily for no change in visual quality. Also, a mesh of tessellated water was flowing beneath the ground. A waste of GPU horsepower for no benefit, shall I say? Hate to say it, but AMD was right that "excessive" tessellation is not required.

The fact that you take a decent hit when you turn on PhysX tells me there are better ways to implement those flying-paper and breaking-glass effects. After all, they are mere flying paper and glass breaks. How many games using it are worth playing? I can name only 4. Metro 2033 doesn't qualify as worth playing, as I have quite high standards.

Do have a look at The Witcher 2. A game without any gimmicks from either side. Runs on DirectX 9. Best-looking game till now. That's what games should be like, without any proprietary gimmick like PhysX. It also busts the "DirectX 11 is OMFG EPIC" myth. At the end of the day, it all depends on the gamedev and how he chooses to develop his games using various APIs.

In the end, I am willing to spend Rs. 500 to play 100 games faster rather than spend Rs. 500 to play only 4 games (irrespective of the card manufacturer) with added effects which devs could have implemented in other ways. PhysX is a gimmick.

I'd rather take John Carmack's take on PhysX than some random poster's.

Coming back to the topic, spend more and get an HD 6950 2GB reference or factory-OCed card. That extra 1 GB of VRAM is going to do you more of a favour in the long run than the gimmicks of either GPU camp.
 

AcceleratorX

Youngling
IMO, NVIDIA's antics of late are due to desperation, i.e. heavy competition from AMD and their exit from the chipset market which is costing them revenue.

That being said, these days both NVIDIA and AMD GPUs are pretty competitive in price as well as performance. A lot could be said about NVIDIA, but the latest PhysX versions *do* have optimized multicore PhysX (including SSE2 if I remember correctly) and efforts to get PhysX working on Radeon cards were thwarted by AMD itself (by not offering any help whatsoever) and not NVIDIA who remained apathetic (read: Neutral) to the whole situation. Also note that NVIDIA never sued the group trying to make it work.

After that issue was forgotten, guess what? AMD announced a partnership to get Havok and Bullet working through DirectCompute. Results are preliminary at best, almost two years later.

And what about the good? The real truth is that both companies have done good things as well. NVIDIA has helped many small developers test and debug their games with its specialized TWIMTBP labs in Moscow, testing for all kinds of bugs, graphics-related or otherwise (even if they did add vendor-specific crap). And AMD has added features that work on all GPUs, including those of the competition.

There are issues with hardware, such as the angle-dependent Anisotropic filtering of NVIDIA or the less-than-optimal quality driver tweaking and possible AF bugs in AMD hardware.

Regardless of all this, the fact is that both companies have advanced semiconductor technology, enabled new features, and allowed PC gaming to advance and possibly survive. This is why both companies are important, historically or otherwise.

Also, the fact is that NVIDIA was the first to start the push for GPU computing and still excels at it, something in which AMD has traditionally followed NVIDIA rather than leading (I'm talking feature-wise here, reaction time, etc.).

So, given this basic rundown of the two companies, I really feel that at any price point, one GPU is as good as the other and there should be no inherent bias against AMD or NVIDIA - just get the product that has the best price to performance ratio! :)

As for the topic, I do feel the Radeon HD 6950 is slightly better than the GTX 560 Ti, but you should have a good look at the price difference between them. If prices are close, go for the 6950; else go for the 560 Ti, since the performance difference is not very significant.
 

Skud

Super Moderator
Staff member
Who in the blue hell told you that 3D is a gimmick? It's one of the most talked-about features and a next step towards virtual reality. AMD is desperately trying to catch up with Nvidia in 3D.

Read this.

And talking about PhysX, not all games are supported because of a lack of GPU horsepower. In shader-heavy "Nvidia titles" like Crysis 2 or even the upcoming Battlefield 3, the games are so demanding that implementing PhysX would take a toll on the overall performance. So CPU physics is used as a compensation. Unreal Engine 3 is best suited for GPU physics currently because it's light and not resource heavy, so the GPU can compute physics code along with rendering simultaneously. You would want to mention Metro as it's a shader-heavy title as well, but enabling PhysX takes a toll on performance, and besides, it's not used extensively as the game has few open environments to show scattering, except a few. Metro: Last Light will also incorporate PhysX. Check this & this.

As GPUs become more powerful, we will see more and more games supporting GPU physics, and hopefully AMD will also join the bandwagon.


Contradicting rchi84's statement (he never saw the big picture), even CPU physics is salad dressing. After all, garnishing and dressing enhance the looks of any food and also give it higher nutritional value. Physics is the same, whether CPU or GPU physics.

So before calling anything simply a gimmick, think twice.


I have read that Tom's article twice already, but what really do you want to convey? That 3D is commonplace, or that it's every gamer's next target? Or something else? And regarding the lack of PhysX: if it is a lack of GPU horsepower, then why recommend someone a card only because it supports PhysX?
 

vickybat

I am the night...I am...
I have read that Tom's article twice already, but what really do you want to convey? That 3D is commonplace, or that it's every gamer's next target? Or something else? And regarding the lack of PhysX: if it is a lack of GPU horsepower, then why recommend someone a card only because it supports PhysX?

If it were commonplace or every gamer's next target, you would have agreed that it isn't a gimmick, right? Just because it's expensive at the moment, or "not everybody has a 3D display", does not make 3D a gimmick.

Like I said, think twice.

Can you please quote who recommended a card here just because it supports PhysX? Do you even differentiate between PhysX and physics?
 

Joker

Guest
I'm with Skud, ICO, Cilus and AcceleratorX on this one.

Now, the GPU is fast for floating-point calcs... you don't have to use the PhysX API for offloading them... there are other ways too, and that DOES NOT mean running physics calcs on the CPU. Once you sit down and use a game development stack like DirectX 11 or OpenGL + SDL, you find plenty of ways.

PhysX is a gimmick. Educated people know this. Those who drink the Nvidia kool-aid don't.

AcceleratorX said:
but the latest PhysX versions *do* have optimized multicore PhysX (including SSE2 if I remember correctly)
I think Nvidia is still using x87 for PhysX. That's another reason why it runs slow on modern CPUs... SIMD instructions like SSE2, 3 and 4 are much faster on CPUs.
 

vickybat

I am the night...I am...
There again we see pseudo-gamedevs, who don't work in the field nor intend to, commenting on whether PhysX is a gimmick or not.

The excessive tessellation myth in Crysis 2 was busted not long ago when Crytek tessellated a plain concrete slab unnecessarily for no change in visual quality. Also, a mesh of tessellated water was flowing beneath the ground. A waste of GPU horsepower for no benefit, shall I say? Hate to say it, but AMD was right that "excessive" tessellation is not required.

The fact that you take a decent hit when you turn on PhysX tells me there are better ways to implement those flying-paper and breaking-glass effects. After all, they are mere flying paper and glass breaks. How many games using it are worth playing? I can name only 4. Metro 2033 doesn't qualify as worth playing, as I have quite high standards.

In the end, I am willing to spend Rs. 500 to play 100 games faster rather than spend Rs. 500 to play only 4 games (irrespective of the card manufacturer) with added effects which devs could have implemented in other ways. PhysX is a gimmick.

I'd rather take John Carmack's take on PhysX than some random poster's.

Coming back to the topic, spend more and get an HD 6950 2GB reference or factory-OCed card. That extra 1 GB of VRAM is going to do you more of a favour in the long run than the gimmicks of either GPU camp.

Hey welcome back.:smile:

I couldn't get your bold comments. Can you please throw some light?


Do have a look at The Witcher 2. A game without any gimmicks from either side. Runs on DirectX 9. Best-looking game till now. That's what games should be like, without any proprietary gimmick like PhysX. It also busts the "DirectX 11 is OMFG EPIC" myth. At the end of the day, it all depends on the gamedev and how he chooses to develop his games using various APIs.

Well, there's a game called Battlefield 3 in case you are not aware. I wonder how The Witcher 2 stands next to it in the visual department. Sorry to say, but it's heavy on tessellation. Only time will tell if it's done right or not. The tessellation code in Crysis 2 was not part of the original development but was patched in later. That's why there was unnecessary overuse of it.

Btw, The Witcher 2 is using Havok physics (CPU), which is not a proprietary physics engine, and AMD is busy implementing it on their GPUs. Yes, you heard right, Havok physics code will be processed by a GPU.

So the point was not PhysX but physics being implemented on a GPU rather than the CPU, and we had a debate on this before.

I'm with Skud, ICO, Cilus and AcceleratorX on this one.

Now, the GPU is fast for floating-point calcs... you don't have to use the PhysX API for offloading them... there are other ways too, and that DOES NOT mean running physics calcs on the CPU. Once you sit down and use a game development stack like DirectX 11 or OpenGL + SDL, you find plenty of ways.

Oh really?? Can you tell me what this is, mate?


AMD demonstrates Havok with GPU acceleration


What do you say now? OpenGL APIs are not meant for GPU physics; it's OpenCL rather, and AMD is pushing hard.
Good that it's not proprietary like PhysX.
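Just to sketch what GPU-offloaded physics through an open API looks like (a minimal illustration of my own, not AMD's or Havok's actual code): one per-particle integration step run as an OpenCL kernel, with error handling stripped for brevity. A DirectCompute or CUDA version of the same step would look very similar.

```cpp
// Minimal OpenCL sketch: gravity + integration for 1024 particles on the GPU.
#define CL_TARGET_OPENCL_VERSION 120
#include <CL/cl.h>
#include <cstdio>
#include <vector>

static const char* kSrc = R"(
__kernel void integrate(__global float* posY, __global float* velY, const float dt) {
    int i = get_global_id(0);
    velY[i] -= 9.8f * dt;        // gravity
    posY[i] += velY[i] * dt;     // move the particle
}
)";

int main() {
    const size_t n = 1024;
    std::vector<float> pos(n, 10.0f), vel(n, 0.0f);   // all particles start 10 m up

    cl_platform_id plat; clGetPlatformIDs(1, &plat, nullptr);
    cl_device_id dev;    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, nullptr);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, nullptr);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
    clBuildProgram(prog, 1, &dev, nullptr, nullptr, nullptr);
    cl_kernel k = clCreateKernel(prog, "integrate", nullptr);

    cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(float), pos.data(), nullptr);
    cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(float), vel.data(), nullptr);

    float dt = 1.0f / 60.0f;
    clSetKernelArg(k, 0, sizeof(cl_mem), &dPos);
    clSetKernelArg(k, 1, sizeof(cl_mem), &dVel);
    clSetKernelArg(k, 2, sizeof(float), &dt);

    clEnqueueNDRangeKernel(q, k, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(q, dPos, CL_TRUE, 0, n * sizeof(float), pos.data(),
                        0, nullptr, nullptr);
    printf("particle 0 height after one step: %f\n", pos[0]);
    return 0;
}
```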

PhysX is a gimmick. Educated people know this. Those who drink the Nvidia kool-aid don't.

I don't think PhysX is a gimmick because I don't see it the way you see it. I say that it's a first step toward implementing physics on a GPU rather than the CPU, and since it's proprietary, they chose not to optimize it for other platforms, including the CPU. If you read the above link, you will see that Nvidia's APEX toolset is similar to Havok. So both are trying to achieve the same thing but taking a different route.

I don't think I'm uneducated, and neither do I drink Nvidia kool-aid (seriously, I don't know what that means). Adding instruction set support doesn't make code optimized; the actual execution units matter. Instruction sets are a medium.
 