A New Graphic Card for my Gear

Status
Not open for further replies.

vickybat

I am the night...I am...
^^ Hey Jas, thanks a lot man. Now I'm starting to think I'm not as educated as I thought I was a little while ago. :grin:

@ everyone

Guys, take a chill pill and let's stop it right here. The OP will only get more confused. I guess the OP has decided on a 6950, and that's a good decision IMO.
 

Joker

Guest
lol @ "AMD demonstrates Havok running on GPU". That's a 2.5 year old vapourware link. I have got nothing to make of it, since every game dev implements physics in some form or the other, whether it runs off the CPU or the GPU.

PhysX is a marketing gimmick...GPU physics is not. ;)

lol btw, don't blow the trumpet on BF3... hype can make you taste sour grapes, like it has done with Crysis 2. :lol: A much improved The Witcher 2 version 2 launched yesterday. :p
 

Jaskanwar Singh

Aspiring Novelist
^^ Hey Jas, thanks a lot man. Now I'm starting to think I'm not as educated as I thought I was a little while ago. :grin:

@ everyone

Guys, take a chill pill and let's stop it right here. The OP will only get more confused. I guess the OP has decided on a 6950, and that's a good decision IMO.

you are welcome :)
 

vickybat

I am the night...I am...
@ joker

Yeah, I had much higher expectations of Crysis 2, but it wasn't up to its predecessor. Visually it was stunning though, to be honest.

I guess you were following the Crysis 2 thread seriously? :smile:

But PhysX is GPU physics, right? The only difference is that it's proprietary.
I guess that doesn't make it a gimmick. There will be future development on it, and more and more games are adding support for it.
 

AcceleratorX

Youngling
I think NVIDIA did try hard for PhysX to become a universal API for GPU physics, but neither Intel nor AMD supported it. It's not heavily documented, but some hints can be found in press releases:

Nvidia Helps Porting PhysX on Radeon - Softpedia

Softpedia said:
Yet, AMD was rumored to try developing its own PhysX a few weeks ago.

Softpedia said:
There's no doubt that Nvidia is more than delighted to see its API working on cards made by its strongest competitor, not to mention the threat it may represent to Havok.

Softpedia said:
Although AMD still refuses to provide access to any HD 4800 hardware, the support from other people allowed the CUDA Radeon library to be almost completed. All that is left to do, besides the huge amount of work the porting requires, is to convince AMD to aid the project, since its approval is mandatory at "developer and PR level".

Guess who didn't cooperate?

I'm sure the issues were a lot more complex than what's presented in a few press releases, but the fact is that AMD is not a saint in the woods either :D

As for PhysX vs. no PhysX, I do believe PhysX will not completely die, since it is by far the cheapest physics engine to integrate into any game (given its feature level compared to open-source ones).
 

skeletor

Chosen of the Omnissiah
I think NVIDIA did try hard for PhysX to become a universal API for GPU physics, but neither Intel nor AMD supported it. It's not heavily documented, but some hints can be found in press releases:

Guess who didn't cooperate?

I'm sure the issues were a lot more complex than what's presented in a few press releases, but the fact is that AMD is not a saint in the woods either :D
There is a reason why AMD didn't accept. It would have hurt their sweet-spot strategy. When AMD designs a GPU, they target a particular 'sweet spot' die size and design their GPUs to fit into it. They only add features which the engineering team thinks are worth adding. Intel has had a similar approach for CPUs since the NetBurst fiasco: only circuitry that yields > 2% performance increase for every 1% power consumption increase gets added.

Ever since the successful RV570/580 and the R600 fiasco, AMD's strategy has been to build chips around maximum performance per mm², maximum performance per watt, maximum yields, and designs that can easily be scaled across the range from low-end to high-end. They don't care about having huge 500 mm² monolithic dies with everything in them like nVidia does.

When they went from RV770 to RV870, they also removed features like the Sideport which were eating up die space, just to meet the desired die size.

Why doesn't nVidia release the PhysX SDK and documentation under a GPL or BSD license? Only that would make PhysX open. All its other licenses are proprietary or commercial.
 

Liverpool_fan

Sami Hyypiä, LFC legend
Guess who didn't cooperate?

I'm sure the issues were a lot more complex than what's presented in a few press releases, but the fact is that AMD is not a saint in the woods either :D

As for PhysX vs. no PhysX, I do believe PhysX will not completely die, since it is by far the cheapest physics engine to integrate into any game (given its feature level compared to open-source ones).
Why should AMD accept PhysX, which is owned and controlled by NVIDIA?
Why should they trust NVIDIA, which has a rather fishy record?
Why should they have to look for "hints" in incomplete documentation, with no guarantee of first-class support?
 

vickybat

I am the night...I am...
OK guys, now check what PhysX 3.0 brings to the table. PhysX 3.0 finally supports multicore CPUs, using the latest SSE instruction sets to take care of floating-point math operations. They claim this move will reach a broader user base.

Check this & this

Nvidia Releases PhysX 3.0

guru3d

It doesn't use the older x87 instruction set.

Arguably more noteworthy is a new Task Manager and managed thread pool, which "allows games to take advantage of multi-core processors on all platforms." You might recall that, last year, we discovered that certain games completely fail to implement PhysX in a way that takes advantage of multiple CPU cores—or even modern instruction sets like SSE. PhysX 3.0, it seems, is tackling that issue.
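
For the curious, here is roughly how that multi-core CPU dispatcher gets wired up in the PhysX 3.x SDK. This is only a minimal sketch based on the public 3.x headers, not official sample code, and exact names and signatures can differ slightly between point releases:

Code:
// Minimal PhysX 3.x scene setup showing the multi-threaded CPU dispatcher.
// Sketch only -- follows the public 3.x SDK headers; link the PhysX 3.x libs.
#include <PxPhysicsAPI.h>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity      = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.filterShader = PxDefaultSimulationFilterShader;

    // The multi-core part: a CPU dispatcher with 4 worker threads.
    // The 3.x task manager farms simulation work out to these threads,
    // instead of running everything on one core as older 2.x code often did.
    PxDefaultCpuDispatcher* cpuDispatcher = PxDefaultCpuDispatcherCreate(4);
    sceneDesc.cpuDispatcher = cpuDispatcher;

    PxScene* scene = physics->createScene(sceneDesc);

    // Step the simulation at 60 Hz; work is dispatched to the thread pool.
    for (int i = 0; i < 600; ++i)
    {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);  // block until all worker tasks finish
    }

    scene->release();
    cpuDispatcher->release();
    physics->release();
    foundation->release();
    return 0;
}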



Why doesn't nVidia release the PhysX SDK and documentation under a GPL or BSD license? Only that would make PhysX open. All its other licenses are proprietary or commercial.

This will never happen.
 

vickybat

I am the night...I am...
^^ Yeah, you're right mate. I'll PM a super moderator and hopefully he'll shift the PhysX-based posts to a new thread.
 

skeletor

Chosen of the Omnissiah
x87 for the CPU path of their GPU PhysX, so that they can say "FPU calculations on GPUs are so much faster than on CPUs", when that CPU code clearly wasn't even using SSE.

Now, when it comes to actually supporting the CPU, they use SSE because they _had_ to. :rolleyes: I wonder what stopped them from using SSE all along? Perhaps there would have been no GPU benefit over the CPU then? :lol: But then they wouldn't have been able to say that GPUs are better than CPUs for physics calculations. A self-created myth and cult + clever marketing.

This again tells me we are arguing over a non-issue, and PhysX is nothing more than a gimmick. nVidia fanboys don't seem to believe this. Gamers thinking they are game devs is the last thing this world needs.
 

Joker

Guest
PhysX now natively supporting multicore CPUs?? That's an acceptance of defeat. Doesn't it defeat the very purpose nvidia GPU PhysX was born for??

edit: +1 to what ICO said.
 

vickybat

I am the night...I am...
@ joker

No, it's far from that. You still need an nvidia GPU to make it work, and this time both the CPU and the GPU will work together.

You know, it's all about diversification. The PhysX SDK now supports a wider range of hardware, i.e. from PCs to game consoles like the PS3 and Xbox 360.
Look at the big picture before posting. Nvidia is not targeting AMD or any other (physics) competitor but aiming for a wider audience.

Great move btw.

physxinfo.com/wiki/PhysX_SDK_3.x

I don't know much about this, but ico's posts do make more sense.

If you don't know much, then how can you say his posts make more sense?
Stop trolling, please.
 

Liverpool_fan

Sami Hyypiä, LFC legend
@ joker

No, it's far from that. You still need an nvidia GPU to make it work, and this time both the CPU and the GPU will work together.

You know, it's all about diversification. The PhysX SDK now supports a wider range of hardware, i.e. from PCs to game consoles like the PS3 and Xbox 360.
Look at the big picture before posting. Nvidia is not targeting AMD or any other (physics) competitor but aiming for a wider audience.

Great move btw.

PhysX SDK 3.x - PhysX Wiki
How the hell is NVIDIA targeting a wider audience if you need both the CPU and the GPU to work together? This makes absolutely no sense.
I don't see the real point of NVIDIA PhysX on portable devices either, to be honest, if that's the wider audience they are targeting.
 

skeletor

Chosen of the Omnissiah
Why didn't they use SSE for CPU PhysX before, then? Why x87?
Here is the reason:
The other thing is that PhysX can run on the CPU very smoothly if the code has been optimized to use the SSE (1, 2, 3 or 4) instruction sets of the CPU, but Nvidia used unoptimized x87 code for CPU PhysX execution to forcefully reduce CPU PhysX performance and glorify their cards. It is not only my word; all the major review sites like Guru3D, Tom's Hardware and Anandtech have said the same over and over.
Good post there by Cilus. I had somehow missed your post. :oops: This is exactly what I also meant in my previous post (#71).

x87 is antique and became outdated on CPUs 12 years ago when SSE succeeded it. nVidia stuck with x87 in CPU PhysX only to falsely glorify its GPUs. Using SSE there would have resulted in no marketing aura and no pseudo-benefits.

This is a terrific read if you guys haven't read it: Real World Technologies - PhysX87: Software Deficiency
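
To make the x87 vs. SSE point concrete, here is a tiny, purely illustrative scalar float loop (it is not from PhysX, and the file name in the build comments is made up). On 32-bit x86 the compiler flag alone decides which instruction set the very same source compiles down to:

Code:
// Toy illustration of the x87 vs. SSE code-path difference discussed above.
// Build the same file two ways on 32-bit x86 with GCC and compare:
//   g++ -O2 -m32 -mfpmath=387         fpu_demo.cpp -o fpu_x87
//   g++ -O2 -m32 -msse2 -mfpmath=sse  fpu_demo.cpp -o fpu_sse
// (On x86-64, scalar SSE math is already the default.)
#include <cstdio>

// A deliberately simple spring-damper style update: the kind of scalar
// floating-point math a physics engine runs thousands of times per frame.
float integrate(float x, float v, float dt, int steps)
{
    const float k = 50.0f;  // spring stiffness
    const float c = 0.5f;   // damping
    for (int i = 0; i < steps; ++i)
    {
        float a = -k * x - c * v;  // acceleration
        v += a * dt;
        x += v * dt;
    }
    return x;
}

int main()
{
    // Same source, different emitted instructions: the x87 build uses the
    // legacy stack-based FPU, the SSE build uses scalar SSE registers.
    float result = integrate(1.0f, 0.0f, 1.0f / 60.0f, 10000000);
    std::printf("final position: %f\n", result);
    return 0;
}

Disassembling the two builds (e.g. with objdump -d) shows fld/fmul/fstp-style x87 sequences in the first and scalar SSE (mulss/addss) in the second, which is essentially the gap the RWT article measured in the PhysX 2.x CPU libraries.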
 

Skud

Super Moderator
Staff member
LOL. This thread is on page 3 and half of the posts are related to PhysX, which the OP never asked about. I guess, by this time, even the OP has completely forgotten why he created this thread in the first place. Chill, guys. ;)
 

mithun_mrg

Cyborg Agent
I will post a small comment here: if PhysX were a gimmick, one of the most knowledgeable members of TDF, i.e. Cilus, wouldn't have used a dedicated card for it.
The problem is that to take full advantage of PhysX you need the power of an 8800/9800-class card.
It generally enhances the look and feel and, most importantly, the gameplay. The impact PhysX has is much better than Havok or Frostbite; can you deny this?
Game physics is as important as graphics. Remember the first time you played Max Payne 2?
 