The Big Question?

Status
Not open for further replies.

[digitt]

Broken In
"Every other day, there's a chap out here posting about which graphics card to buy. This has made me, and quite a few others, pretty irritable since people simply don't bother to scroll down the page and get the answers they seek. "

Those are the first three lines of the topic The Complete Graphics Card List.
Now, anyone with in-depth knowledge of graphics cards will be frustrated, but what about those of us who have no knowledge whatsoever? Guys like me who want to play all the latest games but don't know much about graphics cards. So can we have something that explains, in simple words, everything about graphics cards, so that we can understand the difference between a GeForce FX 5200 and a 5700, and also understand the whole argument about OpenGL and DirectX? And also some of the topics in this forum as well, because they are simply too technical for a normal guy to understand.
 

theraven

Technomancer
First of all, welcome to the forum.
Second, there's already a thread started by gxsaurav for all graphics card needs, so check it out.
As for the DX9 and OpenGL argument, it was in another thread, I don't remember which, but gxsaurav and anidex had a whole argument on that :D
So that should be pretty clear.
Just look around and you'll find it.
 

[digitt]

Broken In
I did go through the whole thread and I understood some things, but there are some things which are still too technical. What I'm saying is, can we have something that explains all of this in a simple manner, so that we don't have to ask the same questions again and again?
 

theraven

Technomancer
Well, wait for the experts, if they are willing :)
I just pointed you in the right direction.
Wish I could be of more help.
 

anidex

Broken In
Post your list of questions and I'll try to answer them to the best of my abilities (of course, there's always good old gxsaurav if I falter :lol:).
 

[digitt]

Broken In
If you could explain anti-aliasing, anisotropic filtering, texture preference, mipmap detail, vertical sync, shaders and all that stuff.
I also want to know what all the scores in the graphics card lists were about, and that image thing as well. :?:
 

gxsaurav

Guest
Phew, after a long day of reinstalling Windows I'm back. Did I miss something here?

So, you want to know about graphics cards. Well, that is a long list of definitions, starting from the year 1995 or even before that, but I will tell you what you asked just now.

1) Anti Aliasing

You should know that when a game is made, it isn't made totally in 3D. Look at a wall or a box in the game: it has a texture, a colour or a lifelike cover on it. Consider a box that is brown in colour. Where does that brown come from? It is made in software like Photoshop. The artists make it and save it as a texture, which they can apply to any cube-shaped object so that it looks like a box in a 3D environment. Make a cube and apply the texture to it, just as you would make a big square and apply a wall texture to make it look like a wall.

Seen head-on in 2D, all the edges look flat, because the image is not rotated; the edge of almost every image or texture is flat or square-shaped. But in the 3D world we usually look at it from a different angle than the one it was made at in Photoshop, and this makes the edges look jaggy or zigzag. Here is where anti-aliasing comes in: basically, it blurs the edges of everything in the 3D environment so they look smooth from a distance. Now even if you look at a box from an angle, the edge is still blurred, giving you the impression that the image is smooth at the edges. This is done in such a way that, although the edges are blurred, the overall image quality is not reduced.
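The edge-smoothing idea can be sketched in a few lines of Python (a toy illustration, not how a GPU actually does it): render the scene at a higher resolution, then average blocks of samples down to screen pixels, so edge pixels end up with in-between values instead of a hard 0/1 step.

```python
# Toy supersampling: render a diagonal edge at 4x resolution,
# then average each 4x4 block down to one screen pixel.

def render_hi_res(size, factor):
    """1.0 where y < x (below the diagonal), else 0.0."""
    n = size * factor
    return [[1.0 if y < x else 0.0 for x in range(n)] for y in range(n)]

def downsample(img, factor):
    n = len(img) // factor
    out = []
    for y in range(n):
        row = []
        for x in range(n):
            block = [img[y * factor + j][x * factor + i]
                     for j in range(factor) for i in range(factor)]
            row.append(sum(block) / len(block))  # average the samples
        out.append(row)
    return out

aliased = render_hi_res(4, 1)                 # hard 0/1 edge: jaggies
smooth = downsample(render_hi_res(4, 4), 4)   # edge pixels get fractional values
```

Pixels far from the edge stay pure 0 or 1; only the pixels the diagonal crosses get a blended value, which is why the overall image doesn't look blurry, just the edges.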

2) Anisotropic Filtering: better ask anidex, I don't have a complete explanation, mine is jumbled

3) Texture Preference

This sets how sharp the textures will look. In a 3D world, an ultra-clear texture eats a lot of performance, while a slightly optimized view of the texture saves the graphics card a lot of work. It's something like optimizing a file when saving for the web: quality is reduced by 10%, but the file size decreases a lot. An ultra-clear texture is uncompressed.

4) Vertical Sync

Reviews usually disable it during benchmarking, so people think it should be disabled when playing games too, but it should be enabled.

No matter how good your graphics card is, it will not draw frames faster than your monitor's refresh rate. If the refresh rate is set to 85 Hz, then no matter what you do, your in-game FPS shown on screen will never go over 85. People don't realize this and buy graphics cards based on benchmarks, not knowing that even the best monitors don't go over 120 Hz, so they will see a maximum of 120 fps on screen no matter what.

With V-Sync (vertical sync) enabled, your maximum frame rate is locked to the refresh rate of your monitor, 85 Hz for example, and you get the smoothest frame rate, because the graphics card is drawing 85 fps and the monitor is also showing 85 fps. There is no tearing in the image, as seen when V-Sync is disabled. If it is disabled, frames are drawn faster than your monitor refreshes, so they go out of sync and you see tearing: say, the upper 30% of the screen is from the last frame and the remaining 70% is from the current frame.

Modern benchmarking software disables V-Sync automatically if the drivers haven't, but you should enable V-Sync in games when playing, not when benchmarking, because with V-Sync enabled you will never score more than 85 FPS no matter what.

Another thing is playing at a higher refresh rate. Windows XP SP2 automatically locks games so that a DirectX-based game never goes above 75 FPS and an OpenGL game never goes above 60 FPS, no matter what. This is lifted when benchmarking, but not when playing, so you can force the refresh rate to something above 75 Hz from the dxdiag.exe control panel. I have set it to 85 Hz, as I play all games at 1024x768 with 2XQ AA and 2X aniso on an FX5900XT, with the refresh rate fixed at 85 Hz and forced through dxdiag. I get about 52 fps when benchmarking with Doom 3, but when I play I even get 60 fps, and when there are many demons on screen the game slows down but still gives about 30 fps, with V-Sync enabled.
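The way V-Sync caps the visible frame rate can be modelled in a few lines. This is a simplified sketch: with classic double buffering, a finished frame has to wait for the next refresh, so when the card can't keep up, the displayed rate snaps to the refresh rate divided by a whole number, which is why a card benchmarking at 52 fps can show 42.5 fps in play at 85 Hz.

```python
import math

def vsync_fps(render_fps, refresh_hz):
    """Simplified model of V-Sync with double buffering: the displayed
    rate is capped at the refresh rate; below it, each frame waits for
    the next refresh, so the rate snaps to refresh_hz / n for whole n."""
    if render_fps >= refresh_hz:
        return float(refresh_hz)
    n = math.ceil(refresh_hz / render_fps)
    return refresh_hz / n

print(vsync_fps(200, 85))  # a fast card is still capped at 85.0
print(vsync_fps(52, 85))   # a 52 fps card shows 42.5 (85 / 2)
```

Triple buffering and modern variable-refresh displays behave differently; this only captures the classic double-buffered case described above.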

5) Pixel & Vertex Shader

Pixel shaders are small programs inside games, made to compute something on the fly, like ripples on water or some new effect or calculation.

Vertex shaders are geometry-based: all the geometry calculations are done in vertex shaders. You can write a small program that turns a cube into a sphere in the game, and that small program is called a vertex shader.

The scores you saw are just a demonstration of where a graphics card stands. The benchmarking software simply runs some heavy pixel and vertex shaders and measures how fast the card does the calculations: the faster the card, the higher it scores. However, as I said in the V-Sync section, this doesn't apply to real-world gaming, where the fps is locked.
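The "small program run for every pixel" idea can be sketched in Python (purely illustrative; real shaders are written in languages like HLSL or GLSL and run on the GPU). Here a ripple function is evaluated once per pixel of a tiny framebuffer, which is exactly the shape of work a pixel shader does:

```python
import math

def ripple_pixel_shader(x, y, t):
    """Toy 'pixel shader': returns a brightness in [0, 1] for one pixel.
    A real pixel shader runs a function like this once per pixel, per frame."""
    d = math.sqrt(x * x + y * y)          # distance from the top-left corner
    return 0.5 + 0.5 * math.sin(d - t)    # expanding rings, like water ripples

# the 'GPU' applies the shader to every pixel of an 8x8 framebuffer
frame = [[ripple_pixel_shader(x, y, t=0.0) for x in range(8)] for y in range(8)]
```

Increasing `t` frame by frame makes the rings move outward, which is the "on the fly" effect the post describes; a vertex shader would instead run once per vertex and move geometry rather than colour pixels.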

Anything else?...
-------------------------------------------------------------------------
Raboo is going to kill me for writing this long
---------------------------------------------------------------------------
 

[digitt]

Broken In
Thanks for the info, gxsaurav (you work hard, man).
I have some more questions.
What is AGP 4X or 8X? :?:
What is the difference between the technology NVIDIA uses and the tech ATI uses? :?:
Last but not least, what is the difference between OpenGL and DirectX? :?:
 

gxsaurav

Guest
OK, more questions. Here goes.

1) AGP & 4X, 8X

AGP is a bus made by Intel about 9 years back specifically for graphics cards; before that we had PCI-based graphics cards. AGP starts where PCI left off: PCI's maximum bandwidth (data from CPU to RAM to the card) was 133 MB/s, while AGP started at 266 MB/s with AGP 1X, then came AGP 2X (533 MB/s), AGP 4X (1.07 GB/s) and AGP 8X (2.13 GB/s).

The difference between AGP 4X and 8X is asked about a lot. Basically it depends on the game; modern games don't even use the bandwidth provided by AGP 8X completely. This is the reason an AGP 8X card installed in an AGP 8X motherboard performs the same as when installed in an AGP 4X motherboard. If you have an AGP 4X motherboard, stick with it and just get a new card, which will be AGP 8X anyway. But if you are buying a new system, you won't get AGP 4X any more; AGP 8X rules the market.
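The figures above follow from AGP's 32-bit (4-byte) data path and 66 MHz base clock, with each "x" mode multiplying the transfers per clock. A quick back-of-the-envelope check in Python (the real base clock is 66.66 MHz, so the commonly quoted numbers come out slightly higher than this rounded version):

```python
BASE_CLOCK_MHZ = 66      # AGP base clock (really 66.66 MHz)
BYTES_PER_TRANSFER = 4   # 32-bit bus

def agp_bandwidth_mb_s(mode):
    """Approximate peak bandwidth of AGP 1x/2x/4x/8x in MB/s."""
    return BYTES_PER_TRANSFER * BASE_CLOCK_MHZ * mode

for mode in (1, 2, 4, 8):
    print(f"AGP {mode}x ~ {agp_bandwidth_mb_s(mode)} MB/s")
```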

2) ATI vs. NVIDIA Technology wise

The complete explanation can be found on the net. Basically, both are DX9 and OpenGL compliant, but ATI uses some different tricks than NVIDIA to do the same thing, so in the end we get almost the same image quality on both cards; only the way it is done inside the card differs.

3) DirectX & OpenGL

They are both graphics APIs, and they both do the same thing on screen.

DirectX is made by Microsoft and is only for Windows; it is built in such a way that it works best with Windows, so it is not supported on other OSes like Mac and Unix. DirectX is not only for gaming: look at your Start menu, even those system things are drawn through DirectX, not directly, but it is an engine that runs anything visual and shows it to you on the monitor.

OpenGL is an open graphics API; it was made by SGI in 1992 as a standard graphics API, so that all the manufacturers who were claiming their own API was better than the others could follow a standard API while still keeping their own API for their cards.

Back then we had many graphics chip makers, like 3dfx, ATI, NVIDIA, Permedia, 3DLabs, Matrox, Trident, S3, etc. Now we are down to only two. Many of the companies were bought by others, like NVIDIA bought 3dfx and ATI bought ArtX. Some are still working, but in lower-end markets: Trident is still good at 2D for laptops and PDAs, and S3 was bought by VIA and now makes VIA's onboard graphics.

OpenGL has an ARB (Architecture Review Board), a board of many manufacturers such as Apple, SGI, S3, HP, ATI and NVIDIA. They work together to develop a common standard, so that the resulting API is compatible with every manufacturer's card, not just one. Remember 3dfx Glide? It was made for 3dfx cards only. It worked really well at the time, on 3dfx cards, but it didn't work on other cards, so developers were forced to write their engines twice, once for Glide and once for another API. That's why, after that, everyone preferred OpenGL and DirectX as the standards that all must follow. Unreal Tournament was likewise written for Glide, DirectX and a separate API for S3 graphics.

OpenGL is platform-independent; it runs on Windows, Mac, Linux, Unix, Solaris, anything. This is the reason all the 3D software like Maya and 3ds Max use OpenGL: the same code will run anywhere with only about 5% change required. Even scientific applications, like satellite simulation and supercomputing data visualization, are made with OpenGL.

The interface of the Mac, considered the most beautiful, is based on OpenGL, and the interface of Linux uses OpenGL as well. Games made in OpenGL will also work on Linux and Mac with, as I said, only about 5% change required, but anything made in DirectX will not run on Mac or Linux.

As software, the two don't differ a lot. DX 8.0 brought pixel shader support, and now we have PS 3.0 with DX 9.0c, but OpenGL 2.0 is also out, with support for PS 3.0 just like DX 9.0c. So now it's up to the game developer: he can use DirectX, which is already optimized for the Windows environment, or use OpenGL, in which case the game can run on Windows, Mac and Linux, capturing a bigger market, including the growing market of Linux gaming.

If he is using DirectX, he doesn't have to optimize for anything else; but if he is using OpenGL, he writes 95% of the code once and then optimizes the remaining 5% separately for Windows, Mac and Linux, because an optimization for Windows won't work on the Mac: Windows runs on x86 while the Mac runs on PowerPC.
 

[digitt]

Broken In
Thanks for all this, gxsaurav :D
One last small question.
I have a P4 1.7 GHz with an Intel 845 chipset, which has AGP 4X (and doesn't support DDR).
I'm going to get a GeForce FX 5200 128 MB, but I think it's 8X only.
So should I change my motherboard to an 8X one? Will it make any difference in terms of performance?
 

gxsaurav

Guest
Oh my god, you asked all that just to find out what would be good?

Anyway, go for the FX5200; you won't have any trouble performance-wise.
 

[digitt]

Broken In
Hey man, thanks for the info.
I asked because you said it won't affect performance if you have an AGP 4X motherboard and an 8X graphics card.
By the way, all the info you gave helped a lot in understanding many things.
 

anidex

Broken In
Sorry for being late, but I'm glad to see that gxsaurav answered most of your questions :). Anyway, here are my thoughts :-

Antialiasing :- At low resolutions the pixels are larger than at higher resolutions. Because of this, when a 3D scene is rasterized, the edge between two adjacent objects looks coarse. To minimize this, two popular methods have been developed:

1. Supersampling - The original image is rendered at a much higher resolution and then scaled down to the required size. Since this procedure takes up a lot of video memory, it isn't currently viable in real time (though nVIDIA provides this option on the NV4x series, it is limited to 2X oversampling).

2. Multisampling - Here, one simply samples the neighbouring pixels around the current pixel and averages them. This is much cheaper to do and gives quite good results. This is the so-called "anti-aliasing" feature that we see on today's cards.
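The averaging of neighbouring samples described above can be sketched like this (a crude toy on a plain 2D image; real multisampling hardware works on coverage samples within each pixel, but the averaging step is the same idea):

```python
def multisample(img, x, y):
    """Average a pixel with its in-bounds neighbours, a stand-in for
    how multisampling blends samples along object edges."""
    h, w = len(img), len(img[0])
    samples = [img[j][i]
               for j in range(max(0, y - 1), min(h, y + 2))
               for i in range(max(0, x - 1), min(w, x + 2))]
    return sum(samples) / len(samples)

# a hard vertical edge between a dark and a bright object
edge = [[0, 0, 1, 1],
        [0, 0, 1, 1],
        [0, 0, 1, 1],
        [0, 0, 1, 1]]
smoothed = [[multisample(edge, x, y) for x in range(4)] for y in range(4)]
```

Pixels away from the edge keep their value, while the two columns touching the edge take on intermediate values, softening the transition.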

Anisotropic Filtering :- Anisotropy is basically the distortion of the pixels of an object that is oriented at some arbitrary angle to the screen. To minimize this distortion, we use anisotropic filtering.

Texture Preference and MipMap Detail :- Textures are usually authored at high resolutions like 1024x1024 or 2048x2048. They take up huge amounts of memory, so it would be wasteful to use such high-resolution textures for objects that are far away or that occupy a very small area of the screen. So the textures are scaled down multiple times, and each scaled-down copy is called a mip-map. Based on the above two factors, the appropriate mip-map is used for rendering. Higher MipMap Detail settings mean the original textures are scaled down to a lesser extent, whereas lower MipMap Detail settings mean they are scaled down considerably.
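The mip-map chain and the distance-based selection described above can be sketched as follows. The selection rule is a simplification (real hardware picks levels from on-screen texel density, not raw distance), and `detail_bias` is a hypothetical stand-in for the "MipMap Detail" slider:

```python
import math

def mip_levels(size):
    """Resolutions of a square texture and its scaled-down copies (mip-maps)."""
    levels = [size]
    while levels[-1] > 1:
        levels.append(levels[-1] // 2)
    return levels

def pick_level(distance, detail_bias=0):
    """Toy selection rule: farther objects use smaller (higher-numbered)
    mips; a higher detail setting biases toward the sharper levels."""
    return max(0, int(math.log2(max(distance, 1.0))) - detail_bias)

print(mip_levels(1024))               # 1024 down to 1, halving each time
print(pick_level(16))                 # a distant object gets a small mip
print(pick_level(16, detail_bias=2))  # higher detail keeps it sharper
```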

Vertical-Sync :- The graphics card has a memory surface that represents the contents being displayed on the monitor. This memory surface is called the front buffer. Every time the monitor is refreshed, the front buffer data is displayed on it. But the refresh rate of a monitor is quite low (60 Hz, for example). If the front buffer is updated while the monitor is being refreshed, the image displayed on the monitor will appear cut in the middle, the upper portion containing the new image and the lower portion the old one. This is called tearing. It can be prevented by two methods, vertical sync and back buffering. With VSync enabled, front buffer updates are not shown on the monitor until it has completely refreshed.

Shaders :- Real-time shaders are classified into 2 groups, vertex and pixel shaders. Shaders are small programs that run on the graphics card and define how each vertex and pixel should be processed. Before programmable graphics cards were invented, hardware manufacturers used to code such techniques into the hardware. But with programmable graphics technology, programmers can specify how each vertex on every object within the game should be processed (using vertex shaders) and how each pixel on the screen (that make up those objects) should be processed (using pixel shaders).

The application in that post simply runs a couple of complex vertex and pixel shaders on a graphics card to determine how well it can take the heat ;). The framerates indicated by the app give a good estimation as to how good one's graphics card is.

Hope that helped :).
 

gxsaurav

Guest
Hmm, that did help. Now if only you could stop abusing us NVIDIA fans.
 

gxsaurav

Guest
Oh! By the way, don't you think this is still too technical for a n00b to understand?
 

gxsaurav

Guest
And now we have GX vs. Anidex again.

Write the same thing again, but this time use something simpler than my language. Hey, give me tuition: what is a back buffer? Oh, I get it, some FanATIc term. Hmm, ATI is better, mmm, here we go again. Well, I won't fight this time, so you are unofficially the winner.
 

[digitt]

Broken In
Thanks for all the info.
Maybe you guys should write a book (gxsaurav vs. anidex would be a good title).
Just feel free to share anything a beginner might need to know.
 

anidex

Broken In
The backbuffer is an off-screen surface to which the scene is rendered. The contents of the back and front buffers are then swapped to display the contents on the monitor.
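Those two sentences describe classic double buffering. A minimal sketch of the idea, with strings standing in for frame images:

```python
class DoubleBuffer:
    """Minimal sketch of front/back buffering: draw into the off-screen
    back buffer, then swap, so the monitor never sees a half-drawn frame."""

    def __init__(self):
        self.front = "frame 0"   # what the monitor is currently showing
        self.back = None         # where the next frame is drawn off-screen

    def render(self, frame):
        self.back = frame        # drawing happens out of sight

    def swap(self):
        self.front, self.back = self.back, self.front

buf = DoubleBuffer()
buf.render("frame 1")   # monitor still shows "frame 0" while this draws
buf.swap()              # the finished frame becomes visible in one step
```

Combined with V-Sync, the swap is additionally held until the monitor finishes a refresh, which is what eliminates tearing.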
 

plasmafire

Journeyman
With back buffering there is, in effect, a virtual monitor that can support a much higher resolution and refresh rate, because it is not hardware, just software.

What happens on that virtual monitor is then shown on the real hardware monitor, like a movie.

This is an answer for noobs.
 