FPS Vs. Refresh Rate


siriusb (Cyborg Agent)
That question of yours has been answered by theraven.
Suppose your monitor frequency is 60 Hz: anything in-game above 60 fps is just redundant!

And the reason why we have faster cards:
If one card can render a scene at 100 fps while another card manages only 60 fps on the same scene, then the 100 fps card can render much tougher scenes at 60 fps while the other card drops to 20 fps (it does not follow a linear rule, of course).
And the 100 fps figure quoted for a game is only the average frame rate over the entire game.
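To put rough numbers on that, here is a small Python sketch of the frame-time arithmetic. The 2.5x "heavier scene" factor is invented purely for illustration, not taken from any benchmark, and as said above real scaling is not this linear.

```python
# Frame rate is just 1 / (seconds per frame), so a faster card has more
# time budget left over per frame for heavier scenes.

def frame_time_ms(fps: float) -> float:
    """Convert frames per second into milliseconds spent per frame."""
    return 1000.0 / fps

fast_card = frame_time_ms(100)   # 10.0 ms per frame
slow_card = frame_time_ms(60)    # ~16.7 ms per frame
print(f"100 fps card: {fast_card:.1f} ms/frame of budget")
print(f" 60 fps card: {slow_card:.1f} ms/frame of budget")

# Hypothetical scene that takes 2.5x as long per frame (made-up factor):
for label, fps in (("fast", 100), ("slow", 60)):
    print(f"{label} card on a 2.5x heavier scene: ~{fps / 2.5:.0f} fps")
```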

Since this is fun, let me try to explain it in a completely different way:
Say your monitor is set to a 60 Hz refresh rate and you are running a game. The game engine calculates the frames using the GPU, and the end result (to be displayed) is stored in the "frame buffer". This is the important thing to understand: the game may run at even 1000 fps, but only the most recent frame is placed in the frame buffer.

The video card contains another component called a RAMDAC. This component periodically reads the frame buffer, converts the digital bitmap representation into an analog signal, and sends it to the analog monitor. This periodic reading by the RAMDAC is in sync with the refresh rate of the monitor, which here is 60 Hz.
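If it helps, here is a toy Python model of that pipeline: a single slot stands in for the frame buffer, the engine overwrites it as fast as it can, and a periodic read stands in for the RAMDAC scanning it out at 60 Hz. The numbers and structure are simplified assumptions, not real driver code.

```python
# Toy simulation: the game writes frames into one "frame buffer" slot as fast
# as it can, while the display side samples that slot only 60 times a second.
# Only the most recent frame ever reaches the screen.

GAME_FPS = 1000          # how fast the engine produces frames (hypothetical)
REFRESH_HZ = 60          # monitor refresh rate
SIM_SECONDS = 1

frame_buffer = None      # holds the latest completed frame (just an id here)
displayed = []           # frames the "monitor" actually gets to show

game_interval = 1.0 / GAME_FPS
scan_interval = 1.0 / REFRESH_HZ

next_scan = 0.0
frame_id = 0
t = 0.0
while t < SIM_SECONDS:
    frame_buffer = frame_id          # engine overwrites the buffer with the newest frame
    frame_id += 1
    if t >= next_scan:               # RAMDAC-style periodic read, in sync with refresh
        displayed.append(frame_buffer)
        next_scan += scan_interval
    t += game_interval

print(f"frames rendered : {frame_id}")        # ~1000
print(f"frames displayed: {len(displayed)}")  # ~60
```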

If that was new to you, here's a fun fact too: some games support a "V-Sync" option that synchronizes the game engine so it produces frames in step with the refresh rate. This is done to prevent 'screen tearing', an effect that occurs when the game engine updates the frame buffer before the RAMDAC has finished reading the buffer's previous contents. The result is an image whose top portion comes from the previous frame and whose bottom portion comes from the next frame.
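And a tiny sketch of why tearing happens and what V-Sync changes, continuing the toy model above: if the buffer is swapped while the scanout is halfway down the screen, the displayed image mixes two frames; if the swap waits until the scanout finishes, you always see a whole frame. Again, this is an illustrative model, not how a real driver or GPU implements buffering.

```python
# Tearing vs. V-Sync, with a 4-"line" screen read top to bottom.

def scanout(read_buffer, lines=4):
    """Read the buffer line by line; the buffer may change mid-read."""
    return [read_buffer() for _ in range(lines)]

# Without V-Sync: the engine replaces the frame while scanout is in progress.
current = {"frame": 1}
reads = []

def read_no_vsync():
    if len(reads) == 2:          # engine finishes frame 2 halfway through scanout
        current["frame"] = 2
    value = current["frame"]
    reads.append(value)
    return value

print("no v-sync :", scanout(read_no_vsync))   # [1, 1, 2, 2]  -> torn image

# With V-Sync: the new frame is only swapped in after the scanout completes.
shown, pending = 1, 2
print("with v-sync:", scanout(lambda: shown))  # [1, 1, 1, 1]  -> whole frame
shown = pending                                # the swap waits for the refresh boundary
```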
 
d (Journeyman, OP)
About RAMDACs and V-syncing, I knew, mate... but the one about frame tearing sounds cool. Thanks anyway... I can sleep peacefully, knowing that I was right.
 