anomit said:
You are totally confused over the meaning and significance of the terms.
The OP is asking the question because he is confused.
When a cinema projector or a VCR plays a movie, it shows, in sequence, individual pictures that were already stored as full frames. But when a graphics card renders a picture, it has to draw every single pixel, dot by dot. On an 800x600 screen that's 480,000 pixels, and each pixel has its own colour and brightness information that the GPU also has to process. That's a lot of work, especially when the game is constantly sending a stream of ever-changing information. The more detail the image contains, the more work the GPU has to do and the longer it takes to finish drawing each frame. That's where a slow GPU falters and can draw only a few frames per second (fps), resulting in jerky movement on the screen. A high fps number gives a smooth display of moving objects.
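To put rough numbers on that workload, here's a quick back-of-the-envelope sketch in Python (the frame rates are just assumed examples, not benchmarks) of how many pixels the GPU has to produce each second at 800x600:

```python
# Back-of-the-envelope sketch (assumed frame rates, not benchmarks):
# how many pixels the GPU must produce per second at a given
# resolution and frame rate.

def pixels_per_second(width, height, fps):
    """Total pixels the GPU has to draw each second."""
    return width * height * fps

for fps in (30, 60):
    total = pixels_per_second(800, 600, fps)
    print(f"800x600 @ {fps} fps -> {total:,} pixels per second")
```

That comes out to roughly 14.4 million pixels per second at 30 fps and double that at 60 fps, and every one of those pixels carries its own colour and brightness data.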
The refresh rate is the number of times per second the monitor redraws the full frame on the screen, no matter how quickly or slowly the GPU delivers new frames. If the fps is lower than the refresh rate, the same picture is simply displayed again on the next refresh.
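Here's a minimal sketch, with assumed numbers (a 60 Hz monitor fed by a GPU managing only 20 fps), of what the monitor ends up showing on each refresh when the GPU can't keep up; each finished frame simply gets repeated for several refresh cycles:

```python
# Minimal sketch with assumed numbers: a 60 Hz monitor fed by a GPU
# that finishes only 20 frames per second.  On each refresh the monitor
# shows the newest frame the GPU has completed, so every frame ends up
# being repeated for three refresh cycles.

refresh_hz = 60   # screen refreshes per second (assumed)
gpu_fps = 20      # frames the GPU finishes per second (assumed)

for refresh in range(9):              # first 9 refresh cycles
    t = refresh / refresh_hz          # time of this refresh, in seconds
    newest_frame = int(t * gpu_fps)   # last frame the GPU has finished by now
    print(f"refresh {refresh}: showing frame {newest_frame}")
```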
d is right to some extent: for the same game at the same settings, it makes little difference whether you get 300 fps or 150 fps. The monitor cannot change what it shows within a single refresh cycle, and the human eye cannot follow rapid changes above a certain frequency. Very high fps values simply show off the capabilities of a video card and indicate what reserve power it has for higher settings or for future games with ever-increasing requirements.
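And a quick illustration of that cap, again with assumed values: however high the GPU's fps, the number of distinct images the screen can actually show per second is limited by its refresh rate.

```python
# Quick illustration (assumed values): however high the GPU's fps,
# the number of distinct images shown per second is capped by the
# monitor's refresh rate.

def displayed_fps(gpu_fps, refresh_hz):
    """Distinct frames that can actually reach the viewer's eye per second."""
    return min(gpu_fps, refresh_hz)

for gpu_fps in (150, 300):
    print(f"{gpu_fps} fps on a 60 Hz monitor -> "
          f"{displayed_fps(gpu_fps, 60)} distinct frames shown per second")
```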