I've had this niggle for years, and I've never been able to find a satisfactory explanation for it. I even posted this on the forum a few months back, but to no avail... anyway, here's trying again.
Let me clear up a few definitions first:
(a) Frame rate - the number of frames, or still pictures, per second generated by the video card to give the illusion of motion.
(b) Refresh rate - the number of times per second the monitor redraws the image on the screen according to the signal it receives.
Fine, that's cleared up. Now, most CRTs these days, even professional monitors like MAG, deliver maximum refresh rates of 80-100 Hz. At 80 Hz, that means the monitor draws the image on the screen 80 times a second. So what's the use of a graphics card that delivers a frame rate higher than your monitor's refresh rate?
As far as we, the users, are concerned, all that matters is what we can see on the screen, and that is what the monitor draws. What is the point of a graphics card that produces, say, 120 frames a second if your monitor can only draw 85 of them? Many people have responded to this by saying that refresh rate and frame rate are different things, and not to confuse them. But I think I'm right in saying that what we see is what the monitor draws on the screen, right? We can't see the signals sent out by the graphics card.
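To make the question concrete, here's a quick toy simulation I knocked together (plain Python, not any real graphics API; the 85 Hz monitor and the fps values are just assumptions). It models a card that finishes frames at a fixed rate while the monitor, at each refresh, grabs whichever frame was finished most recently:

# Toy model, not a real graphics pipeline: assumes no vsync, zero scanout
# time, and that the monitor samples the latest completed frame instantly.
def simulate(frame_rate, refresh_rate, duration=1.0):
    frame_interval = 1.0 / frame_rate      # seconds per rendered frame
    refresh_interval = 1.0 / refresh_rate  # seconds per monitor redraw

    shown = set()   # which frame numbers actually appear on screen
    ages = []       # how old the sampled frame is at each refresh

    t = refresh_interval
    while t <= duration + 1e-9:
        # Frames the card has finished by time t (frame 0 = whatever
        # was on screen before the first frame completed).
        latest = int(t / frame_interval + 1e-9)
        shown.add(latest)
        ages.append(t - latest * frame_interval)
        t += refresh_interval

    rendered = int(duration * frame_rate)
    return rendered, len(shown), sum(ages) / len(ages)

for fps in (60, 85, 120, 240):
    rendered, displayed, avg_age = simulate(fps, refresh_rate=85)
    print(f"{fps:3d} fps: {rendered} frames rendered, {displayed} displayed, "
          f"avg frame age at refresh {avg_age * 1000:.1f} ms")

If this model is anywhere near right, then at 120 or 240 fps most of the rendered frames never reach the screen at all, but the frame the monitor does grab is fresher, i.e. it reflects more recent input. Whether that freshness is worth anything is exactly what I'm asking.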
So at the end of all this, if you're still reading, then you're definitely a techno freak. I'd like your opinion on this issue.
And no, I don't usually use this kind of freaky lingo.
thnx, d