FPS Vs. Refresh Rate

Status
Not open for further replies.

d

Journeyman
I've had this niggle for years... and I've never been able to find a satisfactory explanation for it. I even posted this on the forum a few months back, but to no avail... anyway, here's trying again.

Let me clear up a few meanings first

(a) Frame rate - the number of frames, or still pictures, per second generated by the video card to give the illusion of motion.
(b) Refresh rate - the number of times per second the monitor redraws the image on the screen according to the signal it receives.

Fine, that's cleared up. Now, most CRTs these days, even professional monitors like MAG, deliver maximum refresh rates of up to 80-100 Hz. That means the monitor draws the image on the screen 80-100 times a second. So what's the use of graphics cards that deliver a frame rate higher than your monitor's refresh rate?

As far as we, the users, are concerned, all that matters is what we can see on the screen, and that is what the monitor draws. What is the point of a graphics card that produces, say, 120 frames a second if your monitor can draw only 85 of them? Many people have responded by saying that refresh rate and frame rate are different things and not to confuse them. But I think I am right in saying that what we see is what the monitor draws on the screen, right? We can't see the signals sent out by the graphics card.
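To put numbers on what I mean (just an illustrative sketch, nothing official):

```python
# Hypothetical numbers: a card rendering 120 fps driving an 85 Hz monitor.
gpu_fps = 120
monitor_hz = 85
frames_shown = min(gpu_fps, monitor_hz)       # the monitor can't draw more than this
frames_wasted = max(gpu_fps - monitor_hz, 0)  # rendered, but never reach the screen
print(frames_shown, frames_wasted)  # 85 35
```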

So if you've read this far, you're definitely a techno freak. I'd like your opinion on this issue.

and no, I don't usually use this kind of freaky lingo :lol: :wink:

thanks, d
 

kalpik

In Pursuit of "Happyness"
Hmm... interesting! I'd love to get a valid answer for this!

Anyway, what I think is that the refresh rate describes the monitor drawing the image on the screen one line of pixels at a time, whereas FPS refers to whole images at a time. Just what I think; maybe I'm wrong! :p
 

theraven

Technomancer
Suppose your monitor frequency is 60 Hz: anything in a game above 60 fps is just redundant!

Take an example of a higher refresh rate, say 80 or even 100 Hz: if your fps is higher than that, it is again redundant.

This is why fps is sometimes limited to the refresh rate of the monitor. This is known as vertical sync.
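A rough way to model what vertical sync does (an idealized sketch; real drivers are more complicated): with vsync on, a finished frame waits for the next refresh, so the effective on-screen rate snaps to the refresh rate divided by a whole number.

```python
import math

def vsync_fps(render_fps, refresh_hz):
    """Effective on-screen rate with vsync enabled (idealized model):
    a finished frame waits for the next refresh, so each frame stays
    on screen for a whole number of refresh intervals."""
    if render_fps >= refresh_hz:
        return refresh_hz  # capped at the refresh rate
    refreshes_per_frame = math.ceil(refresh_hz / render_fps)
    return refresh_hz / refreshes_per_frame

print(vsync_fps(120, 60))  # 60   -- anything above 60 is discarded
print(vsync_fps(50, 60))   # 30.0 -- each frame held for two refreshes
```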

But at such high frequencies the resulting effect is lost anyway, because of the human eye's inability to perceive much more than 50-60 Hz.

That's why TVs work at 50 Hz (PAL) and 60 Hz (NTSC).


I read this on Wikipedia once, I think :D
Try en.wikipedia.org/fps
I think that was the link.
 

vijay_7287

Cyborg Agent
^^^
Does that mean graphics cards offering more than 100 fps would be the same as cards offering 60 fps in terms of perceivable performance?
 

siriusb

Cyborg Agent
^^ That's only partially true. If one card can render a scene at 100 fps while another card can render the same scene at only 60 fps, then the 100 fps card can render much tougher scenes at 60 fps while the other card drops to 20 fps (it does not follow a linear rule, of course).
AND, the 100 fps figure for a game is only the average of the fps over the entire game.
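A quick illustration of how the average can hide the dips (made-up sample numbers):

```python
# Hypothetical per-second fps samples over a benchmark run:
samples = [140, 130, 120, 95, 40, 35, 120, 120]
average_fps = sum(samples) / len(samples)
minimum_fps = min(samples)
print(round(average_fps))  # 100 -- the headline number
print(minimum_fps)         # 35  -- the dips you actually feel in heavy scenes
```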

And the human eye can't perceive anything greater than 30 fps. But a monitor's refresh rate needs to be at least 60 Hz.


(a) Frame rate - the number of frames, or still pictures, per second generated by the video card to give the illusion of motion.
(b) Refresh rate - the number of times per second the monitor redraws the image on the screen according to the signal it receives.

Let me rephrase the definitions:
(a) Frame rate: the number of frames the game/software engine can produce. This ability is directly dependent on the GPU, and hence on the video card in general.
(b) Refresh rate: the number of times the information to be painted on the monitor is sent to it from video memory.
 
OP
D

d

Journeyman
Oh, I must beg to differ with siriusb on the refresh rate definition. The refresh rate has nothing whatsoever to do with the signal sent to the monitor; it is the monitor's electron gun repainting the picture onto the screen... which it will do even if there is no signal from the video card.
 

siriusb

Cyborg Agent
d said:
Oh, I must beg to differ with siriusb on the refresh rate definition. The refresh rate has nothing whatsoever to do with the signal sent to the monitor; it is the monitor's electron gun repainting the picture onto the screen... which it will do even if there is no signal from the video card.
I think this is where you get confused. The monitor "supports" a refresh rate at each resolution, and your video card must support the same refresh rate for that resolution. Since each monitor is different, you will find that your video card supports quite a few refresh rates at each resolution for maximum compatibility. So the video card will send information from its RAM to the monitor 60 times per second if you set the refresh rate to 60 Hz.
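To put a number on that: at 60 Hz, each of those scan-outs has a fixed time budget (simple arithmetic, for illustration only):

```python
# At a 60 Hz refresh rate, the card scans the framebuffer out once
# every refresh interval:
refresh_hz = 60
interval_ms = 1000 / refresh_hz
print(round(interval_ms, 2))  # 16.67 ms between refreshes
```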
 

LordDJ

Broken In
Oh, I must beg to differ with siriusb on the refresh rate definition. The refresh rate has nothing whatsoever to do with the signal sent to the monitor; it is the monitor's electron gun repainting the picture onto the screen... which it will do even if there is no signal from the video card.
When there is no signal applied to the monitor, the electron gun is usually switched off to conserve power, so that statement is not entirely correct.

(b) Refresh rate: the number of times the information to be painted on the monitor is sent to it from video memory.
Quite the opposite, actually. The refresh rate of a monitor is indeed the rate at which the CRT repaints itself. This is the rate to which the video card must sync, not vice versa. At a higher resolution, the monitor has to redraw at a higher frequency to avoid flicker, therefore a higher refresh rate is required.

Frame rate, however, depends on how fast the software/hardware combination renders and supplies the frames (images).
 
OP
D

d

Journeyman
OK, LordDJ... that's accepted. But on old monitors, when there was no signal, you could still see a black screen... and it would be luminous.

So there you have no signal, and the electron gun is still scanning... obviously I'm not talking about CRT power-saving modes :roll:
 
OP
D

d

Journeyman
At a higher resolution, the monitor has to redraw at a higher frequency to avoid flicker, therefore a higher refresh rate is required.


Again, at this point I have to differ. It's not the frequency that differs; the frequency remains the same. But at higher resolutions you have more pixels over the same (constant) area, so more pixels have to be scanned (and changed) in the same amount of time. So Frequency of the refresh rate is the number of times it refreshes per second, i.e. it doesn't change; only the speed with which the scanning takes place changes. So there you have a limiting factor: the electron gun can only travel so fast.

 

LordDJ

Broken In
d said:
It's not the frequency that differs; the frequency remains the same. But at higher resolutions you have more pixels over the same (constant) area, so more pixels have to be scanned (and changed) in the same amount of time. So Frequency of the refresh rate is the number of times it refreshes per second, i.e. it doesn't change; only the speed with which the scanning takes place changes. So there you have a limiting factor: the electron gun can only travel so fast.
Dude, you are confused!
Frequency of the refresh rate
?? The refresh rate is itself a frequency! And more importantly, when someone refers to the refresh rate, they usually mean the vertical refresh rate, not the horizontal one.


Whereas the vertical refresh rate is of the order of hertz, the horizontal refresh rate is of the order of KILOhertz. So the speed at which it scans (i.e. the left-to-right motion) is very, very high and is not usually something to worry about. But still, the VRR is limited by the HRR, or scan rate (happy??).
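The VRR-limited-by-HRR relation can be sketched as a simple bound (numbers below are illustrative; real monitors add blanking intervals that vary by video mode):

```python
def max_vertical_refresh(h_scan_khz, total_lines):
    """Upper bound on the vertical refresh rate (Hz): the gun must
    trace every scan line of the frame within one refresh, so
    VRR <= horizontal scan rate / lines per frame."""
    return (h_scan_khz * 1000) / total_lines

# Illustrative numbers: a 96 kHz monitor at ~1070 total lines
# (1024 visible plus blanking):
print(round(max_vertical_refresh(96, 1070)))  # 90 Hz

# VGA-style numbers: 31.5 kHz over 525 total lines:
print(max_vertical_refresh(31.5, 525))  # 60.0 Hz
```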

Oh, and about the quote thingy: you'll have to use Post a Reply rather than Quick Post.
 

con_tester

Journeyman
Hmmmm...
I learned a lot here. I only knew the content of the question, but what siriusb explained is very new and informative to me. Thanks, siriusb.
I love all HP fans.
 

siriusb

Cyborg Agent
Quite the opposite, actually. The refresh rate of a monitor is indeed the rate at which the CRT repaints itself.
It depends on your POV :)

This is the rate to which the video card must sync, not vice versa.
That's what I said.

At a higher resolution, the monitor has to redraw at a higher frequency to avoid flicker, therefore a higher refresh rate is required.
In fact, the higher the resolution, the lower the maximum refresh rate of the monitor. You are confusing vertical and horizontal refresh rates. The horizontal refresh rate may be higher at a higher resolution, but that is not what is meant when talking of refresh rate, because the vertical refresh rate is the only thing that tells you how many times the screen is drawn per second.

Oh and about the quote thinngy you'll have to use Post a reply rather than Quick Post.
That is not true. It must be as Qwertymaniac says.

And finally, dude D, what you need is a good source to read rather than me telling you:
*geek.com/htbc/build/vidcarbu.htm
*sophia.dtp.fmph.uniba.sk/pchardware/video.html
 
OP
D

d

Journeyman
Thanks... while I have learnt a lot of new things, the basic question hasn't been answered.


Whereas the vertical refresh rate is of the order of hertz, the horizontal refresh rate is of the order of KILOhertz. So the speed at which it scans (i.e. the left-to-right motion) is very, very high and is not usually something to worry about. But still, the VRR is limited by the HRR, or scan rate (happy??).

But when we change the vertical refresh rate from, say, the 85 Hz we are used to (or at least I am) down to 60 Hz, we can see a noticeable difference, i.e. the screen flickers. So what we see cannot be entirely independent of the vertical refresh rate, right?

From Wikipedia:

The refresh rate (or "vertical refresh rate", "vertical scan rate") is the maximum number of frames that can be displayed on a monitor (or television) in a second, expressed in hertz.

*en.wikipedia.org/wiki/refresh_rate

Here they're equating frame rate and refresh rate, i.e. what can be understood from this statement is that the refresh rate is the maximum number of frames that can be displayed per second on the monitor.

And from this quote:
24-30Hz is a sufficient video frame rate for smooth motion, and transmitting any more would be a waste of radio bandwidth

Of course, there they're talking about TV transmission... but I think computer-generated 3D video rarely appears better than TV video, does it? Whatever the frame rate might be.

And by the way,
So Frequency of the refresh rate
was a boo-boo :oops: :lol:


Now that I have learnt to use quotes, I have used them to the maximum extent... God only knows how the result will turn out :wink:
 

mediator

Technomancer
Games like Underground 2 and Doom 3 use higher fps and huge data files. That also explains why they need higher-fps, higher-MB graphics cards. Also, if you play these games with shared video memory you'll notice pauses in the game, because standard shared video memory isn't able to support such high fps!
Well, that may be incorrect; it's only a guess deduced from observation!
 

siriusb

Cyborg Agent
but I think computer-generated 3D video rarely appears better than TV video, does it? Whatever the frame rate might be.
You are right. It rarely, if ever, appears better on monitors than it does on TV. But you can't compare a TV and a monitor, can you? TV is blurry as hell AND you have to watch it from afar AND it can display only up to 525/625(?) lines. Whereas on a monitor you have high-quality display technology, you sit closer to the screen, and the resolution displayed is higher.
IOW, the TV screen blurs out details that would otherwise be displayed on a computer monitor.

Mediator:
Games like Underground 2 and Doom 3 use higher fps and huge data files. That also explains why they need higher-fps, higher-MB graphics cards.
Use higher fps? No. FPS is not a resource to be used. What you say contains some truth, though: games that require fast decision making (shooters and racing games) need higher fps to be playable. But that is another story.
Having huge data (pak) files means they may need more RAM, perhaps, but the game engine (or maybe the video card) loads only the bitmaps (texture, bump, normal, shadow, etc.) that it needs for the current calculations into the video card's memory. The huge files are mostly zip archives, used to limit HDD space consumption. The game extracts the maps it needs before a level is loaded; this is one of the reasons for the level-loading delays in most games.
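The pak-archive idea can be sketched with Python's standard zipfile module (the file names below are made up for illustration; real engines use their own formats and loaders):

```python
import io
import zipfile

# Build a toy "pak" archive in memory (entry names are hypothetical):
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as pak:
    pak.writestr("level1/wall.tga", b"wall texture bytes")
    pak.writestr("level2/floor.tga", b"floor texture bytes")

# At level-load time, extract only the entries the current level needs:
with zipfile.ZipFile(buf) as pak:
    wanted = [n for n in pak.namelist() if n.startswith("level1/")]
    loaded = {name: pak.read(name) for name in wanted}

print(sorted(loaded))  # ['level1/wall.tga']
```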
Also, if you play these games with shared video memory you'll notice pauses in the game, because standard shared video memory isn't able to support such high fps!
You will notice pauses, maybe, because of things like shared memory's slower speeds, the slower speed of the interconnection, or it not being SGRAM.
 
OP
D

d

Journeyman
:lol: I refuse to give up. We're losing focus, and that's what made me quit last time... but the stakes are too high :wink: So at the end of all this, with a thousand and one diversions, it still holds that a frame rate above the monitor's refresh rate is superfluous...


By the way, thanks to siriusb and everyone else... I've learnt a lot on this.
 