I have a question and would appreciate if somebody with expert knowledge responds.
A 120Hz Nvision capable 3D display offers the advantage of reduced motion blur in 2D mode. This is pretty clear.
Is this only the case when my PC renders 120fps in a game? For example: if I play a demanding game where my PC is only capable
of rendering 40fps on average, does the display still refresh the frames at 120Hz (by repeating identical frames, e.g.), or is the reduced motion blur advantage gone in this case?
[quote name='cl55amg' date='04 April 2012 - 06:15 AM' timestamp='1333541732' post='1391863']
Hi guys,
I have a question and would appreciate if somebody with expert knowledge responds.
A 120Hz Nvision capable 3D display offers the advantage of reduced motion blur in 2D mode. This is pretty clear.
Is this only the case when my PC renders 120fps in a game? For example: if I play a demanding game where my PC is only capable
of rendering 40fps on average, does the display still refresh the frames at 120Hz (by repeating identical frames, e.g.), or is the reduced motion blur advantage gone in this case?
thanks
[/quote]
No, the advantage is not lost. The refresh rate of your monitor is different from the frame rate your video card can render; they by no means have to be equal. In fact, it is rare to see 120fps in most games with high detail. I could be mistaken, but I think this is actually impossible when playing in 3D.
|CPU: i7-2700k @ 4.5Ghz
|Cooler: Zalman 9900 Max
|MB: MSI Military Class II Z68 GD-80
|RAM: Corsair Vengeance 16GB DDR3
|SSDs: Seagate 600 240GB; Crucial M4 128GB
|HDDs: Seagate Barracuda 1TB; Seagate Barracuda 500GB
|PS: OCZ ZX Series 1250watt
|Case: Antec 1200 V3
|Monitors: Asus 3D VG278HE; Asus 3D VG236H; Samsung 3D 51" Plasma;
|GPU:MSI 1080GTX "Duke"
|OS: Windows 10 Pro X64
[quote name='SnickerSnack' date='04 April 2012 - 05:29 AM' timestamp='1333542550' post='1391866']
No, the advantage is not lost. The refresh rate of your monitor is different than the frames per second rate that your video card can render. They by no means have to be equal. In fact, it is rare to see 120fps in most games with high detail. I could be mistaken, but I think this is actually impossible when playing in 3D.
[/quote]
I have read some differing opinions about this issue, which makes it pretty confusing for me :(
Games don't deliver a constant frame rate like film material does. How does this affect 120Hz displays in 2D and 3D mode?
IIRC your monitor and graphics card are two separate entities: your Nvision-capable monitor refreshes at 120Hz, and your gfx card produces the maximum FPS it can handle, depending on how demanding the game is.
The Nvision monitor always refreshes at 120Hz (from bottom to top) - it looks at the signal sent from the gfx card to see what it should display at each clock cycle.
If the game is quite demanding, say the gfx card can manage around 50 fps, each screen refresh cycle fits easily within each frame of the game, so all is well,
but
if the FPS is quicker than the refresh rate of the monitor, say 130 FPS, the signal the gfx card sends might have changed by the time the monitor has completed its refresh cycle - this causes screen tearing (fixed using the v-sync option). [i]The screen tears because the monitor refreshes bottom to top: the bottom of the screen shows game frame #1; as the monitor's refresh cycle progresses up the screen, the game's frame changes to #2, at which point the monitor shows the new image for the rest (upper end) of the screen.[/i]
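To put rough numbers on the tearing mechanism, here is a sketch with hypothetical figures (the 130 fps and 1080-line values are just illustrative, and it assumes a uniform scan-out rate):

```python
# Where a tear line lands when the GPU swaps buffers mid-refresh.
# Hypothetical numbers: a 120 Hz monitor scanning a 1080-line image
# at a uniform rate, with the GPU delivering a new frame every 1/130 s.

REFRESH_HZ = 120
FPS = 130
LINES = 1080

scan_time = 1.0 / REFRESH_HZ   # time for one full refresh pass
frame_time = 1.0 / FPS         # time between new GPU frames

# How far through its scan is the monitor when the first new frame arrives?
tear_line = int((frame_time / scan_time) * LINES) % LINES
print(f"tear appears around line {tear_line} of {LINES}")
```

Because 130 fps is only slightly faster than 120 Hz here, the tear sits near the end of the scan and drifts a little each refresh; the bigger the mismatch between frame rate and refresh rate, the more tears show up per refresh.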
When you game in 3D the monitor is still refreshing at 120 Hz but the scenes alternate for each eye. So you see 60Hz each eye. 60Hz left + 60Hz right = 120Hz
That is what I remember so might not be totally accurate.
Lord, grant me the serenity to accept the things I cannot change, the courage to change the things I can, and the wisdom to know the difference.
-------------------
Vitals: Windows 7 64bit, i5 2500 @ 4.4ghz, SLI GTX670, 8GB, Viewsonic VX2268WM
[quote name='andysonofbob' date='04 April 2012 - 06:26 AM' timestamp='1333545999' post='1391880']
IIRC your monitor and graphics card are two separate entities: your Nvision-capable monitor refreshes at 120Hz, and your gfx card produces the maximum FPS it can handle, depending on how demanding the game is.
The Nvision monitor always refreshes at 120Hz (from bottom to top) - it looks at the signal sent from the gfx card to see what it should display at each clock cycle.
If the game is quite demanding, say the gfx card can manage around 50 fps, each screen refresh cycle fits easily within each frame of the game, so all is well
[/quote]
So you're saying that the screen refresh rate is always fixed (60Hz or 120Hz), independently of the input it gets from the gfx card?
This means that the reduced motion blur of a 120Hz display (in 2D mode) is always active, even if the gfx card can only output 30fps?
I think motion blur might have something to do with the monitor's response time. Check your monitor's spec: if it is less than 8ms you should be fine in 2D; any more than that can cause blurring. I guess we will need 4ms for 3D.
Can anyone confirm? I am going all empirical here...
[quote name='andysonofbob' date='04 April 2012 - 07:46 AM' timestamp='1333550795' post='1391909']
I think motion blur might have something to do with the monitor's response time. Check your monitor's spec: if it is less than 8ms you should be fine in 2D; any more than that can cause blurring. I guess we will need 4ms for 3D.
Can anyone confirm? I am going all empirical here...
[/quote]
Motion blur is reduced by a high refresh rate (it's a weakness of LCD technology). Response time is a different topic.
You can check out Nvidia's statements about the 120Hz motion blur advantage in 2D mode, which do not answer my particular question.
I think you are using an incorrect term. Motion blur is something that happens in film because of slow shutter speed of the camera. If you look at an individual frame of a film during an action scene, the image is blurry because the camera was capturing a moving object. When you view the final film on TV or movie screen, your brain will compensate and you will see a fluid movement. That's why a 24FPS movie looks much more fluid than a 24FPS video game would. Video games render 100% sharp images, not ones with blurring.
In some advanced video games there is a feature called 'motion blur', though, which tries to mimic this effect.
I bet what you mean, though, is the ghosting that you see on monitors, and it has nothing to do with the refresh rate but rather the response time of the monitor. It has to do with how quickly the individual LCD pixels can change colors.
120Hz monitors do have a very fast response time, so the ghosting is unnoticeable, but there are plenty of 60Hz monitors with an equally fast response time, and you do not see any ghosting on them either.
By the way, just because you have a 120Hz monitor does not necessarily mean that the game is using 120Hz. For example, Dirt 2 seems to have a bug where, no matter what you do, the game will only run at 60Hz (not 60FPS, but actually 60Hz) when you play in full screen. The game even has an in-game option where you can choose the refresh rate and choose 120Hz, but it always uses 60Hz.
Hmmm...there is a lot of misinformation in this thread. It is, however, fairly complicated, so let's see if I can shed some light.
There are a few different things going on here:
First let's clarify that 120Hz has nothing to do with [b]input lag[/b]. Input lag is how long it takes for the monitor to show the image after it has received the signal. Display devices with high input lag are obviously bad for gaming...but this is dependent on how much processing the display is doing, not the refresh rate of the panel. For example, a television may do a bunch of processing on the signal to try and improve the picture, introducing a lot of input lag. This same TV may have a "Game" mode that forgoes the processing to make sure the input lag is minimized.
Another term that's been mentioned is the response time. Despite what it sounds like, this is not the same as input lag. It is more aptly named [b]pixel response time[/b]. This is the time it takes for a pixel to change colors. A slow response time makes the image "smear," as Gilador described in the above post. The response time is semi-independent of refresh rate. I say semi because it can be as low/fast as they want to/are able to make it, but it has an upper cap based on the refresh rate. The pixels cannot change slower than the refresh rate. They can, however, change faster than the refresh rate. The better the pixel response time, the better the display is for moving images (of any type: games, videos, tv, etc).
Then there's the actual [b]refresh rate[/b]. This is the rate at which the monitor changes all of the pixels on the display to the next image. On your 120Hz display, every pixel will be commanded to change 120 times per second. It may, of course, be commanded to "change" to the same color if the image hasn't changed for that pixel.
The final piece is the [b]frame rate[/b]. This is the rate at which your content device is sending your display device new information. As the OP points out, this is a constant from a device like a Blu-ray player. At 1080p over DVI, that information will always be coming at 24fps. In this case the 120Hz display would show each image for 5 refresh cycles. This actually does lead to a slightly different perception than a 60Hz TV, which has to alternate between showing each image for 2 or 3 cycles (known as 3:2 pulldown). It is important to note that most 120Hz and above TVs only accept a signal at 60Hz max (limited by the bandwidth of HDMI), and therefore must display the same image for multiple refresh cycles, while 120Hz monitors (that can accept a full 1080p @ 120Hz signal over Dual DVI) can display a different picture every refresh cycle.
However, the real question is, what happens when we have a variable rate input signal, like a PC with varying frame rates? Without vertical synchronization (vsync) enabled, the PC will push out a new frame as soon as it is rendered. Since the display refreshes pixel by pixel (horizontal first, then by line...the same way you would read a page in a book), this means you could receive a new picture in the middle of a refresh cycle, leading to the tearing phenomenon that has already been mentioned. Vsync essentially forces the GPU to wait until the beginning of the next refresh cycle to push out the picture. When there is an excess of rendering power, it also forces the GPU to wait until the next refresh cycle to render the next frame.
So what happens if you are vsync'ed to prevent tearing but only have enough "horsepower" to run, say, 30 fps on your 120Hz monitor? In this case, the same frame will be displayed for 4 consecutive refresh cycles. This will look nearly identical to the same frame being displayed for 2 consecutive cycles on a 60Hz monitor. So the answer to the OP's question is yes and no. Yes, the monitor does still refresh at 120Hz by repeating frames. However, no, it will not look as smooth as actually showing 120 different frames. If the frame rate is less than the refresh rate of all monitors being compared, they will essentially look the same. The only caveat is that the pixel response time must necessarily be low on the 120Hz monitor, but may or may not be as low on the 60Hz monitor depending on its quality.
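The repeat counts discussed here (5 cycles per frame for 24fps film on 120Hz, the 3:2 alternation on 60Hz, 4 cycles for a vsync'ed 30fps game on 120Hz) fall out of simple arithmetic. A small sketch; `hold_pattern` is my own made-up helper, not any real API:

```python
# How many refresh cycles each source frame stays on screen when a
# fixed frame rate is shown on a fixed refresh rate (vsync assumed).

def hold_pattern(refresh_hz, fps, frames=4):
    """Cycles each of the first `frames` source frames is displayed for."""
    pattern, shown = [], 0
    for f in range(1, frames + 1):
        # refresh cycles that should have elapsed after source frame f
        # (int(x + 0.5) rounds halves up, unlike Python's round())
        total = int(f * refresh_hz / fps + 0.5)
        pattern.append(total - shown)
        shown = total
    return pattern

print(hold_pattern(120, 24))  # [5, 5, 5, 5]  even cadence: 24p film on 120Hz
print(hold_pattern(60, 24))   # [3, 2, 3, 2]  3:2 pulldown on a 60Hz display
print(hold_pattern(120, 30))  # [4, 4, 4, 4]  vsync'ed 30fps game on 120Hz
```

The even cadences on 120Hz are why 24fps film and low, steady game frame rates look more uniform there than on 60Hz, where the alternating 3:2 hold times produce judder.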
Finally, 3D follows all the same rules, except one eye is blacked out at a time. This effectively halves the refresh rate to that eye. So what looked like 120Hz 2D to both eyes will look like 60Hz in 3D to each eye. In other words, the same 120Hz monitor will look smoother to you in 2D than when running in 3D. There are some important points on how the refresh rates on the glasses interact with that of the display device that can lead to or prevent cross-talk, but that's a whole other discussion!
Hope that helps!
Intel i7-4770k
EVGA GTX 780 Ti SC
ASRock Z87 Extreme4
8GB DDR3, 240GB Intel SSD, 3TB HDD
Cooler Master Seidon 120M Liquid Cooling
Dell 3007WFP-HC 30" 2560x1600
Alienware OptX AW2310 23" 1920x1080 with 3D Vision
Acer H5360 720p Projector with 3D Vision
ONKYO HT-S5300 7.1 Sound System
Logitech G19 Keyboard, G9 Mouse, G25 Wheel
Saitek X52 Pro and Rudder Pedals
Just to clear that up, the motion blur I'm talking about is the LCD-technology-specific problem where
the picture gets blurry in rapid-motion content. This is because an LCD is a [u]hold type display[/u].
I'm not talking about ghosting. The response time can also have an influence on motion blur, or a different type of it.
For LCD TVs they implemented the 100Hz/200Hz technology to reduce motion blur, which uses interpolated frames and also tries to reduce film judder etc.
Interpolated frames have some disadvantages, and this is a completely different approach compared to 120Hz PC monitors.
Here are some of the basics explained about motion blur: http://scien.stanford.edu/pages/labsite/2010/psych221/projects/2010/LievenVerslegers/LCD_Motion_B
@ FormulaRedline
Now back to 120Hz for [b]PC monitors[/b]:
As you said, the refresh cycle is always 120Hz. So when my PC outputs 50 to 60 frames per second on average in a certain game,
the monitor will still refresh at 120Hz by putting more of the same frames into its refresh cycle.
If this is the case, the motion blur should still be reduced, right? Because you refresh the picture at a higher rate per second, the hold-type LCD does not hold the picture for too long, which is a problem for the human eye. CRTs and plasmas don't have the motion blur problem,
because of the much shorter pulses per second. (pulse type display vs. hold type display)
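A back-of-envelope way to see the hold-type effect (my own simplification, assuming the eye tracks the motion perfectly and pixel transitions are instant): the held frame smears across the retina for as long as it stays up, so blur width is roughly motion speed times hold time.

```python
# Sketch: perceived smear on a sample-and-hold display while the eye
# tracks a moving object. hold_blur_px is a made-up helper, not an API.

def hold_blur_px(speed_px_per_s, hold_time_s):
    # distance the tracked object moves while one unique frame is held
    return speed_px_per_s * hold_time_s

SPEED = 960  # hypothetical pan speed in pixels per second

print(hold_blur_px(SPEED, 1 / 60))   # 16.0 px: new frame every 60Hz refresh
print(hold_blur_px(SPEED, 1 / 120))  # 8.0 px: new frame every 120Hz refresh
print(hold_blur_px(SPEED, 1 / 40))   # 24.0 px: 40fps content, frames repeated
```

Note that in this simplification the hold time is set by how long each unique frame stays up, which is why pulse-type displays (CRT, plasma), with their very short light pulses, sidestep the effect entirely.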
I do believe I get your question and did from the start. What I tried to explain above is that the effect you are describing is wholly dependent on the [b]pixel response time[/b]. As I said, a 120Hz monitor must necessarily have a good pixel response time, but in theory a 60Hz monitor could also have a good pixel response time.
Let's put some numbers to it. To refresh at 60Hz, the pixel must be able to completely change in 1/60th of a second, or about 16.7ms. For 120Hz, this number is 1/120th of a second, about 8.3ms. If a monitor had these respective response times, it would spend the entire refresh cycle in transition. This would actually look quite blurry, and in practice PC monitors are usually faster. This means the pixel gets the change done at the beginning of the cycle and then holds it for the rest of the cycle.
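Those figures fall straight out of the refresh period (simple arithmetic, no assumptions beyond the rates themselves):

```python
# Maximum time a pixel transition can take before it spills into the
# next refresh cycle: one full refresh period.

for hz in (60, 120):
    period_ms = 1000.0 / hz
    print(f"{hz} Hz -> {period_ms:.1f} ms per refresh cycle")
```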
If you had a 120Hz monitor with a good pixel response time, a 60Hz monitor with an identical pixel response time, and were running them both at 60Hz, you should not see a difference. If you were running them both at 40fps, you might perceive a small difference because of the pulldown, but not the blur you are talking about.
[quote name='FormulaRedline' date='05 April 2012 - 05:59 AM' timestamp='1333630796' post='1392254']
I do believe I get your question and did from the start. What I tried to explain above is that the effect you are describing is wholly dependent on the [b]pixel response time[/b]. As I said, a 120Hz monitor must necessarily have a good pixel response time, but in theory a 60Hz monitor could also have a good pixel response time.
Let's put some numbers to it. To refresh at 60Hz, the pixel must be able to completely change in 1/60th of a second, or about 16.7ms. For 120Hz, this number is 1/120th of a second, about 8.3ms. If a monitor had these respective response times, it would spend the entire refresh cycle in transition. This would actually look quite blurry, and in practice PC monitors are usually faster. This means the pixel gets the change done at the beginning of the cycle and then holds it for the rest of the cycle.
If you had a 120Hz monitor with a good pixel response time, a 60Hz monitor with an identical pixel response time, and were running them both at 60Hz, you should not see a difference. If you were running them both at 40fps, you might perceive a small difference because of the pulldown, but not the blur you are talking about.
[/quote]
What you say definitely makes sense!
Is it possible that a 120Hz monitor uses a faster pixel response time in 120Hz mode, and a slower one in 60Hz mode?
Would this be an explanation for why the motion blur seems lower in 120Hz mode? (on the same display, compared to 60Hz)
[quote name='FormulaRedline' date='05 April 2012 - 05:59 AM' timestamp='1333630796' post='1392254']
I do believe I get your question and did from the start. What I tried to explain above is that the effect you are describing is wholly dependent on the pixel response time. As I said, a 120Hz monitor must necessarily have a good pixel response time, but in theory a 60Hz monitor could also have a good pixel response time.
Let's put some numbers to it. To refresh at 60HZ, the pixel must be able to completely change in 1/60th of a second, or about 16.7ms. For a 120Hz, this number is 1/120th of a second, about 8ms. If a monitor had these respective response times, it would spend the entire refresh cycle in transition. This would actually look quite blurry and in practice PC monitors are usually faster. This means the pixel gets the change done at the beginning of the cycle and then holds it for the rest of the cycle.
If you had a 120Hz monitor with a good pixel response time, a 60Hz monitor with an identical pixel response time, and were running them both at 60Hz, you should not see a difference. If you were running them both at 40fps, you might perceive a small difference because of the pulldown, but not the blur you are talking about.
What you say definitely makes sense!
Is it possible that a 120Hz monitor speeds up the pixel response time in 120Hz mode, and slows it down in 60Hz mode?
Would this explain why the motion blur seems lower in 120Hz mode? (on the same display, compared to 60Hz)
I don't understand what you mean by 120Hz versus 60Hz mode. A 120Hz monitor will always refresh at 120Hz. You may send a signal to it at a lower rate, but it will still tell the pixels to update 120 times a second.
Are you referring to 60Hz vs. 120Hz on the desktop resolution settings? Or what?
Intel i7-4770k
EVGA GTX 780 Ti SC
ASRock Z87 Extreme4
8GB DDR3, 240GB Intel SSD, 3TB HDD
Cooler Master Siedon 120M Liquid Cooling
Dell 3007WFP-HC 30" 2560x1600
Alienware OptX AW2310 23" 1920x1080 with 3D Vision
Acer H5360 720p Projector with 3D Vision
ONKYO HT-S5300 7.1 Sound System
Logitech G19 Keyboard, G9 Mouse, G25 Wheel
Saitek X52 Pro and Rudder Pedals
[quote name='FormulaRedline' date='05 April 2012 - 01:01 PM' timestamp='1333656118' post='1392398']
Are you referring to 60Hz vs. 120Hz on the desktop resolution settings? Or what?
[/quote]
That's an easy one, then! This setting only refers to the Windows desktop. Windows will either render the desktop at 60fps or 120fps. When you choose 120, you see twice as many frames. So the mouse cursor and any animations (windows moving around or minimizing/maximizing) seem twice as smooth because they are.
If you start a game, it will override this setting and display at whatever frame rate you allow it.
|CPU: i7-2700k @ 4.5Ghz
|Cooler: Zalman 9900 Max
|MB: MSI Military Class II Z68 GD-80
|RAM: Corsair Vengence 16GB DDR3
|SSDs: Seagate 600 240GB; Crucial M4 128GB
|HDDs: Seagate Barracuda 1TB; Seagate Barracuda 500GB
|PS: OCZ ZX Series 1250watt
|Case: Antec 1200 V3
|Monitors: Asus 3D VG278HE; Asus 3D VG236H; Samsung 3D 51" Plasma;
|GPU:MSI 1080GTX "Duke"
|OS: Windows 10 Pro X64
I have read some different opinions about this issue, which makes it pretty confusing for me :(
Games don't deliver a constant frame rate like film material does. How does this affect 120Hz displays in 2D and 3D mode?
The nVision monitor always refreshes at 120Hz (from top to bottom) - it looks at the signal sent from the gfx card to see what it should display at each clock cycle.
If the game is quite demanding, say the gfx card can only manage around 50fps, each game frame easily spans one or more full refresh cycles, so all is well
but
if the FPS is quicker than the refresh rate of the monitor, say 130fps, the signal the gfx card sends might have changed before the monitor has completed its refresh cycle - this causes screen tearing (fixed using the v-sync option). [i]The screen tears because the monitor refreshes top to bottom: the top of the screen shows game frame #1, and as the refresh cycle progresses down the monitor the game's frame changes to #2, so the rest (the lower part) of the screen shows the new image.[/i]
When you game in 3D the monitor still refreshes at 120Hz, but the scenes alternate for each eye, so each eye sees 60Hz: 60Hz left + 60Hz right = 120Hz.
That is what I remember, so it might not be totally accurate.
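The fixed-refresh behaviour described above can be modelled with a toy calculation. The steady 50fps rate and the "sample at the start of each refresh" rule are simplifying assumptions, not how a real driver works:

```python
# Toy model of a fixed 120Hz refresh fed by a steady 50fps game.
# Assumptions: vsync-like sampling at the start of each refresh and
# perfectly even frame times -- real drivers and games are messier.
REFRESH_HZ = 120
GAME_FPS = 50

def frame_shown(refresh_index):
    """Index of the newest game frame completed when this refresh starts."""
    t = refresh_index / REFRESH_HZ   # time this refresh begins, in seconds
    return int(t * GAME_FPS)         # game frames finished by then

shown = [frame_shown(i) for i in range(12)]
print(shown)  # each game frame is repeated for 2-3 consecutive refreshes
```

The monitor never stops refreshing at 120Hz; it just repeats a frame until the card has a new one, which is the point being made above.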
Lord, grant me the serenity to accept the things I cannot change, the courage to change the things I can, and the wisdom to know the difference.
-------------------
Vitals: Windows 7 64bit, i5 2500 @ 4.4ghz, SLI GTX670, 8GB, Viewsonic VX2268WM
Handy Driver Discussion
Helix Mod - community fixes
Bo3b's Shaderhacker School - How to fix 3D in games
3dsolutionsgaming.com - videos, reviews and 3D fixes
IIRC your monitor and graphics card are two separate entities: your nVision capable monitor refreshes at 120Hz, and your gfx card produces the maximum FPS it can handle, depending on how demanding the game is.
The nVision monitor always refreshes at 120Hz (from top to bottom) - it looks at the signal sent from the gfx card to see what it should display at each clock cycle.
If the game is quite demanding, say the gfx card can only manage around 50fps, each game frame easily spans one or more full refresh cycles, so all is well
[/quote]
So you are saying that the screen refresh rate is always fixed (60Hz or 120Hz), independently of the input it gets from the gfx card?
This means that the reduced motion blur of a 120Hz display (in 2D mode) is always active, even if the gfx card can only output 30fps?
I think motion blur might have something to do with the monitor's response time. Check your monitor's: if it is less than 8ms you should be fine in 2D; any more than that can cause blurring. I guess we will need 4ms for 3D.
Can anyone confirm? I am going all empirical here...
[/quote]
Motion blur is reduced by a high refresh rate (it's a weakness of LCD technology). Response time is a different topic.
You can check out Nvidia's statements about the 120Hz motion blur advantage in 2D mode, but they do not answer my particular question.
In some advanced video games there is a feature called 'motion blur', though, which tries to mimic this effect.
I bet what you mean, though, is the ghosting that you see on monitors and it has nothing to do with the refresh rate but rather the response time of the monitor. It has to do with how quickly the individual LCD pixels can change colors.
120Hz monitors do have a very fast response time, so the ghosting is unnoticeable, but there are plenty of 60Hz monitors with an equally fast response time, and you do not see any ghosting on them either.
By the way, just because you have a 120Hz monitor does not necessarily mean that the game is using 120Hz. For example, Dirt 2 seems to have a bug where, no matter what you do, the game will only run at 60Hz (not 60fps, but actually 60Hz) when you play in full screen. The game even has an in-game option where you can choose the refresh rate and select 120Hz, but it always uses 60Hz.
There are a few different things going on here:
First let's clarify that 120Hz has nothing to do with [b]input lag[/b]. Input lag is how long it takes for the monitor to show the image after it has received the signal. Display devices with high input lag are obviously bad for gaming...but this is dependent on how much processing the display is doing, not the refresh rate of the panel. For example, a television may do a bunch of processing on the signal to try and improve the picture, introducing a lot of input lag. This same TV may have a "Game" mode that forgoes the processing to make sure the input lag is minimized.
Another term that's been mentioned is the response time. Despite what it sounds like, this is not the same as input lag. It is more aptly named [b]pixel response time[/b]. This is the time it takes for a pixel to change colors. A slow response time makes the image "smear," as Gilador described in the above post. The response time is semi-independent of refresh rate. I say semi because it can be as low/fast as they want to/are able to make it, but it has an upper cap based on the refresh rate. The pixels cannot change slower than the refresh rate. They can, however, change faster than the refresh rate. The better the pixel response time, the better the display is for moving images (of any type: games, videos, tv, etc).
Then there's the actual [b]refresh rate[/b]. This is the rate at which the monitor changes all of the pixels on the display to the next image. On your 120Hz display, every pixel will be commanded to change 120 times per second. It may, of course, be commanded to "change" to the same color if the image hasn't changed for that pixel.
The final piece is the [b]frame rate[/b]. This is the rate at which your content device is sending your display device new information. As the OP points out, this is a constant from a device like a Blu-ray player. At 1080p over DVI, that information will always be coming at 24fps. In this case the 120Hz display would show each image for 5 refresh cycles. This actually does lead to a slightly different perception than a 60Hz TV, which has to alternate between showing each image for 2 or 3 cycles (known as 3:2 pulldown). It is important to note that most 120Hz and above TVs only accept a signal at 60Hz max (limited by the bandwidth of HDMI), and therefore must display the same image for multiple refresh cycles, while 120Hz monitors (that can accept a full 1080p @ 120Hz signal over Dual DVI) can display a different picture every refresh cycle.
However, the real question is, what happens when we have a variable rate input signal, like a PC with varying frame rates? Without vertical synchronization (vsync) enabled, the PC will push out a new frame as soon as it is rendered. Since the display refreshes pixel by pixel (horizontal first, then by line...the same way you would read a page in a book), this means you could receive a new picture in the middle of a refresh cycle, leading to the tearing phenomenon that has already been mentioned. Vsync essentially forces the GPU to wait until the beginning of the next refresh cycle to push out the picture. When there is an excess of rendering power, it also forces the GPU to wait until the next refresh cycle to render the next frame.
So what happens if you are vsync'ed to prevent tearing but only have enough "horsepower" to run, say, 30fps on your 120Hz monitor? In this case, the same frame will be displayed for 4 consecutive refresh cycles. This will look nearly identical to the same frame being displayed for 2 consecutive cycles on a 60Hz monitor. So the answer to the OP's question is yes and no. Yes, the monitor does still refresh at 120Hz by repeating frames. However, no, it will not look as smooth as actually showing 120 different frames. If the frame rate is less than the refresh rate of all monitors being compared, they will essentially look the same. The only caveat is that the pixel response time must necessarily be low on the 120Hz monitor, but may or may not be as low on the 60Hz monitor depending on its quality.
Finally, 3D follows all the same rules, except one eye is blacked out at a time. This effectively halves the refresh rate to that eye. So what looked like 120Hz 2D to both eyes will look like 60Hz in 3D to each eye. In other words, the same 120Hz monitor will look smoother to you in 2D than when running in 3D. There are some important points on how the refresh rates on the glasses interact with that of the display device that can lead to or prevent cross-talk, but that's a whole other discussion!
Hope that helps!
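The cadence arithmetic in the post above (5 cycles per film frame at 24fps on 120Hz, the 3:2 alternation on 60Hz, 4 cycles per frame at 30fps) can be sketched as a small calculation. This assumes vsync and a perfectly steady source rate:

```python
# How many consecutive refresh cycles each source frame occupies,
# assuming vsync and a perfectly steady source frame rate.

def cadence(refresh_hz, fps, frames=8):
    """Refreshes spent on each of the first `frames` source frames."""
    counts = []
    for f in range(frames):
        first = -(-f * refresh_hz // fps)        # ceil(f * hz / fps)
        last = -(-(f + 1) * refresh_hz // fps)   # ceil((f+1) * hz / fps)
        counts.append(last - first)
    return counts

print(cadence(120, 24))  # 24fps film on 120Hz: every frame held 5 cycles
print(cadence(60, 24))   # 24fps on 60Hz: the 3:2 pulldown alternation
print(cadence(120, 30))  # 30fps game on 120Hz: 4 cycles per frame
```

The even 5:5 cadence at 120Hz versus the uneven 3:2 cadence at 60Hz is the "slightly different perception" mentioned for film content.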
The picture gets blurry in rapid-motion content. This is because an LCD is a [u]hold type display[/u].
I'm not talking about ghosting. The response time can also have an influence on motion blur, or on a different type of it.
For LCD TVs they implemented the 100Hz/200Hz technology to reduce motion blur, which uses interpolated frames and also tries to reduce film judder etc.
Interpolated frames have some disadvantages, and this is a completely different approach compared to 120Hz PC monitors.
Here are some of the basics about motion blur explained: http://scien.stanford.edu/pages/labsite/2010/psych221/projects/2010/LievenVerslegers/LCD_Motion_B
@ FormulaRedline
Now back to 120Hz for [b]PC monitors[/b]:
As you said, the refresh cycle is always 120Hz. So when my PC outputs 50 to 60 frames per second on average in a certain game,
the monitor will still refresh at 120Hz by putting more of the same frames into its refresh cycle.
If this is the case, the motion blur should still be reduced, right? Because the picture is refreshed at a higher rate per second, the hold type LCD does not hold the picture for too long, which is the problem for the human eye. CRTs and plasmas don't have the motion blur problem
because of their much shorter pulses per second. (pulse type display vs. hold type display)
Do you get my question?
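The hold-type argument can be put into rough numbers with a simplified eye-tracking model: while the eye tracks a moving object, the frame stays frozen for one hold period, so the image smears across the retina in proportion to that hold time. The speed value below is a made-up illustrative number, and the formula is only a first-order approximation, not from the linked article:

```python
# Rough sketch of sample-and-hold motion blur: perceived smear is
# approximately tracking speed x hold time (a simplified model).

def blur_width_px(speed_px_per_s, hold_time_s):
    """Approximate perceived smear while the eye tracks a moving object."""
    return speed_px_per_s * hold_time_s

SPEED = 1200  # px/s, e.g. a fast pan in a game (illustrative value)

print(blur_width_px(SPEED, 1 / 60))   # frame held 1/60s: ~20px of smear
print(blur_width_px(SPEED, 1 / 120))  # unique frames at 120Hz: ~10px
print(blur_width_px(SPEED, 0.002))    # a ~2ms pulse (CRT/plasma): ~2.4px
```

Note that under this model, repeating the same frame twice at 120Hz leaves the effective hold time at 1/60s; only unique frames (or a pulse-type display) shorten the smear, which seems consistent with FormulaRedline's answer above.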
Are you referring to 60Hz vs. 120Hz on the desktop resolution settings? Or what?
[/quote]
Yes exactly.