For most people, it's probably cheaper to get a graphics card upgrade/SLI than it would be to get this monitor. Then you'd have enough power to maintain a constant 60fps anyway.
Except that what people are forgetting is that it is [i]minimum[/i] frame rate, not average, that matters. Sure, you average 60 fps, then when things get dicey you drop to 45 and get stutter at exactly the time you least want it.
To do without the value that GSync should provide, you need a [i]minimum[/i] frame rate of 60 fps, and that's not all that easy even with SLI.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
This is a revolution and, as Volnaiskra said, it would be stupid to buy an expensive GPU and not a G-Sync monitor to go with it.
I'd say it's totally crazy, so that's why it should be the G-Sync monitor first, and I'm sure anyone will agree once they understand what this really does; otherwise they're just talking right out of their ass, which most people do anyway.
This is a REVOLUTION for 2D gaming, and as for 3D it will be a VR device for me.
Well, I'm just stating what I would do if I had that monitor (I have the 120Hz monitor that can't be upgraded, since it doesn't have an LED backlight that can be controlled).
If it were me, I'd just wait a couple of extra months, eBay that Asus, and buy a brand new model for the same price (the resale money plus the 299 you otherwise would have spent).
[quote="Pirateguybrush"]For most people, it's probably cheaper to get a graphics card upgrade/SLI than it would be to get this monitor. Then you'd have enough power to maintain a constant 60fps anyway.[/quote]As bo3b said, that rarely happens with today's games. But the point is that a constant 60fps isn't enough anyway. That's the whole point of gsync. A constant 60fps will still get you stuttering (or tearing, if you don't use vsync). As I understand it, there are 4 possible options in a 60fps scenario:
1. By some miracle, your GPU spits out EXACTLY 60fps, which matches perfectly with your 60Hz screen and you get flawless animation. Would be nice, but would never happen.
2. Your GPU spits out more than 60fps. Some of those frames must get dropped, which creates irregular animation (stuttering).
3. Your GPU spits out less than 60fps. Some of those frames must get duplicated. Again, stuttering.
4. You turn off vsync. Your GPU's frames get displayed exactly when they appear, but this is often halfway through a refresh cycle. (Tearing.)
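Here is a minimal Python sketch of those scenarios, assuming a deliberately naive one-buffer model in which each 60Hz refresh simply shows whichever GPU frame finished most recently (no render-ahead queue or triple buffering is modelled; the function name and all numbers are illustrative only, not how any particular driver actually paces frames):
[code]
def shown_frames(gpu_fps, refresh_hz=60, refreshes=9):
    """Frame number on screen at each of the first `refreshes` refreshes.

    GPU frame i (1-based) completes at time i / gpu_fps; refresh k happens at
    time k / refresh_hz. Integer arithmetic keeps the timing exact. A value of
    0 means no new frame has finished yet, so the previous image persists.
    """
    return [(k * gpu_fps) // refresh_hz for k in range(1, refreshes + 1)]

for fps in (60, 90, 45):                        # scenarios 1, 2 and 3 above
    seq = shown_frames(fps)
    produced = seq[-1]                          # frames rendered by the last refresh
    dropped = produced - len(set(seq) - {0})    # rendered but never displayed
    repeated = len(seq) - len(set(seq))         # refreshes that re-show a frame
    print(f"{fps:>3} fps -> {seq}  dropped={dropped}  repeated={repeated}")

# 60 fps -> 1..9, every frame shown exactly once    (scenario 1)
# 90 fps -> frames 2, 5, 8, 11 never hit the screen (scenario 2)
# 45 fps -> frames 3 and 6 shown on two refreshes   (scenario 3)
[/code]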
People keep saying "just get a better GPU instead" like it's going to solve anything. It won't. It's a *brute force approach* that will help mask the problem, and that is all. It's been our 'solution' for years because it was all we had, so we keep thinking it's the sensible option. But it's a crude, brute force approach. Gsync on the other hand addresses the problem specifically.
I have two of the 2nd-best GPUs known to man, yet I've played my fair share of games that are annoyingly jittery even though fraps insists I'm getting a steady 60fps.
Most of us already have good GPUs. At this stage, buying an even better GPU while forsaking gsync strikes me as an odd way to spend your money.
@Paul: Totally agree that it's worth waiting for the price to come down and other models to pop up. I myself am going to let the dust settle before I decide what to do. From what I've read, the prices will be very reasonable soon.
I'm sure this will fire up the debate, but based on the two articles I've read, G-Sync's [i]primary[/i] purpose is to eliminate screen tearing, which is caused by having a GPU pushing out more fps than the monitor can display. So I fail to see how beefier GPUs are going to compensate for the existing issue. In fact, it makes it worse. I have experienced more screen tearing since upgrading to a 690 than ever before.
|CPU: i7-2700k @ 4.5Ghz
|Cooler: Zalman 9900 Max
|MB: MSI Military Class II Z68 GD-80
|RAM: Corsair Vengence 16GB DDR3
|SSDs: Seagate 600 240GB; Crucial M4 128GB
|HDDs: Seagate Barracuda 1TB; Seagate Barracuda 500GB
|PS: OCZ ZX Series 1250watt
|Case: Antec 1200 V3
|Monitors: Asus 3D VG278HE; Asus 3D VG236H; Samsung 3D 51" Plasma;
|GPU:MSI 1080GTX "Duke"
|OS: Windows 10 Pro X64
That sounds right, because screen tearing happens when your GPU produces more frames than your Hz. So the more powerful your GPU and/or the less demanding the game, the higher the likelihood of screen tearing.
That also explains why 120Hz monitors get less screen tearing. There are simply fewer opportunities for a GPU to pump out 121+ fps.
[quote="Volnaiskra"]That sounds right, because screen tearing happens when your GPU produces more frames than your Hz. So the more powerful your GPU and/or the less demanding the game, the higher the likelihood of screen tearing.
That also explains why 120Hz monitors get less screen tearing. There are simply fewer opportunities for a GPU to pump out 121+ fps.[/quote]
But even when it does exceed 120fps, I certainly don't notice it as much. It could possibly have to do with the refresh rate: 60Hz refreshes every 16ms, while 120Hz refreshes every 8ms. So if things are getting drawn twice as quickly, maybe that's what cuts down on some of the really ugly tearing you get vs monitors that draw at 60Hz.
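A quick back-of-the-envelope sketch of that arithmetic, assuming only that a tear artifact sits on screen until the next refresh overwrites it (illustrative, not a measurement):
[code]
# refresh interval = roughly how long a torn frame can persist before being overwritten
for refresh_hz in (60, 120, 144):
    refresh_ms = 1000.0 / refresh_hz
    print(f"{refresh_hz:>3}Hz: one refresh every {refresh_ms:.1f}ms, "
          f"so a tear is visible for at most ~{refresh_ms:.1f}ms")

# 60Hz -> ~16.7ms per tear; 120Hz -> ~8.3ms, i.e. each tear lasts half as long
[/code]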
As usual, misinformation on this board is rampant...
For starters, with VSync on at a certain refresh rate, the GPU is locked onto the monitor clock. It simply does not pump out and then drop extra frames. This is simple to see by monitoring GPU usage, which stays at a fraction of its full load compared to when VSync is off. You should never see stuttering or tearing with VSync as long as your GPU can produce frames at the refresh rate.
What does produce microstutter / stutter is when the GPU cannot produce frames in time for the VSync and the FPS drops below the refresh rate.
Aside from FPS dropping below the VSync'd refresh rate, other causes of stutter/microstutter are:
SLI - cards producing mismatched frames
Bad engine coding, such as the Fallout/Oblivion engines
[b]All G-Sync does is get the monitor to sync the frames the GPU has produced to the display on an equalized timeline. In effect, the monitor lags its vsync a little bit to ensure that the current frame being produced by the GPU is perfectly timed (lagged to be in sync) so it is displayed at the correct moment in time. This means that if the refresh rate is set at 60Hz, each scan may last a little more or a little less than 1/60th of a second with GSync, instead of being exactly 1/60th of a second on a standard monitor.[/b]
[img]http://www.guru3d.com/index.php?ct=articles&action=file&id=7094[/img]
Example using consecutive frame numbers:
a. VSync frame timeline if the GPU is powerful enough to keep up with refresh. (standard monitor)
b. VSync frame timeline if the GPU is not powerful enough to keep up with refresh. (standard monitor)
c. G-Sync
(repeat numbers indicate where frames have been skipped (doubled)).
a. 123456789 <-- no microstutter
b. 1124455789 <-- perceived microstutter
c. 1133557799 <-- No Microstutter
As can be seen, the frame rate in "a" is smooth. The frame rate in "b" is microstuttering / stuttering. The frame rate in "c", although it will be seen as choppy if it gets too low, shows the motion of frames as it is meant to be displayed, so it will look very smooth, almost as good as perfect VSync (a).
[i]Once again, if you have a powerful enough GPU that is able to attain FPS at the refresh rate set, you will see no difference with GSync.[/i]
Remember, nVidia is trying to hype this thing by providing limited information on purpose. Although the tech has good uses, please don't allow yourselves to be used to spread free marketing for them by propagating misinformation.
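A simplified sketch of the contrast the a/b/c example is drawing (it does not reproduce the exact digit sequences above; the 40 fps figure and the one-frame-deep vsync model are assumptions for illustration): a GPU rendering at 40 fps on a 60Hz panel has its frames held for alternating one and two refresh intervals under VSync, whereas on a variable-refresh display each frame simply stays up for as long as it took to render.
[code]
import math

REFRESH_MS = 1000.0 / 60.0      # 60Hz panel
FRAME_MS = 1000.0 / 40.0        # assumed GPU cost of 25ms per frame (40 fps)
N_FRAMES = 8

# wall-clock time at which each frame finishes rendering
done = [FRAME_MS * i for i in range(1, N_FRAMES + 1)]

# VSync: a finished frame appears at the first refresh tick after it is done
vsync_start = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in done]
vsync_on_screen = [b - a for a, b in zip(vsync_start, vsync_start[1:])]

# Variable refresh ("c"): a frame appears the moment it is done
vrr_on_screen = [b - a for a, b in zip(done, done[1:])]

print("VSync on-screen times (ms):", [round(d, 1) for d in vsync_on_screen])
print("VRR   on-screen times (ms):", [round(d, 1) for d in vrr_on_screen])
# VSync -> 16.7, 33.3, 16.7, 33.3, ...  uneven pacing, i.e. "b"
# VRR   -> 25.0, 25.0, 25.0, ...        even pacing, i.e. "c"
[/code]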
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
Sorry for the DP, but this forum just hangs if I try and edit this post into the above.
[quote="RAGEdemon"]As usual, misinformation on this board is rampant...
For starters, with VSync on at a certain refresh rate, the GPU is locked onto the monitor clock. It simply does not pump out and then drop extra frames. This is simple to see by monitoring the GPU usage being limited to a fraction of its full pump as when Vsync is off. You should never be seeing stuttering or tearing with VSync with frames being produced at the refresh level if your GPU can manage that.
What does produce microstutter / stutter is when the GPU cannot produce frames in time for the VSync and the FPS drops below.
Aside from lower FPS compared to VSync'd refresh rate, other reasons for stutter/microstutter:
SLI - cards producing mismatched frames
Bad engine coding such as Fallout/oblivion engines
[b]All G-Sync does is get the monitor to sync the frames the GPU has produced to the display in an equalized time line. In effect, the monitor lags its vsync a little bit to ensure that the current frame being produced by the GPU, is perfectly timed (lagged to be in sync) so it displays it at the correct moment in time. This means that if the refresh rate is set at 60Hz, each scan may last a little more or a little less than 1/60th of a second with GSync, instead of being exactly 1/60th of a second in a standard monitor.[/b]
[img]http://www.guru3d.com/index.php?ct=articles&action=file&id=7094[/img]
Example from Consecutive frame numbers:
a. VSync frame timeline if the GPU is powerful enough to keep up with refresh. (standard monitor)
b. VSync frame timeline if the GPU is not powerful enough to keep up with refresh. (standard monitor)
c. G-Sync
(repeat numbers indicate where frames have been skipped (doubled)).
a. 123456789 <-- no microstutter
b. 1124455789 <-- perceived microstutter
c. 1133557799 <-- No Microstutter
As can be seen, the frame rate in "a" is smooth. Frame rate in "b" is microstuttering / stuttering. Frame rate in "c", although will be seen as choppy if it gets too low, the motion of frames will be what is meant to be displayed, so will look very smooth, almost as good as perfect VSync (a).
[i]Once again, if you have a powerful enough GPU that is able to attain FPS at the refresh rate set, you will see no difference with GSync.[/i]
Remember, nVidia is trying to hype this thing by providing limited information on purpose. Although the tech has good uses, please don't allow yourselves to be used to spread free marketing for them by propagating misinformation.
[/quote]
Okay. I've thought about this a bit, and it makes perfect sense to me now.
I've always been bothered by that "swinging pendulum with text written on it" video that everyone swoons over.
In case people don't know, 60Hz LCDs basically have a motion resolution of about 300 lines. It's why the 120/144Hz monitors were such a revelation for us CRT lovers. We were finally getting LCDs with full motion resolution.
It's also why HDTVs use so much processing and 120/240/480Hz interpolation. Without interpolation, their motion resolution is a pathetic 300 lines. It's a blurry mess. Interpolation solves this (by holding frame A and B and then having an algorithm create frames in between them). Interpolation also adds a ton of input lag. Sony has solved this quandary with some of their 2013 HDTVs by letting you get full 1080 motion resolution by strobing the LED backlight. The downside: people bothered by CRT flicker will be bothered by this 60Hz strobing, and the picture is also incredibly dim from the strobing.
So what's my point?
I've always been bothered by that pendulum video. It's really just choosing a scenario that highlights how awful LCD motion is at 300 lines of motion resolution vs full 1080 motion resolution. And it does it in a way that even internet snobs who always discounted 120/144Hz monitors (looking at some GAF posters here) are now swooning over this miracle.
But how do you get full 1080 motion resolution if the monitor is truly variable refresh rate, and it's only updating at, say, 37Hz when that's all the GPU can feed it? You can't. There's no way a 37Hz update would allow you full motion resolution. It'd be even worse than 60Hz motion.
But your explanation solves it perfectly. What those people are really swooning over is the beautiful motion you get on a 120/144Hz monitor. G-Sync is just allowing you to have that beautiful motion resolution, but with a smart v-sync that'll hold things so you don't get the stutter associated with v-sync and its need to skip frames when the frame rate drops.
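A back-of-the-envelope sketch of the sample-and-hold argument behind this (the tracking speed and the ~2ms strobe width are invented numbers, purely for illustration): when your eye tracks an object moving at v pixels per second while each frame is held static for t seconds, the image smears across roughly v * t pixels on your retina, so the longer a frame is held, the lower the effective motion resolution.
[code]
SPEED_PX_S = 960        # hypothetical object crossing a 1920px screen in 2 seconds

def blur_px(hold_ms):
    # perceived smear width for an eye-tracked object on a sample-and-hold display
    return SPEED_PX_S * hold_ms / 1000.0

for label, hold_ms in [("37Hz sample-and-hold", 1000 / 37),
                       ("60Hz sample-and-hold", 1000 / 60),
                       ("120Hz sample-and-hold", 1000 / 120),
                       ("60Hz + ~2ms backlight strobe", 2.0)]:
    print(f"{label:<30} ~{blur_px(hold_ms):4.0f}px of smear")

# 37Hz  -> ~26px of smear (worse than 60Hz, as argued above)
# 60Hz  -> ~16px
# 120Hz -> ~8px
# ~2ms strobe -> ~2px (why the strobed sets look sharp but dim)
[/code]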
Ragedemon, you are partially right, but partially wrong.
You're right to point out that with vsync frames don't get dropped. But this amounts to little more than semantics.
What happens with vsync is that a frame gets produced, put in the buffer, and waits for the next refresh cycle to be displayed. How long it sits there waiting depends on how quickly it was rendered.
The end result is irregular animation. To use your example, it looks like this:
12 345 67 89
Yes, frames aren't being dropped per se, though that's beside the point. They are being held up while the GPU twiddles its thumbs waiting for the next refresh. The end result is much the same: irregularly timed animation.
The purpose of g-sync is to get rid of those gaps: When a frame is ready, display it asap.
This will result in smoother animation even in 60+ fps scenarios, because the timings of the frames will be closer to their intended timings (quick frames get shown quickly, slow frames get shown slowly, instead of all frames being displayed at an arbitrary 16ms).
http://pbrd.co/1bdvW68
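A sketch of the behaviour being described, using invented, jittery render times around ~64 fps and the same no-render-ahead-queue assumption (whether real drivers actually behave this way is exactly what gets disputed below): each frame's wait in the buffer varies, so the on-screen spacing stops matching the spacing the frames were rendered at, while a display-when-ready scheme keeps them aligned.
[code]
import itertools, math

REFRESH_MS = 1000.0 / 60.0
render_ms = [12, 15, 18, 13, 20, 14, 16, 17]     # hypothetical per-frame costs

finished = list(itertools.accumulate(render_ms)) # when each frame is ready
shown_vsync = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in finished]

for i, (ready, shown) in enumerate(zip(finished, shown_vsync), 1):
    print(f"frame {i}: ready at {ready:5.1f}ms, vsync shows it at {shown:6.1f}ms "
          f"(waited {shown - ready:3.1f}ms); g-sync would show it at {ready:5.1f}ms")

# The vsync wait varies from frame to frame, so frames rendered 12-20ms apart
# all end up displayed exactly 16.7ms apart; display-when-ready keeps the
# original spacing.
[/code]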
The thing I don't get is how you can have perfect motion resolution if you're simply drawing when the GPU is ready. LCD is an inherently flawed technology. That's why it took so long for the motion issue to finally be addressed. I don't see how they can possibly have full motion resolution at low update rates. It just defies what we've observed over the last 25 years.
EDIT: And I'm not talking smoothness here. Because that's a separate issue. I'm talking about being able to read text on a swinging pendulum. That requires resolving full motion resolution. It just doesn't make sense how this could be done.
Volnaiskra, your diagram is grossly misleading.
As long as the FPS the GPU can generate is above the VSync rate, those red balls in picture 3 will be almost exactly equally spaced. This is due to double (or triple) buffering combined with frame pre-rendering, which is 3 frames by default in most games.
An example of frame times with a 7970 in Crysis 2:
[img]http://techreport.com/r.x/geforce-gtx-690/frames-c2-7970.gif[/img]
As you can see, it is perfect.
Compare it with microstutter with SLI/CF, which is more like what your picture 3 looks like:
[img]http://techreport.com/r.x/geforce-gtx-690/frames-c2-7970-cf.gif[/img]
But even in this scenario, the frame is still displayed in its Hz scan time column (shown by vertical lines in your graphs). Granted, every ball has shifted a little to one side so it isn't perfect, but they never miss a Hz scan, and never ever will two balls land in the same column, leaving a scan completely blank.
Again, to reiterate:
If you have a powerful enough GPU that is able to attain FPS at the refresh rate set, you will see no difference with GSync.
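A simplified sketch of that buffering argument, using invented frame times that jitter above and below 16.7ms but average under it (the 3-deep queue is modelled simply as three refresh intervals of extra pipeline latency, which is the trade-off being paid for the smoothness):
[code]
from itertools import accumulate

REFRESH_MS = 1000.0 / 60.0
QUEUE = 3                                        # pre-rendered frames / buffering depth
render_ms = [14, 22, 12, 19, 13, 21, 15, 14]     # jittery, but averages ~16.3ms

finish = list(accumulate(render_ms))             # when the GPU completes each frame

def repeated_frames(headstart):
    """Refreshes where no new frame was ready, given `headstart` refresh
    intervals of render-ahead (0 = no queue at all)."""
    return sum(1 for k, done in enumerate(finish, start=1)
               if done > (k + headstart) * REFRESH_MS)

print("no render-ahead :", repeated_frames(0), "repeated frames")
print("3-deep pipeline :", repeated_frames(QUEUE), "repeated frames")
# Individual 19-22ms frames blow the 16.7ms deadline with no queue, but the
# render-ahead pipeline absorbs them as long as the *average* stays at or
# above the refresh rate -- at the cost of a few frames of extra input lag.
[/code]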
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.