Vsync, how many people use it and questions about it?
I'm curious how many people here use vsync. I ask because I found my way into 3D land because I was tired of the compromises you had to make: either suffer massive input lag, stutter when the framerate drops below your monitor's refresh rate, or screen tearing if you don't use vsync at all.

So I ended up with a G-Sync monitor that supported 3D Vision. I thought, what the hell, and ordered a kit to take advantage of all my monitor's capabilities.

So here I am, stuck with compromises again, because obviously G-Sync and 3D Vision don't work together. Something I discovered was that capping the framerate at your monitor's refresh rate with RivaTuner Statistics Server (RTSS) greatly lowers input lag with vsync. People said you needed to cap it 1 or 2 frames per second below your refresh rate to get this effect. I think the reason is that Nvidia Inspector isn't as good a framerate capper and often varies by ±1-2 fps from where you cap it, and as soon as your framerate goes over the refresh rate it's like adding 20-30 ms of input lag. You don't have that problem with RTSS, and it also adds less input lag than Inspector does (Battle(non)sense on YouTube tested this out).
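(The cap-below-refresh idea can be sketched in code. This is only a toy sleep-based limiter to show the principle of holding every frame to a fixed budget; it is not how RTSS actually hooks the game's Present call, and the numbers are illustrative.)

```python
import time

def run_frame_capped(render_frame, fps_cap=59.0, num_frames=120):
    """Toy frame limiter: make each frame take at least 1/fps_cap seconds,
    the way a cap just under the refresh rate keeps vsync from queuing
    extra pre-rendered frames (the source of the 20-30 ms lag spike)."""
    target = 1.0 / fps_cap
    frametimes = []
    prev = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        # sleep off whatever is left of this frame's time budget
        leftover = target - (time.perf_counter() - prev)
        if leftover > 0:
            time.sleep(leftover)
        now = time.perf_counter()
        frametimes.append(now - prev)
        prev = now
    return frametimes
```

A real limiter busy-waits the last fraction of a millisecond instead of relying on `time.sleep`, which is too coarse for frame-accurate pacing; this sketch just shows why the cap keeps frametimes at or above the target.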

I've seen some Digital Foundry videos that suggest using half-refresh adaptive vsync along with RTSS if you want to play at a locked 30 fps with good frame pacing. The issue is that adaptive vsync and frame capping don't work together: you get some variance around the 33.333 ms frametime, and when the framerate falls below that, vsync disengages and you get screen tearing.

Also, does anybody know if half-refresh vsync functions differently than regular vsync? Or is it just regular vsync with the drivers capping your framerate around half your refresh rate? The reason I ask is that I get bad input lag if I don't apply a frame cap at the half-refresh point, so it almost seems to operate differently than vsync.

There is a lot of horrible and just plain wrong information out there regarding vsync. We have a few people here who seem really knowledgeable, so I was hoping someone could provide some answers.

Specifically @RAGEdemon and @masterotaku



#1
Posted 02/22/2017 11:55 PM   
Just use "Vertical Sync = Force On" and disable any vsync in-game. It will put it perfectly in sync with the 3D Vision driver. No need for other mumbo-jumbo like RTSS and the like.

[Screenshot of the Nvidia Inspector settings: http://iforce.co.nz/i/qurmstjc.w3y.png]

Vsync works differently in 3D Vision mode than in regular 2D. (Not going into detail here.)
If you are knowledgeable enough, you can measure the time between frames (MSI Afterburner lets you do this).
You will see a rock-solid 16 ms between frames if you are running at a constant 60 FPS.

The problem with lag comes when you dip below 60 FPS. At that point you can use the "Frame Rate Limiter" (in NvInspector) to specify an FPS THAT YOU ALWAYS MAINTAIN - AND NEVER DIP BELOW !!! <- Very important!
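(Why dipping below 60 hurts so much can be made concrete: with vsync on, a frame that misses a refresh deadline is held until the next one, so displayed frametimes snap upward to multiples of the refresh interval. A minimal model, with made-up function names:)

```python
import math

def vsync_frametime_ms(render_ms, refresh_hz=60):
    """With vsync on, a finished frame is held until the next refresh,
    so the effective frametime rounds UP to a whole number of intervals."""
    interval = 1000.0 / refresh_hz              # ~16.67 ms at 60 Hz
    return max(1, math.ceil(render_ms / interval)) * interval

# A 15 ms frame still displays every ~16.67 ms (a solid 60 FPS),
# but an 18 ms frame waits for the NEXT refresh: ~33.33 ms, i.e. 30 FPS.
```

That cliff from 16.7 ms straight to 33.3 ms is why a limiter set to an FPS you can always hold gives much smoother pacing than letting the framerate bounce around just under the refresh rate.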

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

#2
Posted 02/23/2017 12:53 AM   
I'm afraid I am not well versed in VSync issues related to input lag, as I don't play multiplayer games competitively. I do hate micro-stutter however, and that's where my limited VSync experience comes from.

From my experience, VSync + 3D Vision is a funny thing because there seem to be 2 different VSyncs happening when 3D Vision is enabled.

1. There are two buffers on the card which alternately sync to the display refresh rate to maintain the 3D effect for each eye. This can never be disabled, because if it were, there would be no 3D effect.

2. However, the vsync from the game to the buffer CAN be disabled. I achieved this by using D3DOverrider and setting vsync to OFF with NvInspector for the game. This allowed uncapped FPS in 3D Vision while maintaining the 3D effect. I haven't seen this mentioned by anyone in the past, and it is certainly not documented.

It will, however, produce tearing in both eyes, so the 3D effect is impacted to an extent when moving around quickly. The advantage, of course, is that input lag is significantly reduced. A bonus advantage is also that you will be perceiving up to 120FPS (depending how many your system can produce) instead of being capped to 60FPS by your eyes, albeit in torn frames.

Try it out. Maybe you might decide that the improved input lag is worth some perceived tearing in both eyes in 3D Vision.

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#3
Posted 02/23/2017 01:46 AM   
helifax said:Just use "Vertical Sync = Force On" and disable any vsync in-game. It will put it perfectly in sync with the 3D Vision driver. No need for other mumbo-jumbo like RTSS and the like.

Image

Vsync works differently when in 3D Vision Mode than regular 2D. (Not going in-detail here.)
If you are knowledgeable enough you can measure the time between frames (MSI Afterburner allows you to do this).
You will see a rock solid 16ms time between frames if you are running at constant 60 FPS.

The problem with lag comes when you dip below 60 FPS. At that point you can use the "Frame Rate Limiter" (in NvInspector) to specify an FPS THAT YOU ALWAYS MAINTAIN - AND NEVER DIP BELOW !!! <- Very important!
The problem is, if I do it just as you say, I get massive input lag. I just tested it. Capping the framerate at the monitor's refresh rate with vsync eliminates a lot of that input lag. It's really noticeable if you turn off the hardware cursor in The Witcher 3.

RTSS is just used for the framerate capping because it does a better job than Inspector. Also, RTSS adds less input lag (https://youtu.be/rs0PYCpBJjc?t=4m9s), a 12 ms input lag difference in fact.

It would be awesome if the framerate capper in Inspector allowed you to always maintain 60 fps, but that isn't what it does. It isn't going to magically let me play at a locked 60 fps in The Witcher 3 with 3D enabled.

#4
Posted 02/23/2017 04:27 AM   
RAGEdemon said:I'm afraid I am not well versed in VSync issues related to input lag, as I don't play multiplayer games competitively. I do hate micro-stutter however, and that's where my limited VSync experience comes from.

From my experience, VSync + 3D Vision is a funny thing because there seem to be 2 different VSyncs happening when 3D Vision is enabled.

1. There are 2 buffers in the card which alternatingly sync to the display refresh rate to maintain the 3D affect for each eye. This can never be disabled because if it was, there would be no 3D Effect.

2. However, the VSync from the game to the buffer CAN be disabled. I have achieved this by using D3D overrider and setting VSync to OFF with NVinspector for the game. This allowed uncapped FPS in 3D vision while maintaining the 3D effect. I haven't seen this being mentioned by anyone in the past, and it is certainly not documented.

It will, however, produce tearing in both eyes, so the 3D effect is impacted to an extent when moving around quickly. The advantage, of course, is that input lag is significantly reduced. A bonus advantage is also that you will be perceiving up to 120FPS (depending how many your system can produce) instead of being capped to 60FPS by your eyes, albeit in torn frames.

Try it out. Maybe you might decide that the improved input lag is worth some perceived tearing in both eyes in 3D Vision.
Unfortunately I don't have any games where I could get 120 fps in 3D. Perhaps Trine 2? Certainly not The Witcher 3 or Mad Max. Yeah, I really hate input lag and screen tearing. I have to find a place where I can get answers on how half-refresh vsync works and other nuances.

#5
Posted 02/23/2017 04:33 AM   
You could try the D3DOverrider method to force-disable the 3D Vision 60 FPS cap and vsync. Then your FPS would be uncapped from 60 and your input lag /should/ be greatly reduced (I have never tested for input lag). The downside is, of course, tearing.

Reduced input lag and no tearing is the holy grail of course. Please post a solution if you ever find one.


#6
Posted 02/23/2017 04:43 AM   
tygeezy said:
helifax said:Just use "Vertical Sync = Force On" and disable any vsync in-game. It will put it perfectly in sync with the 3D Vision driver. No need for other mumbo-jumbo like RTSS and the like.

Image

Vsync works differently when in 3D Vision Mode than regular 2D. (Not going in-detail here.)
If you are knowledgeable enough you can measure the time between frames (MSI Afterburner allows you to do this).
You will see a rock solid 16ms time between frames if you are running at constant 60 FPS.

The problem with lag comes when you dip below 60 FPS. At that point you can use the "Frame Rate Limiter" (in NvInspector) to specify an FPS THAT YOU ALWAYS MAINTAIN - AND NEVER DIP BELOW !!! <- Very important!
The problem is, if I do it just as you say, I get massive input lag. I just tested it. Capping the framerate at the monitor's refresh rate with vsync eliminates a lot of that input lag. It's really noticeable if you turn off the hardware cursor in The Witcher 3.

RTSS is just used for the framerate capping because it does a better job than inspector. Also, RTSS adds less input lag. https://youtu.be/rs0PYCpBJjc?t=4m9s a 12 ms input lag difference in fact.

It would be awesome if the framerate capper in inspector allowed you to always maintain 60 fps, but that isn't what it does. It isn't going to magically allow me to play at a locked 60 fps in witcher 3 with 3d enabled.


The Witcher 3 is a different beast regarding input lag. I also tested this on my system using just one monitor, and I can maintain 60 FPS easily without any noticeable input lag (but then I run SLI).
Perhaps you are more susceptible to input lag than I am and can clearly see it, while I can't, and it hasn't really bothered me much.


#7
Posted 02/23/2017 12:41 PM   
I always force vsync in the drivers and turn it off in-game. Some games need an RTSS cap to have perfect frametimes (Alien Isolation, for example).

I would never use 3D Vision without vsync. Tearing is horrible in 3D. I also never cap fps to 30 if I can't reach 60. Well, I would do it if I were consistently under 40 fps.

And when I play in 2D, I use G-Sync + ULMB.

CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: Gainward Phoenix 1080 GLH
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com

#8
Posted 02/23/2017 02:20 PM   
masterotaku said:I always force vsync in the drivers and turn it off ingame. Some games need a RTSS cap to have perfect frametimes (Alien Isolation for example).

I would never use 3D Vision without vsync. Tearing is horrible in 3D. I also never cap fps to 30 if I can't reach 60. Well, I would do it if I got consistently under 40fps.

And when I play in 2D, I use G-Sync + ULMB.
How do you use gsync plus ULMB? I thought they were mutually exclusive?

Also, don't you get bad input lag with vsync and no cap? What's interesting is that the tearing isn't as bad in 3D... Maybe that's what RAGEdemon is talking about with D3DOverrider and disabling a different buffer for more speed at the expense of a lot of tearing?

I would really love to use G-Sync plus ULMB in Overwatch. I get 140 fps about 90% of the time, with dips here and there to around 120. It's the fastest-running game I own, and with the reduced-buffering option in that game, input lag is practically nil. Such a smooth-running game; I love Overwatch.

Edit: Is this it?

"Steps (I have finally confirmed them with another custom resolution):

How to do this (having a 3D Vision kit is necessary):

1- Enable 3D Vision and make sure Lightboost (not ULMB) is enabled in the desktop, using the "always" option.
2- Create a 120Hz custom resolution. I have tried 2389x1344 and 2528x1422.
3- Switch to G-Sync in the Nvidia CP without disabling 3D Vision first.
4- Now that custom resolution is locked into this G-Sync + ULMB mode (the monitor OSD will say "ULMB 120Hz")."

#9
Posted 02/23/2017 05:20 PM   
tygeezy said:How do you use gsync plus ULMB? I thought they were mutually exclusive?


Check the last two-three pages of this thread: http://forums.blurbusters.com/viewtopic.php?f=4&t=2883&start=140

Basically, you can create a custom resolution with +5 Vertical Total and bam, it works with ULMB and G-Sync at the same time (when G-Sync is enabled in the Nvidia Control Panel; you can disable ULMB at any time in the monitor OSD). 120Hz tops, and the minimum fps for single strobing is 40 fps.
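(For anyone curious what the +5 Vertical Total actually changes: a display mode's pixel clock is horizontal total x vertical total x refresh rate, so padding the vertical blanking slightly raises the required pixel clock at the same refresh. The totals below are the standard CEA 1080p timings, used purely as an example; your monitor's actual totals will differ.)

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock a video mode needs: every pixel in the full timing,
    blanking included, is scanned refresh_hz times per second."""
    return h_total * v_total * refresh_hz / 1e6

# Standard 1080p totals are 2200 x 1125; at 120 Hz that is 297 MHz.
# Adding +5 to the vertical total nudges it to 298.32 MHz.
```

The extra five lines are invisible blanking; they just tweak the timing enough that the monitor firmware accepts the mode with both features active.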

tygeezy said:Also, don't you get bad input lag with vsync and no cap? What's interesting is the tearing isn't as bad in 3d... Maybe that's what ragedemon is talking about with d3doveride and disabling a different buffer for more speed at the expense of a lot of tearing?


I'm not that sensitive to input lag, but yes, if the game is using vsync and the PC has a lot of unused resources, it generates the frame too soon and it waits there. So I guess a game at 90% GPU usage has less input lag than a game at 20% GPU usage. I'm not an expert on input lag.
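(That intuition can be put into rough numbers: under vsync, a frame that finishes early sits in the buffer until the next scanout, and the input it sampled ages the whole time. A toy model, assuming a fixed 60 Hz scanout and ignoring the driver's render-ahead queue:)

```python
def buffer_wait_ms(render_ms, refresh_hz=60):
    """How long a finished frame idles before scanout under vsync:
    whatever remains of the refresh interval after rendering ends."""
    interval = 1000.0 / refresh_hz
    return interval - (render_ms % interval)

# Light GPU load (3 ms frames) idles ~13.7 ms per frame; near-full
# load (15 ms frames) idles only ~1.7 ms, so heavier GPU usage can
# actually mean LESS vsync-added lag.
```

This is exactly the wait a frame cap shrinks: capping just under the refresh rate makes frames finish just in time for scanout instead of long before it.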

tygeezy said:
Edit: Is this it?

"Steps (I have finally confirmed them with another custom resolution):

How to do this (having a 3D Vision kit is necessary):

1- Enable 3D Vision and make sure Lightboost (not ULMB) is enabled in the desktop, using the "always" option.
2- Create a 120Hz custom resolution. I have tried 2389x1344 and 2528x1422.
3- Switch to G-Sync in the Nvidia CP without disabling 3D Vision first.
4- Now that custom resolution is locked into this G-Sync + ULMB mode (the monitor OSD will say "ULMB 120Hz")."


Haha, that's my old method, which by coincidence does what I said first. So you don't need 3D Vision or Windows 7 at all. I should update the description of my youtube videos.


#10
Posted 02/23/2017 06:02 PM   
masterotaku said:
tygeezy said:How do you use gsync plus ULMB? I thought they were mutually exclusive?


Check the last two-three pages of this thread: http://forums.blurbusters.com/viewtopic.php?f=4&t=2883&start=140

Basically, you can create a custom resolution with +5 Vertical Total and bam!, it works with ULMB and G-Sync at the same time (when G-Sync is enabled in the Nvidia Control Panel. And you can disable ULMB at any time in the monitor OSD). 120Hz tops and the minimum fps for single strobing is 40fps.

tygeezy said:Also, don't you get bad input lag with vsync and no cap? What's interesting is the tearing isn't as bad in 3d... Maybe that's what ragedemon is talking about with d3doveride and disabling a different buffer for more speed at the expense of a lot of tearing?


I'm not that sensitive to input lag, but yes, if the game is using vsync and the PC has a lot of unused resources, it generates the frame too soon and it waits there. So I guess that a game with 90% of GPU usage has less input lag than a game with 20% of GPU usage. I'm not an expert about input lag.

tygeezy said:
Edit: Is this it?

"Steps (I have finally confirmed them with another custom resolution):

How to do this (having a 3D Vision kit is necessary):

1- Enable 3D Vision and make sure Lightboost (not ULMB) is enabled in the desktop, using the "always" option.
2- Create a 120Hz custom resolution. I have tried 2389x1344 and 2528x1422.
3- Switch to G-Sync in the Nvidia CP without disabling 3D Vision first.
4- Now that custom resolution is locked into this G-Sync + ULMB mode (the monitor OSD will say "ULMB 120Hz")."


Haha, that's my old method, which by coincidence does what I said first. So you don't need 3D Vision or Windows 7 at all. I should update the description of my youtube videos.
This is amazing. I can't wait to try out Overwatch with this.

I'm going to try running with a +1 frame cap to see if that drops the input lag. I am extremely sensitive to input lag if I'm playing with a mouse. It isn't as noticeable with a gamepad.
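The frame-cap idea discussed above (hold frames to a fixed schedule so the driver's render queue under vsync never fills up) can be sketched as a simple limiter loop. This is an illustrative sketch of how such a limiter works in principle, not RTSS's actual implementation; the pure busy-wait is an assumption (a real limiter sleeps for most of the interval and only spins at the end).

```python
import time

def run_capped(target_fps, n_frames, render=lambda: None):
    """Release one frame per 1/target_fps seconds. Keeping frames on a
    fixed schedule is why a cap at or just below the refresh rate cuts
    vsync input lag: the GPU never queues finished frames ahead of the
    display."""
    frame_time = 1.0 / target_fps
    deadline = time.perf_counter()
    stamps = []
    for _ in range(n_frames):
        render()                      # the game's work for this frame
        deadline += frame_time
        while time.perf_counter() < deadline:
            pass                      # spin until release time
        stamps.append(time.perf_counter())
    return stamps
```

A busy-wait like this is more precise than sleeping, which may be part of why different cappers (RTSS vs. Inspector) show different frametime variance.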

#11
Posted 02/23/2017 06:17 PM   
@masterotaku I was reading that thread and it appears only a couple of people have it working? A guy on the Acer Predator had input lag and the screen going black? I'll try it on my BenQ.

Perhaps later this year I'll look at getting your Dell, since it's 1440p with G-Sync and 3D Vision at 144Hz.

I was planning on waiting for 4K 144Hz G-Sync, but I don't think I'll have the hardware to run that for a while (monitors are supposedly dropping this year). Plus, 4K and 3D Vision seems impossible.

Although 1080p 3D Vision might not look so bad at 27 inches on a 4K panel, since 4K is exactly 4x the pixels of 1080p, so it scales 1:1.

I'll see how my monitor handles G-Sync plus ULMB and report back.

So the custom resolution I should try is 1920 by 1085? Does it break if you go in game and it reverts the resolution to 1920 by 1080? Although I should be able to add that custom res in every game I play in 2D.

Edit: Reading further, it looks like you need to go 100Hz for G-Sync plus ULMB to work well with lower input lag.
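The rule of thumb in play here (cap a few fps under the refresh rate so the framerate never leaves the variable-refresh window and hits the vsync ceiling) is simple arithmetic. A quick sketch, where the 3fps margin is an assumption based on common community advice, not a measured value:

```python
def gsync_cap(refresh_hz, margin_fps=3):
    """Cap a few fps below the refresh rate so frametimes stay inside
    the G-Sync range instead of hitting the vsync ceiling (which is
    where the 20-30ms of extra input lag appears)."""
    cap = refresh_hz - margin_fps
    return cap, 1000.0 / cap  # (fps cap, per-frame budget in ms)
```

For example, a 100Hz ULMB + G-Sync mode would be capped at 97fps, giving a per-frame budget of roughly 10.3ms.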

#12
Posted 02/23/2017 08:47 PM   
tygeezy said:
So the custom resolution I should try is 1920 by 1085?


You should use something lower than your native resolution. And by "+5 Vertical Total" I mean in the timings window. Like this one that I found on the internet:

[Image: http://i119.photobucket.com/albums/o139/callsign_vega/120LB.jpg]

Create, for example, 1920x1078 under "Display mode", and then in the "Total Pixels" "Vertical" input box (1149 in that screenshot), use a value 5 higher than what it automatically suggests. In my case it was, for example, 1530 instead of 1525. Try creating the custom resolution while G-Sync is disabled, just in case; I don't think I tried doing it with G-Sync enabled.
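For reference, the pixel clock a custom resolution requires follows directly from those timing numbers: total horizontal pixels times total vertical pixels (both including blanking) times the refresh rate. A small sketch; the 2080 horizontal total below is a made-up example value (only the 1149 vertical total comes from the screenshot):

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels scanned per frame (active area plus
    blanking) times frames per second, in MHz."""
    return h_total * v_total * refresh_hz / 1e6
```

Raising the Vertical Total by 5 lines only raises the required pixel clock slightly, which is why the trick generally fits within the monitor's existing limits: compare `pixel_clock_mhz(2080, 1149, 120)` with `pixel_clock_mhz(2080, 1154, 120)`.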

CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: Gainward Phoenix 1080 GLH
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com

#13
Posted 02/23/2017 09:28 PM   
masterotaku said:
tygeezy said:
So the custom resolution I should try is 1920 by 1085?


You should use something lower than your native resolution. And by "+5 Vertical Total" I mean in the timings window. Like this one that I found on the internet:

[Image: http://i119.photobucket.com/albums/o139/callsign_vega/120LB.jpg]

Create for example 1920x1078 under "Display mode", and then in the "Total Pixels" "Vertical" input box (1149 in that screenshot) use something +5 higher than what it automatically suggests you. In my case it was for example 1530 instead of 1525. Try creating the custom resolution while G-Sync is disabled, just in case. I didn't try doing it with it enabled, I think (or maybe I did).
I'll give that a shot. Are you capping your framerate below the monitor's refresh rate to keep it within G-Sync range? That's kind of been the standard.

That guy mentioned he was getting input lag at 120Hz, though. Was that because he couldn't maintain that framerate? So would capping below the refresh cause input lag with both G-Sync and ULMB?

#14
Posted 02/23/2017 09:42 PM   
The first thing I do is turn off vsync and FXAA when I load up a new game. Strange that tearing never bothers me and I don't even notice it outside of a couple of games. When I do notice it, it's sometimes bad and the tear line is right in my face for long periods of time, but that's very rare. Strange.

46" Samsung ES7500 3DTV (checkerboard, high FOV as desktop monitor, highly recommend!) - Metro 2033 3D PNG screens - Metro LL filter realism mod - Flugan's Deus Ex:HR Depth changers - Nvidia tech support online form - Nvidia support: 1-800-797-6530

#15
Posted 02/23/2017 10:00 PM   