Hi!
I am having a little discussion over at a game forum with another member about whether or not it is possible to run a game in Nvidia 3D Vision without the graphics card having to run with VSync.
From my understanding, VSync has to be enabled because the graphics card obviously needs to send fully rendered frames to the monitor for each eye. If it sent incomplete frames to the monitor (without VSync), the bottom of the image being sent to the monitor would still be from the previous frame, and the previous frame obviously was for the other eye.
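To make the reasoning concrete, here is a little toy model of a buffer flip landing mid-refresh (purely illustrative Python; the line count and flip point are made up, not real driver behavior):
[code]
# Toy model: which eye each scanline comes from when the buffer flip
# lands mid-refresh (no VSync) in page-flipped stereo 3D.
# Purely illustrative; the line count and flip point are made up.

SCANLINES = 10  # pretend the screen is 10 scanlines tall

def scanout(current_eye, flip_line=None, next_eye=None):
    """Return which eye's frame each displayed scanline came from."""
    shown = []
    for line in range(SCANLINES):
        # Without VSync, the flip can land partway through the refresh
        if flip_line is not None and line >= flip_line:
            shown.append(next_eye)
        else:
            shown.append(current_eye)
    return shown

# VSync on: the flip waits for the refresh to finish -> one eye per refresh
print(scanout("left"))
# VSync off: the flip lands at line 6 -> one refresh mixes both eyes
print(scanout("left", flip_line=6, next_eye="right"))
[/code]
Whichever part of the screen ends up showing the "old" frame, the result is the same: a single refresh containing parts of both eyes' images.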
He claims that the newer Nvidia GPUs (600 series and above) are able to run 3D Vision without having to use VSync, but can't point me to information about this.
Now my question is: How would that be possible? Are the GPUs from the 600 series and above really able to run 3D Vision without VSync? I doubt it.
Greetings,
Rob
I don't think it's enabled. "Something" might be happening there, but it doesn't seem like VSync. Input lag goes away when I disable it, and I have certainly seen tearing in 3D.
You don't need to play with VSync on. 3D Vision doesn't require it. VSync is for something else entirely. It helps, though; that's why we've got this whole hype about G-Sync monitors.
What about this then?
[quote="andrewf@nvidia"]Our driver forces overrides Vsync in-game settings and always forced Vsync on. It does not matter what the game's menu says.[/quote]
[url=https://forums.geforce.com/default/topic/489962/3d-and-vsync/][b][u]SOURCE[/u][/b][/url] [i](Read posts #2, #6 and so on, the ones from Andrew)[/i]
I have to override VSync to Smooth on every game I play with SLI and 3D Vision, else I get extremely bad tearing.
i7-4790K CPU, 4.8 GHz stable overclock
16 GB Corsair RAM
EVGA 1080 Ti SLI
Samsung SSD 840 Pro
ASUS Z97-WS
3D Surround: ASUS ROG Swift PG278Q(R), 2x PG278Q (yes, it works)
Obutto R3volution
Windows 10 Pro x64 (Windows 7 dual boot)
At some point in the last 6 months, turning on 3D Vision alone stopped actually enabling V-Sync in games. 3D will enable, and you won't have any input lag, but you WILL have tearing in 3D, and it ruins the experience. I've been turning on V-Sync in conjunction with 3D Vision since that time, whereas before I never used to.
Maybe framecapping at 60 with a third-party app while leaving V-Sync off would force the tear line to the very top or bottom of the screen, while avoiding the typical input lag V-Sync causes?
Someone should try it out (I'm a lazy dick, so it can't be me)
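For whoever does try it, roughly what I mean by the cap (an untested sketch; render() is just a stand-in for the game's frame, and the numbers are illustrative):
[code]
# Rough, untested sketch of what an external 60 FPS cap does.
import time

TARGET = 1.0 / 60.0  # ~16.7 ms per frame

def capped_loop(render, frames=600):
    deadline = time.perf_counter()
    for _ in range(frames):
        render()
        deadline += TARGET
        # Unlike VSync, we sleep instead of blocking on the next VBlank,
        # so the tear line should sit in roughly the same spot each frame.
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        else:
            deadline = time.perf_counter()  # running behind; resync

capped_loop(lambda: None, frames=60)  # dummy render() for a quick test
[/code]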
[quote="Robert Godicke"]What about this then?
[quote="andrewf@nvidia"]Our driver forces overrides Vsync in-game settings and always forced Vsync on. It does not matter what the game's menu says.[/quote]
[url=https://forums.geforce.com/default/topic/489962/3d-and-vsync/][b][u]SOURCE[/u][/b][/url] [i](Read posts #2, #6 and so on, the ones from Andrew)[/i][/quote]
In my experience, forcing VSync in the control panel doesn't always work for every game (it does most of the time, but not always). I know I've had situations where I needed to turn it on in-game to get rid of tearing. So it seems Nvidia isn't always able to force VSync on their end.
Maybe that's the reason for the confusion: perhaps the drivers *try* to force VSync, but don't always succeed.
I would really like to see an official answer to the initial post.
If it's possible to run without VSync, how is this achieved? A doubled framebuffer (one for each eye)?
You may or may not use V-Sync while on 3D, as you choose. If you don't use it, you will get tearing.
One reason I am waiting for Maxwell now is to do SLI and get around 70-80 FPS in 3D Vision, so I am a bit further up from the 60 FPS boundary for V-Sync. This way I will not have screen stutter/lag, which is awesome.
I wonder if GTX 870s in SLI will be enough, but I will see; hopefully they will give me the needed performance when I OC them on closed-loop liquid cooling.
You may or may not use V-Sync while on 3D, as you choose. If you don't use it, you will get tearing.
One reason I am waiting for Maxwell now is to do a SLI and get around 70-80FPS at 3D vision, so I am a bit further up from the 60FPS boundary for V-Sync. This way I will not have screen stutter/lag, which is awesome.
I wonder if GTX 870s SLI will be enough, but I will see, hopefully it will give me the needed perfomance when I OC them on Closed Loop LC.
You are both correct.
GPU output has 2 phases. One phase outputs data from the GPU to 2 buffers. These 2 buffers output data to the screen. Whatever is in these buffers is what is actually being seen on the screen.
When we talk about full sync, the GPU is in sync with the buffers, and the buffers are in sync with the screen.
In 3D Vision mode, the driver ensures that the buffers are always in sync with the screen. It does not, however, force the GPU to be in sync with the buffers (unless VSync is enabled).
So although the buffers are synced to the screen (each frame of the screen receives alternating information from the 2 buffers, for the right and left eyes), the GPU is not synced with the buffers themselves.
This means that although you are perceiving a stereo image with the correct image for each eye, each eye is also perceiving tearing, caused by the GPU not being in sync with the buffers.
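A toy model of what I mean (illustrative Python only; the line counts and the moments the GPU finishes a frame are made up):
[code]
# The eye buffers stay locked to the refresh, but the GPU finishes
# renders on its own schedule, so a new frame can land mid-scanout.

SCANLINES = 8
# (refresh, scanline) at which the GPU happens to finish a new frame
GPU_FINISHES = {(1, 5), (2, 3)}

buffers = {"L": "L-frame0", "R": "R-frame0"}
frame = 0

for refresh in range(4):
    eye = "L" if refresh % 2 == 0 else "R"    # buffers <-> screen: in sync
    shown = []
    for line in range(SCANLINES):
        if (refresh, line) in GPU_FINISHES:   # GPU <-> buffers: not in sync
            frame += 1
            buffers[eye] = f"{eye}-frame{frame}"
        shown.append(buffers[eye])
    print(f"refresh {refresh} ({eye} eye): {shown}")

# Every refresh shows only one eye's buffer, but refreshes 1 and 2 change
# frame number partway down the screen -- the tearing each eye perceives.
[/code]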
I hope I'm clear. This is my understanding. Perhaps someone with more insight can post more information on the subject.
-- Shahzad
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.