3D vision and G-Sync, FreeSync, Adaptive-Sync, and more? Let's get them working!
Dear All,
VSync is a crucial part of gaming, especially 3D Vision gaming (or is it...?)
[color="orange"][i]Note: The below discussion takes into account triple buffering to be ON, as is the reality in almost all games whether DirectX or OpenGL, especially today. This means that for the purposes of the below discussion, FPS does NOT half if it drops below display refresh - this has not been the case for a long, long time.[/i][/color]
Many might not know why VSync is so important so here is the run-down:
1.
The GPU produces frames at inconsistent intervals - that is intrinsic to how GPUs work. When those frames are presented without any synchronisation, it shows up as tearing on the screen:
[img]https://camo.githubusercontent.com/eca45d7d1e44a2f78541c27fe49f5e28fe66be0e/68747470733a2f2f662e636c6f75642e6769746875622e636f6d2f6173736574732f333738393232362f3439323436372f34323630643034612d626162322d313165322d383731662d3338316536343163336538332e706e67[/img]
https://www.youtube.com/watch?v=6p5a21dawZ0
This is bad especially for 3D Vision because both eyes need to see the same image (albeit from different perspectives) for 3D Vision to work.
Side Note: Tearing becomes less noticeable at higher FPS.
To fix tearing, VSync was invented, which ensures that the GPU's buffer swaps are synchronised perfectly with the display's refresh > no tearing.
https://www.youtube.com/watch?v=yuZnsCAJBOI
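For the programmers among us, here is a minimal sketch of how a Direct3D 11 game asks for VSync - the SyncInterval argument of Present() is all it takes (the PresentFrame wrapper is just my illustration, not anyone's actual engine code):
[code]
// SyncInterval = 1 -> wait for the next vertical blank before showing the frame (no tearing).
// SyncInterval = 0 -> present immediately (tearing possible).
#include <d3d11.h>
#include <dxgi.h>

void PresentFrame(IDXGISwapChain* swapChain, bool vsyncOn)
{
    const UINT syncInterval = vsyncOn ? 1u : 0u;
    swapChain->Present(syncInterval, 0);
}
[/code]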
2.
The problem with V-Sync:
Because frames are now only presented in step with the display, this works great when the GPU can keep up with the display (i.e. 60fps/120fps/144fps etc.). This provides a consistently smooth experience.
A major problem occurs when the GPU is unable to keep up with the display, i.e. the FPS drops below, say, 60 FPS.
This causes a critical problem known as "stuttering" or "micro stutter", which even at high FPS makes the game feel and play like a mess. The GPU, although producing a high frame rate (e.g. 59 FPS), is producing those frames inconsistently, whenever it can, because it cannot keep up. One frame might arrive shortly after the previous one, but the next might arrive much later.
We can see this stuttering on a graph if we plot frame times from a game:
a. [VSync OFF] - Smooth as silk gameplay at consistent 7ms frame time:
[img]https://abload.de/img/fps_cap_200_low7huiz.png[/img]
b. [VSync ON] - Stuttery mess gameplay:
[img]https://abload.de/img/fps_cap_2001mjdx.png[/img]
Notice how the frame time goes from consistent 7ms to an erratic 7ms-30ms?
Yeah.
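If you want to produce graphs like these yourself, a simple frame-time logger is all you need. A rough sketch (the names, the placeholder render call and the 1.5x spike threshold are just my choices):
[code]
// Record the time between presented frames and flag sharp spikes (perceived as stutter).
#include <chrono>
#include <cstdio>
#include <vector>

int main()
{
    using clock = std::chrono::steady_clock;
    std::vector<double> frameTimesMs;
    auto last = clock::now();

    for (int frame = 0; frame < 1000; ++frame)
    {
        // RenderAndPresent();  // placeholder: the game's actual per-frame work goes here
        auto now = clock::now();
        double ms = std::chrono::duration<double, std::milli>(now - last).count();
        last = now;
        frameTimesMs.push_back(ms);

        // A frame taking more than 1.5x the previous one is the kind of spike
        // that shows up as the 7ms -> 30ms jumps in the graph above.
        if (frame > 0 && ms > 1.5 * frameTimesMs[frame - 1])
            std::printf("stutter at frame %d: %.2f ms\n", frame, ms);
    }
}
[/code]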
3.
So then we come to G-sync (nVidia) and FreeSync (AMD).
Both technologies make the display refresh in step with the GPU (variable refresh rate), so that when the GPU cannot keep up with the nominal refresh (below 60 FPS), each frame is still displayed the moment it is ready - frame pacing stays nearly as consistent as when the GPU can keep up (60 FPS locked). This means a huge improvement in the smoothness and experience of gameplay.
https://www.youtube.com/watch?v=ZSgHqImxQpE
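An idealised way to picture the difference (a toy model of my own - it ignores the triple-buffer queue mentioned at the top and is not how the driver actually works): a fixed-refresh panel can only change image on a vblank boundary, so each frame is held for a whole number of refresh periods, while a VRR panel holds each frame only for as long as the GPU needed to produce the next one.
[code]
#include <cmath>
#include <cstdio>

int main()
{
    const double refreshMs = 1000.0 / 60.0;                   // 16.67 ms fixed refresh
    const double renderMs[] = { 16.0, 19.0, 17.5, 22.0, 16.5 }; // GPU hovering just under 60 fps

    for (double r : renderMs)
    {
        // Fixed refresh + VSync: the frame stays on screen for a whole number
        // of refresh periods (16.7 ms or 33.3 ms here) -> uneven animation.
        double vsyncOnScreen = std::ceil(r / refreshMs) * refreshMs;
        // VRR: the panel holds the frame only until the next one is ready.
        double vrrOnScreen = r;
        std::printf("render %.1f ms -> on screen %.1f ms (VSync) vs %.1f ms (VRR)\n",
                    r, vsyncOnScreen, vrrOnScreen);
    }
}
[/code]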
Sadly, G-Sync (and FreeSync) are not supported by 3D Vision.
4.
So where does this leave us 3D Vision gamers?
Unfortunately, we are either forced to:
a. Mostly be locked to VSync and hope the FPS never dips below 60 FPS - but this is unrealistic: because 3D Vision requires twice the processing power, and because of the CPU bottleneck driver bug, a solid 60 fps lock can seldom be attained, especially in modern games.
b. Play with VSync off, in which case the game is a lot "smoother" - but the 3D effect degrades because each eye experiences frame tearing at a different part of the frame (Note: degrades, NOT impossible).
5.
Half way to salvation?
Before G-Sync, there was nVidia's brainchild - Adaptive V-Sync.
a. When the GPU was able to match the display (60 FPS), VSync was automatically enabled in the game.
b. When the GPU was NOT able to match the display, i.e. the FPS dropped below 60 FPS, VSync was automatically DISABLED.
[img]https://www.geforce.com/Active/en_US/shared/images/articles/introducing-the-geforce-gtx-680-gpu/AdaptiveVSync-2-650.png[/img]
This happens dynamically within the game, and it worked pretty well because you got "the best of both worlds" - perfect, consistent frames at the display refresh rate with no tearing whatsoever, and consistent frame times (albeit with tearing) below the refresh rate.
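In code terms, the behaviour described above boils down to something like the following rough sketch (my guess at the logic only - NVIDIA's driver does this behind the game's back, and its actual implementation is not public):
[code]
#include <d3d11.h>
#include <dxgi.h>

// Keep VSync on while the game holds the refresh rate; drop it as soon as
// the last frame took longer than one refresh period.
void AdaptivePresent(IDXGISwapChain* swapChain, double lastFrameMs, double refreshHz)
{
    const double refreshMs = 1000.0 / refreshHz;
    // At or above refresh rate: sync to vblank (no tearing).
    // Below refresh rate: present immediately (tearing, but no added stutter).
    const UINT syncInterval = (lastFrameMs <= refreshMs) ? 1u : 0u;
    swapChain->Present(syncInterval, 0);
}
[/code]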
Unfortunately, just like G-Sync, Adaptive VSync is NOT supported by 3D Vision - BUT it can be argued that there is no reason for it NOT to support 3D Vision...
Presumably, Nvidia's thinking was that this feature should be locked out for 3D Vision use because tearing would be detrimental to the 3D effect - and in theory they *might* have been correct.
In practice, however, I have found that playing with VSync OFF can be a huge improvement: it offers much smoother gameplay while minimising input lag and eliminating frame-time anomalies (hard stutter, microstutter), all while not disrupting the 3D effect [i]too[/i] noticeably.
I would, of course, rather have VSync ON when the FPS can sync with my display (i.e. can produce 60fps).
Hence the reason for this thread:
6.
[color="green"]My hope is that perhaps we can use this thread to discuss ways in which we might be able to get Adaptive VSync (or any advanced VSync related tech such as G-Sync) to work with 3D Vision, either through specific settings, hacks, bypasses, 3DMigoto, or anything else.[/color]
We can also discuss questions such as: what would happen if we hacked G-Sync ON with 3D Vision as it stands?
NVidia have disabled this because each eye would receive a slightly different image - the thinking being that the difference would kill the 3D Vision effect.
The only problem is that this assumption is 100% false and is simply bad science. They would be correct IF and ONLY IF the image were a static picture or the FPS were very low.
The reality is that one can receive different images per eye and STILL have a perfect 3D vision experience, as long as the image change rate (FPS) is decent enough.
[color="green"]HERESY! you might say. Liar! you might shout ;-)
I would say try it for yourself:
Put on your 3D Vision glasses and wave your hand in front of your face. Each eye is receiving a different image than the other. Look around your room.
Is the 3D Effect any less? NO!
Does it give you a headache? NO!
So, does receiving a different image in one eye compared to the other (at a decent FPS) compromise the 3D at all?
Simply, the answer is no, the 3D is perfect even though each eye is receiving a different "frame" in time. [/color]
7.
In conclusion:
For better, smoother, more enjoyable gameplay in ALL 3D Vision games, let's try and somehow make Adaptive-Sync, G-Sync, FreeSync, etc work with 3D Vision! :)
How one might do this, I have no idea. What I do know to be true is that if enabled, they would work just fine, contrary to what nVidia might believe, and there is no justifiable technical reason for these technologies not to have been left enabled for 3D Vision. It's another simple case of nVidia cutting off the nose because there is an itch on the tip.
What I do know is that forcing Adaptive VSync through nVidia Profile Inspector has never worked with 3D Vision, and I do not have a G-Sync display, but I suspect the outcome there would be the same...
This affects and would benefit all of us a great deal. Let's make this happen together!
Potential start: What happens if G-Sync is used with 3rd party stereoscopic 3D drivers, where nVidia can't blanket-disable G-Sync as it does with 3D Vision?
TriDef driver?
Old eDimensional driver?
Win3D stereoScopic driver?
iZ3D driver?
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
If that happened I would play with 3D glasses again (lately I only "test" with 3D Vision). My monitor is G-sync capable and "3D Vision + G-sync" (+solving CPU bottleneck) would be the salvation.
Adaptive vsync is also a good idea, at least, but I think it already "works" together with 3D Vision (at least with some games). I can enable it in the NVidia CP at the same time as 3D Vision, and I have noticed the difference in some games (Pro Evolution Soccer 2017 is a clear example: it is smoother with Adaptive vsync ON, and of course 3D Vision).
I have always wondered why Nvidia does not give us a complete in-game OSD (with the typical toggle key to show and hide it) listing all the parameters activated in the Nvidia CP, the same way MSI Afterburner does, so we know exactly what is going on when playing a specific game. Sometimes I don't even know whether G-Sync is activated, and I have to push the monitor OSD button to be sure. It would be nice to see how adaptive vsync automatically toggles vsync on/off depending on the framerate while playing, and to have a way to test that it is working properly. And of course it would be great to be able to change the Nvidia CP parameters on the fly without leaving the game.
Maybe the idea is not to give us too much, because the consumer must buy another video card, and there must be reasons for that.
There was one time where I could "enable" G-Sync while using 3D Vision. In practice, it didn't work. There was judder and 3D wasn't working, so I got no 3D and no G-Sync.
I always use vsync with the "forced on" setting. But yeah, when I don't get fps=Hz, it angers me :p. That's why I sometimes delay playing some games until I have a better GPU, or use a lower resolution, or 3D at 100Hz.
I think Adaptive VSync works with 3DVision in Frostbite3 games (I tried DA:I and BF1). Obviously you need to disable VSYNC in the game and use it via NV Control Panel.
I haven't tested other games, so it might not work. Would be interesting if somebody else sees the same thing (or maybe I only see it because I run 3D Surround).
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
Swap Interval FTW
[quote="Helifax"]I think Adaptive VSync works with 3DVision in Frostbite3 games (I tried DA:I and BF1). Obviously you need to disable VSYNC in the game and use it via NV Control Panel.
I haven't tested other games, so it might not work. Would be interesting if somebody else sees the same thing (or maybe I only see it because I run 3D Surround).[/quote]
Shouldn't you be using Smooth Vsync, designed for SLI users?
Overheating can be another reason to use VSync. I have encountered some older games where, with VSync OFF, my 4K frame rates were well above the 60 Hz limit of my passive TV display, and my Titan GPU happily ran up to whatever thermal limit was set in Afterburner - occasionally crashing when this limit was set as high as 90 C. I found it was much better to enable VSync in these games - and to lower the thermal limit to 80 C or less.
Thanks for the input guys.
@Helifax: As far as I can tell, adaptive sync only turns on VSync with 3D Vision - there is no tearing when FPS is < refresh unfortunately. I have just tried ME:A with your fix to be certain. Maybe it is a surround gaming thing... the plot thickens...
@D-Man11: Smooth Sync was a solution to a problem which no longer exists, from back when FPS dropped to half the refresh rate whenever the GPU couldn't maintain FPS=Refresh. For very many years now, both OpenGL and DirectX have had their own ways of rendering using triple buffering and render-ahead, and more recently fast sync, which emulates OpenGL triple buffering in DirectX for most purposes. Nowadays, smooth sync is a hindrance rather than a positive, as it locks the FPS down to 30 fps as soon as you go 1 fps below 60 :(
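To illustrate why the old double-buffered behaviour snapped to 30 fps while triple buffering does not, here is a toy timeline (my own simplification - not how any driver actually schedules things):
[code]
#include <cmath>
#include <cstdio>

int main()
{
    const double refreshMs = 1000.0 / 60.0; // 16.67 ms
    const double renderMs  = 18.0;          // GPU slightly too slow for 60 fps
    const int    frames    = 5;

    // Double buffering: the GPU cannot start frame N+1 until frame N has been
    // handed to the display at a vblank, so each frame ends up costing 2 refreshes.
    double t = 0.0;
    for (int i = 0; i < frames; ++i)
    {
        t += renderMs;                               // finish rendering
        t  = std::ceil(t / refreshMs) * refreshMs;   // wait for the vblank to swap
        std::printf("double-buffered: frame %d swapped at %.1f ms\n", i, t); // 33.3 ms apart = 30 fps
    }

    // Triple buffering: the GPU renders into a spare buffer without waiting,
    // so frames keep completing every 18 ms (~55 fps) regardless of vblank timing.
    t = 0.0;
    for (int i = 0; i < frames; ++i)
    {
        t += renderMs;
        std::printf("triple-buffered: frame %d finished at %.1f ms\n", i, t);
    }
}
[/code]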
@masterotaku: That's very interesting. I think the current 3DV+GSync interaction is likely buggy - IMO, what you probably experienced was not a dead end per se, but merely a small road block that ought to be fixable relatively easily.
Has anyone ever tried the Tridef driver with G-Sync?
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
[quote="RAGEdemon"]@D-Man11: Smooth Sync was a solution to a problem which no longer exists, back when FPS dropped to half the refresh rate when GPUs couldn't maintain FPS=Refesh. For very many years now, both OpenGL and DirectX have their own ways of rendering using triple buffering and render ahead, and more recently, fast sync, which emulates OpenGL triple buffering in DirectX for most purposes. Nowadays, smooth sync is a hindrance rather than a positive as it locks the FPS down to 30 fps as soon you go 1 fps below 60 :( [/quote]
The way I understood it is that Smooth Sync is a variation of adaptive v-sync, but only for SLI users. It copes with the inherent microstutter that can be associated with SLI.
Fastsync does not work with 3D or SLI.
I do not have SLI, so I have no first hand experience. But when I was considering it over a single GPU solution, I did some reading on it.
@RAGEdemon
I don't know about tridef, but using PCSX2 in the old anaglyph/SBS/TAB mode (no 3D in the drivers) works fine with G-Sync. But it's a passive type of 3D, so it doesn't surprise me.
The question is whether the 3D Vision glasses themselves would be able to support a variable refresh rate. They have shown a great static range (a minimum of 32Hz per eye and at the very least up to 82.5Hz per eye).
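The arithmetic behind those per-eye figures is simply half the display refresh, since active shutter glasses serve each eye on alternate refreshes (the 64 Hz and 165 Hz entries below are only my guess at where the 32 and 82.5 figures come from):
[code]
#include <cstdio>

int main()
{
    const double displayHz[] = { 64.0, 100.0, 120.0, 165.0 }; // example display refresh rates
    for (double hz : displayHz)
        std::printf("display %.1f Hz -> %.2f Hz per eye\n", hz, hz / 2.0);
}
[/code]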
That's a very interesting point, masterotaku. Do we know if a passive 3D G-Sync display exists which might bypass this limitation?
@D-Man11, thanks for the info. I have just tried it. Results: it no longer locks to half refresh as it did in the old days. Unfortunately, it still syncs with the refresh even below the refresh rate, meaning no tearing is visible - which in our particular case is a bad thing, as for the purposes of this discussion we want smooth fps transitions over tearing. That it now acts like standard VSync was a pleasant surprise though; it needs more testing. For clarification, I used the setting "VSync Smooth AFR Behavior".
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
Adaptive sync must be doing something more than just preventing vsync when below 60 fps. The game where I notice it most is Pro Evolution Soccer 2017 (as I said before). I normally play this game with occasional stutter when the fps goes under 60, but using adaptive sync lets me play smoothly all the time and without tearing (or maybe the tearing is just not noticeable to my eye because the fps stays very close to 60 when that happens... but I don't think tearing works that way).
This game seems to be a bit "special" in terms of sync, but what is very clear is that adaptive sync makes a big difference in terms of smoothness, and I am always referring to playing with 3D glasses. So vsync on is not exactly the same as adaptive sync when using the glasses, at least with some games.
@Duerf: the way adaptive sync works is that it disables VSync at < 60 FPS. If you are not seeing tearing below 60, then something "special" seems to be happening with this game and your setup... very interesting! Does this happen with other games too?
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
I believe the reason why Nvidia does not allow 3D Vision and G-Sync simultaneously is that managing the active shutter glasses with variable timing would be impossible.
The problem might be technical (the current glasses not being designed for that kind of sync, and Nvidia not wanting to spend millions of $ redesigning the transmitter and glasses).
It might also be psychovisual: what happens when one eye (always the same one) sees the picture for longer or shorter than the other eye? (How bad would the eyestrain be?)
A possible solution to that could require a total redesign of the 3D vision hardware.
Passive systems would have a much better chance at transitioning to variable refresh rates since they display both eye views at the same time.
On my system (dual projectors), back when I used an AMD card and drove the projectors directly from the computer (dual output), I spent quite a lot of time dealing with this issue...
The best and most reliable results are when transmitting side-by-side.
SBS transmission allows you to do anything you want with V-sync without affecting the quality of 3D.
I've had every type of V-sync working with TriDef and SBS mode, except the variable refresh rates... due to the lack of hardware. (Are there any variable-refresh projectors out there? I haven't seen any.)
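For reference, the reason SBS is so forgiving is that both eye views travel inside the same frame, so whatever the display timing does (tearing, VRR, anything), every scanline always carries matching left/right content. A rough sketch of the packing (my own illustration of full-width SBS, not TriDef's actual code):
[code]
#include <cstdint>
#include <vector>

struct Image { int w, h; std::vector<uint32_t> px; };

// Pack a left and right view (assumed to have identical dimensions) into one
// double-width side-by-side frame: left view in the left half, right view in the right half.
Image PackSideBySide(const Image& left, const Image& right)
{
    Image out{ left.w * 2, left.h, std::vector<uint32_t>(size_t(left.w) * 2 * left.h) };
    for (int y = 0; y < left.h; ++y)
        for (int x = 0; x < left.w; ++x)
        {
            out.px[size_t(y) * out.w + x]          = left.px [size_t(y) * left.w  + x];
            out.px[size_t(y) * out.w + left.w + x] = right.px[size_t(y) * right.w + x];
        }
    return out;
}
[/code]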
Passive 3D forever
110" DIY dual-projection system
2x Epson EH-TW3500 (1080p) + Linear Polarizers (SPAR)
XtremScreen Daylight 2.0
VNS Geobox501 signal converter
[quote="RAGEdemon"]@D-Man11, thanks for the info.[/quote]
I know that you use DSR; perhaps in demanding games you might be better served using Nvidia's FXAA solution for SLI.
Also MFAA is available for Pascal and Maxwell owners and can be used in almost any game that has a 2x/4x MSAA option in the game's options/settings.
As far as FreeSync goes, I do not think there is any possibility there, because those monitors have an additional module installed from AMD. This module is not supported by Nvidia, so it is most likely a no-go. But then again, there may be a hack somewhere, by someone, that adds that support.
But I still disagree with your assumption that the frames are sent out one after the other and not in identical pairs offset for stereo.
Hello mate,
The problem I'm facing isn't one of the GPU not having enough power, but of the CPU not having enough IPC/clock power. With 2x 1080s in SLI, it doesn't matter in most games whether you use DSR 1600p, MFAA/FXAA, or no AA at all @640x480 - the GPUs are never near even 90% utilisation because the CPU can't supply them with enough frames to process.
[quote="D-Man11"]
But I still disagree with your assumption that the frames are sent out one after the other and not in identical pairs offset for stereo.[/quote]
No one says otherwise, mate. What I am saying is this: although at present the frames are sent over as stereo "pairs" one after the other (standard 3D Vision page-flipping), or as a single half-resolution frame for SBS/OU, or both together in a single double-length frame for some technologies, *if they were instead sent out one after the other in time as well (not as a pair, but with every frame moved slightly along in time), 3D would still work just fine - and I have given an example above of a real-life test you can do to prove it to yourself.
*provided the FPS is high enough - this would vary from person to person. At 0 FPS, for example, it wouldn't work at all, because you would have a stereo pair whose images are different. At 10 fps? Maybe. At 60 fps (looking around your seating area with the 3D Vision glasses active), it most definitely works.
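To make the distinction concrete, here is a toy printout (my own illustration only) of the two delivery schedules on a 120Hz display - the paired schedule we have today versus the time-offset schedule argued for above:
[code]
#include <cstdio>

int main()
{
    const double refreshMs = 1000.0 / 120.0;  // 8.33 ms per refresh, eyes alternate

    std::printf("Paired delivery (current page-flipped 3D Vision):\n");
    for (int i = 0; i < 4; ++i)
        std::printf("  refresh %d: %s eye, rendered at game time %5.1f ms\n",
                    i, (i % 2 == 0) ? "left " : "right", (i / 2) * 2 * refreshMs);

    std::printf("Time-offset delivery (what is argued above would still look 3D):\n");
    for (int i = 0; i < 4; ++i)
        std::printf("  refresh %d: %s eye, rendered at game time %5.1f ms\n",
                    i, (i % 2 == 0) ? "left " : "right", i * refreshMs);
}
[/code]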
All the best :)
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.