3D Vision CPU Bottleneck: Information-Gathering Thread
AMD FX-8350
GTX 1060 6GB - 370.90
Windows 10 x64 Anniversary

The stutter and drops only happen while running and turning around, so I tried to average the results.

Game: Mirror's Edge: Catalyst (DX11), CM mode - while running
NO Stutter
3D Vision disabled in Control panel
66% CPU usage
44% GPU usage
160 FPS

3D Vision enabled in Control Panel - 3D Toggled ON:
Heavy Stutter
60% CPU usage
45% GPU usage
124 FPS

3D Vision enabled in Control Panel - 3D Toggled OFF:
Stutter
63% CPU usage
44% GPU usage
141 FPS


Summary:
3D Vision disabled, 2D performance: 100%
3D Vision enabled, 2D performance: (141 FPS / 160 FPS) x 100 = 88%
3D Vision enabled, 3D performance: (124 FPS / 160 FPS) x 100 = 78%

100 - 88 = 12% performance drop caused by the 3D Vision driver
100 - 78 = 22% performance drop caused by the 3D Vision driver with 3D enabled in game
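For anyone who wants to sanity-check the arithmetic, the summary above boils down to a couple of lines of Python (a sketch; the FPS values are the ones measured above, the function name is mine):

```python
# Sketch: reproduce the summary percentages from the measured FPS values.
baseline_fps = 160    # 3D Vision disabled in Control Panel (2D)
fps_2d_enabled = 141  # 3D Vision enabled, 3D toggled OFF
fps_3d_on = 124       # 3D Vision enabled, 3D toggled ON

def relative_performance(fps, baseline):
    """Performance as a percentage of the 2D baseline."""
    return fps / baseline * 100

perf_2d = relative_performance(fps_2d_enabled, baseline_fps)
perf_3d = relative_performance(fps_3d_on, baseline_fps)

print(f"Drop with driver enabled (3D off): {100 - perf_2d:.0f}%")  # 12%
print(f"Drop with 3D enabled in game:      {100 - perf_3d:.0f}%")  # 22%
```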

terintamel said:
AMD FX-8350
GTX 1060 6GB - 370.72
Windows 10 x64

This game was a hard one, as the stutter and drops only happen while running and turning around, so I tried to average the results.

Game: Mirror's Edge: Catalyst (DX11), CM mode - while running
NO Stutter
3D Vision disabled in Control panel
69% CPU usage
42% GPU usage
145 FPS

3D Vision enabled in Control Panel - 3D Toggled ON:
Heavy Stutter
45% CPU usage
45% GPU usage
41 FPS

3D Vision enabled in Control Panel - 3D Toggled OFF:
Stutter
51% CPU usage
37% GPU usage
71 FPS


Summary:
3D Vision disabled, 2D performance: 100%
3D Vision enabled, 2D performance: (71 FPS / 145 FPS) x 100 = 48.9%
3D Vision enabled, 3D performance: (41 FPS / 145 FPS) x 100 = 28.2%

100 - 48.9 = 51.1% performance drop caused by the 3D Vision driver
100 - 28.2 = 71.8% performance drop caused by the 3D Vision driver with 3D enabled in game

AMD FX-8350 4GHz
Gigabyte 990FXA-UD3 Rev 4.0
G-Skill PC3-10700- 16GB
Gigabyte Windforce GTX 1060 OC 6GB - 417.01
Creative Soundblaster Z
ViewSonic VX2268WM Black 22" 1680x1050 5ms 120Hz 3Dvision
Windows 10 x64 1709

#76
Posted 09/23/2016 10:11 PM   
Hi fellas,

It would be best to keep you guys updated on what's happening. That way, we can discuss their / our findings and there is less chance of me overlooking something.

This is the progress of the communication so far. I am copy-pasting from my emails, so all formatting has been removed. For readability, I will colour my emails in orange, and nVidia's emails in green.

Hi Ray,

Myself and the folks over at the nVidia 3D Vision forum have been investigating what seems to be a CPU bottleneck problem. It's a little complex in its manifestation, but definitely reproducible. The end result is that most, if not all, games take a severe performance hit because of it. I was hoping you could investigate.

We have compiled a good set of results in a thread. I would be happy to explain what happens, but the person who is assigned would really have to be clued in on bottlenecks, CPU usage, GPU scaling, etc., as the idea seems to be quite difficult for the uninitiated to understand :(

What is the best way to proceed?

We would rather not go through tier 1 support, as it's a huge waste of time and effort :)

Kind regards,
-- Shahzad


Reply from nVidia:

Hi Shahzad,

If you can elaborate on the issue, and provide all the details and data on how to re-create the issue, I'll be happy to submit a bug so our development team can investigate. As with any bug the key to fixing the problem is our ability to replicate the failure in our lab, that way we have a failing system we can provide to development team to debug the failure. Once I have more details of the actual problem I can request additional details to help our lab to secure a failing setup.

Best regards,
Ray

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#77
Posted 10/21/2016 03:26 AM   
Thank you Ray,

I'll preface this by saying that most people who are not knowledgeable won't see a problem, as we have found out. The last thing we want is for you guys to waste your time and come back to us with "there is no problem" simply because it hasn't been understood. Towards that end, I'll have to give some background on the testing methodology. The problem has been confirmed by many users. Please bear with us :)

Premise:
3D Vision should have next to no CPU overhead, but testing shows that not only does it have overhead, it has it to a significant degree.

Fact: Low resolution / graphics settings place about the same load on the CPU as high resolution and high graphics settings. The only load that changes is on the GPU.

Problem:
Even when the GPU is not bottlenecking performance (let's say GPU usage in an arbitrary game is below 90% in both 2D and 3D Vision), the CPU usage significantly DECREASES when 3D Vision is enabled in almost all games. 3D Vision seems to cause a CPU bottleneck.

If a game uses multi-threading (as most games do nowadays), we believe that 3D Vision adds some kind of pipeline overhead to the main game thread, so the main thread can't pass off its workload to the other CPU game threads as efficiently, and the overall CPU usage with 3D Vision ON decreases. This in turn means the CPU is no longer able to supply enough data to the GPU for processing, which results in hugely degraded and stuttery FPS.

This is a problem which will only get worse for the 3D Vision community going forward. The vast majority of games are designed for consoles, which get more powerful with every generation. Unfortunately, PC CPU performance has stagnated over the past 7 years (only a ~10 percent increase per generation, with generations years apart). As new games are designed for more powerful consoles, no amount of reducing settings and tweaking will give us 3D Vision users playable FPS.

The problem is worse for games that use more threads. For example, a game which uses 5-6 cores, such as GTA5, is limited to the performance of 3 cores, which results in a 65% performance decrease.

The CPU bottleneck caused by 3D Vision is confirmed because you can overclock/underclock the CPU alone while everything else in the system remains constant. With 3D Vision ON, the game FPS and GPU usage will scale with the overclock / underclock of the CPU.

Some games of note which exhibit the CPU bottleneck with 3D Vision (tested while the GPU usage remains well below saturation):

GTA5 (65% performance decrease)
The Witcher 3 (50% performance decrease)
Call of Duty: Advanced Warfare (72% performance decrease)
Deus Ex: Mankind Divided (53% performance decrease)



Standard Background information which applies to even 2D games:

1. CPU core usage Fundamentals
Taking an arbitrarily chosen game as an example: if that game is programmed to only use 2 threads, and there is no GPU bottleneck (game settings all set to low so that GPU usage is less than 90% in all scenarios), then:
2 Core CPU = 100% usage when CPU becomes saturated and becomes a bottleneck
4 Core CPU = 50% usage when CPU becomes saturated and becomes a bottleneck
6 core CPU = 33% usage when CPU becomes saturated and becomes a bottleneck
8 core CPU = 25% usage when CPU becomes saturated and becomes a bottleneck
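The table above follows from a simple relationship: each game thread can occupy at most one physical core. As a sketch (the function name and loop are mine, purely illustrative):

```python
def max_total_cpu_usage(game_threads: int, physical_cores: int) -> float:
    """Highest total CPU usage a game can show when its threads are
    saturated: each thread can keep at most one core busy."""
    busy_cores = min(game_threads, physical_cores)
    return busy_cores / physical_cores * 100

# Reproduce the table above for a 2-thread game:
for cores in (2, 4, 6, 8):
    print(f"{cores}-core CPU = {max_total_cpu_usage(2, cores):.0f}% "
          "usage when the CPU is saturated")
```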

2. GPU usage scaling with CPU core usage Fundamentals
[Preface: the numbers below are ideal cases, and do not perfectly represent real-life scenarios, but they are close approximations]


For example,
A 2-thread game, taking the example of the 4-core CPU from above:
"4 Core CPU = 50% usage when CPU becomes saturated and becomes a bottleneck"

Scenario 1: CPU bottlenecking the GPU and FPS.
Game = Some_Game_Name
Resolution = 1080p
CPU usage = 50%
GPU usage = 50%
FPS = 100FPS (CPU is bottlenecking the GPU)

If SLI, GPU usage on each card = ~25%
FPS = Still 100 fps (CPU is bottlenecking the GPU)

Scenario 2: no CPU or GPU bottleneck.
Upping the resolution to 1440p
CPU usage = 50%
GPU usage = 100%
FPS = 100FPS (no bottleneck, either in the CPU or the GPU)

If SLI, GPU usage on each card = ~50%
FPS = Still 100 fps (CPU is bottlenecking the GPU)

Scenario 3: GPU bottlenecking the CPU and FPS.
Upping the resolution to 4k
CPU usage = 25%
GPU usage = 100%
FPS = 50FPS (GPU is the bottleneck, holding CPU usage down)

If SLI,
CPU usage = 50%
GPU usage on each card = 100%
FPS = 100 fps (no bottleneck, either in the CPU or the GPU)

Scenario 1: In the 1080p scenario, the 4-core CPU is being maxed out by the 2 game threads at 50% CPU usage. This is the most the game can use the CPU. We know this because the GPU is below saturation (i.e. < 95% or so). Here, no matter how powerful your graphics card, or how many cards you have in SLI, you will never get anything higher than 100 FPS because the CPU is holding the GPUs back. (Side note: if you overclock/underclock the CPU, the GPU usage as well as the FPS will increase/decrease linearly with the CPU clock speed!)

Scenario 2: The 1440p scenario is where we have 'doubled' the resolution. The CPU is still being utilised at its 50% max. Doubling the resolution puts no extra strain on the CPU but double the strain on the GPU, so the GPU will go to 100%, while your FPS stays at 100. Adding SLI here will drop the GPUs down to 50% each, but your FPS will still be the same because the CPU then becomes the bottleneck again.


Scenario 3: Now look at what happens when you finally hit a GPU bottleneck.
The 4K scenario is where we have quadrupled the resolution. Because of this huge resolution, the GPU is the new bottleneck. The GPU can't crunch numbers fast enough, so the CPU (remember, increasing resolution or graphics settings has no significant impact on the CPU) has nothing to do half the time, and the CPU usage is very low, say 25%. Quadrupling the resolution puts quadruple the strain on the GPU: your single card will be at 100%, but of course it can only produce 50 FPS. (Side note: if you overclock or underclock the CPU, the GPU usage and FPS will NOT change, because the CPU is already sitting idle half the time! However, the CPU usage reading will change, because the game threads are processed in less (or more) CPU time due to the overclock (or underclock)!)

NOW, if you add SLI, both cards will be at 100%, and your CPU can finally stretch its muscles because there is no GPU bottleneck, so it too can work at its full capacity for the game (CPU at 50% usage). This means you get double the FPS: 100 FPS.
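All three scenarios fall out of one rule: FPS is the lower of what the CPU can feed and what the GPUs can render. A sketch, with made-up throughput numbers chosen only to match the worked examples above:

```python
def fps(cpu_fps_cap: float, gpu_fps_cap_per_card: float, num_gpus: int = 1) -> float:
    """Frame rate is limited by whichever side is slower: the CPU's
    ability to prepare frames, or the GPUs' ability to render them."""
    return min(cpu_fps_cap, gpu_fps_cap_per_card * num_gpus)

CPU_CAP = 100     # the 2-thread game saturates the 4-core CPU at 100 FPS
GPU_1080P = 200   # assume one card could render 200 FPS at 1080p

print(fps(CPU_CAP, GPU_1080P))         # Scenario 1 (1080p): 100, CPU-bound
print(fps(CPU_CAP, GPU_1080P / 2))     # Scenario 2 (1440p): 100, balanced
print(fps(CPU_CAP, GPU_1080P / 4))     # Scenario 3 (4K):     50, GPU-bound
print(fps(CPU_CAP, GPU_1080P / 4, 2))  # Scenario 3 + SLI:   100 again
```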


This is basic knowledge about CPU/GPU performance and scaling in 2D games without 3D Vision installed. If you don't understand this, you will definitely not be able to understand what is happening in our tests in this thread.

This is very important to understand, as 3D Vision ON, for all intents and purposes, has an extremely similar performance impact to doubling the game resolution.
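To make the three scenarios concrete, here is a toy sketch in Python. All the FPS caps are invented illustrative numbers chosen to match the worked example above, not measurements: frame delivery is simply gated by whichever of the CPU or GPU is slower.

```python
# Toy model of the scenarios above. All numbers are illustrative
# assumptions chosen to match the worked example, not measurements.

def delivered_fps(cpu_fps_cap, gpu_fps_cap):
    """Each frame needs both CPU and GPU work, so the slower
    side of the pipeline sets the frame rate."""
    return min(cpu_fps_cap, gpu_fps_cap)

CPU_CAP = 100      # the 2 game threads max out at 100 FPS of CPU work
GPU_1080P = 200    # what the GPU could render at 1080p, unconstrained

# Scenario 1 (1080p): CPU-bound, GPU loafing.
assert delivered_fps(CPU_CAP, GPU_1080P) == 100

# Scenario 2 (1440p): double the pixels halves the GPU cap; still 100 FPS.
assert delivered_fps(CPU_CAP, GPU_1080P / 2) == 100

# Scenario 3 (4K): quadruple the pixels; now the GPU is the bottleneck.
assert delivered_fps(CPU_CAP, GPU_1080P / 4) == 50

# Scenario 3 with SLi: doubling GPU capacity restores 100 FPS.
assert delivered_fps(CPU_CAP, 2 * (GPU_1080P / 4)) == 100
```

The same `min()` logic is why 3D Vision ON behaves like a resolution doubling: it halves the effective GPU cap while, in theory, leaving the CPU cap untouched.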

=================

Testing Methodology:
It's simple! All we are trying to do here is examine what the CPU usage and FPS look like when the GPU is not saturated (less than 90% load in both 2D and 3D). We have to keep in mind that 3D Vision doubles the GPU load.

How do we do this? Simple! Use the lowest resolution and graphics settings available in a given game.

Since we are measuring CPU usage in multithreaded games, the problem is most apparent on a test rig with several (six or so) physical cores, a CPU clocked at around 2 GHz, and Hyper-Threading disabled. As for the GPU, to ensure it stays below saturation in tests (even with all graphics settings at their lowest), we have used systems with powerful GPU setups such as 1080 SLi etc.

The best game to test this with is GTA5, as it is heavily multithreaded, and can use a 6 Core CPU properly.

[nVidia] Driver revisions: We don't know at which driver revision this problem started, but it persists up to the latest revision.

Install CPU/GPU in-game monitoring software such as MSI Afterburner, and enable display of:
a. Average CPU usage
b. Average GPU usage
c. Average FPS

Load up a game, go to a CPU intensive area where your 3D Vision FPS is:
1. Less than the game FPS cap
2. Less than half your VSync rate, e.g. for a 120Hz monitor, ensure you are getting less than 60 FPS in 3D. This ensures your system performance isn't being capped, as your CPU and GPU will not work any harder once the game reaches a 60 FPS cap.

Toggle 3D Vision to OFF and note down:
1. CPU usage
2. GPU usage
3. FPS

Then

Toggle 3D Vision to ON and note down:
1. CPU usage
2. GPU usage
3. FPS

To stress again, neither of these should have the GPU at >90%, or the 3D FPS anywhere near 60.

If there is NO problem, then what you should see is that the CPU usage remains the same with 3D Vision ON as in 2D, while the GPU usage increases significantly (3D Vision doubles the stress on the GPU). The FPS should remain about the same in 2D and 3D, as the GPU is not the bottleneck.

Example using a 4 core system with a 2 thread game:
Game Name: Some_Game_Name
Toggle 3D Vision to OFF:
50% CPU usage
40% GPU usage
50 FPS

Toggle 3D Vision to ON:
50% CPU usage
80% GPU usage <--- double GPU usage as 3D Vision causes about double the load on the GPU.
50 FPS

Notice how the GPU is working twice as hard in 3D Vision, but you are still getting the same FPS. This shows the 3D Vision driver working perfectly, without causing any bottlenecks.

Unfortunately, what you will likely find is that the results are more like:

Game Name: Some_Game_Name
Toggle 3D Vision to OFF:
50% CPU usage
40% GPU usage
50 FPS

Toggle 3D Vision to ON:
33% CPU usage
47% GPU usage
22 FPS


Here, the 3D Vision driver has bottlenecked the CPU, which in turn cannot feed the GPU enough data, so you get severely degraded FPS.

This is NOT a GPU under-utilisation problem. Overclocking the CPU alone makes the GPU usage and game FPS scale up and down with the CPU clock speed.

So far, the only way around the issue is brute force: use the latest Intel processors at a heavy overclock, which will increase performance to a certain degree. Unfortunately, this is not a viable solution, especially going into the future.

Results from multiple users with various different systems all show that with 3D Vision, even when the GPU is not being saturated, for some strange reason, the CPU usage DECREASES, which leads to low FPS and GPU usage:

==========================
Game: Grand Theft Auto 5
Toggle 3D Vision to OFF:
80% CPU usage
98% GPU usage
138 FPS

Toggle 3D Vision to ON:
45% CPU usage
48% GPU usage
50 FPS

Summary:
2D Performance 100%
3D Performance (50fps / 138fps) x 100 = 36%

CPU usage drop in 3D Vision = [(2D 80% CPU usage) - (3D 45% CPU usage)] / (2D 80% CPU usage) x 100 = 44% CPU usage drop caused by 3D Vision

100-36 = 64% performance drop due purely to this CPU bottleneck caused by 3D Vision.
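For reference, the percentage arithmetic used in these summaries can be written out in Python (a hypothetical helper for checking the numbers, not part of any tool):

```python
def summarize(fps_2d, fps_3d, cpu_2d, cpu_3d):
    """Return (3D performance as % of 2D, performance drop %, CPU usage drop %)."""
    perf_3d = fps_3d / fps_2d * 100           # 3D performance relative to 2D
    perf_drop = 100 - perf_3d                 # overall performance drop
    cpu_drop = (cpu_2d - cpu_3d) / cpu_2d * 100   # relative CPU usage drop
    return perf_3d, perf_drop, cpu_drop

# The GTA5 numbers above:
perf_3d, perf_drop, cpu_drop = summarize(fps_2d=138, fps_3d=50,
                                         cpu_2d=80, cpu_3d=45)
assert round(perf_3d) == 36    # 3D performance: 36% of 2D
assert round(perf_drop) == 64  # 64% performance drop
assert round(cpu_drop) == 44   # 44% CPU usage drop
```

The same helper reproduces the summaries for the other games below when fed their respective FPS and CPU figures.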

We also ran tests with core affinity to see whether the number of cores the game could use was affected. These results show the problem in more detail. Individual core usage analysis, set via core affinity in Task Manager, at different CPU clocks:

Intel Xeon x5660 @ 4.2GHz, 3D Vision enabled
1 cores = 12 fps
2 cores = 40 fps, GPU1 @ 35%, GPU2 @ 46%, Total (Average) GPU Usage = 40%
3 cores = 50 fps, GPU1 @ 37%, GPU2 @ 60%, Total (Average) GPU Usage = 48%
4 cores = 50 fps, GPU1 @ 37%, GPU2 @ 60%, Total (Average) GPU Usage = 48%
5 cores = 50 fps, GPU1 @ 37%, GPU2 @ 60%, Total (Average) GPU Usage = 48%
6 cores = 50 fps, GPU1 @ 37%, GPU2 @ 60%, Total (Average) GPU Usage = 48%

3D Vision Toggled Off:
6 cores 2D = 138fps 97% 98% (toggled off), Total (Average) GPU Usage = 97%


Intel Xeon x5660 @ 2.4GHz
1 cores = 7 fps
2 cores = 25 fps, GPU1 @ 27%, GPU2 @ 46%, Total (Average)GPU Usage = 36%
3 cores = 36 fps, GPU1 @ 33%, GPU2 @ 46%, Total (Average)GPU Usage = 39%
4 cores = 38 fps, GPU1 @ 35%, GPU2 @ 47%, Total (Average)GPU Usage = 41%
5 cores = 38 fps, GPU1 @ 35%, GPU2 @ 47%, Total (Average) GPU Usage = 41%
6 cores = 38 fps, GPU1 @ 35%, GPU2 @ 47%, Total (Average) GPU Usage = 41%

3D Vision Toggled Off:
6 cores 2D = 100fps, GPU1 @ 65%, GPU2 @ 65% (toggled off), Total (Average) GPU Usage = 65%

Clearly, the game performance and GPU usage scale up purely with the CPU clock speed. Given that the GPU isn't saturated, this shows it is a CPU bottleneck.
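For anyone wanting to script the affinity sweep instead of clicking through Task Manager, here is a rough sketch. Note the caveats: `os.sched_setaffinity` is Linux-only (on Windows the same pinning needs Task Manager or a third-party package such as psutil), the `GAME_PID` loop is hypothetical, and the FPS/GPU readings still have to come from an overlay such as MSI Afterburner.

```python
# Scripted version of the manual Task Manager affinity sweep above.
# Linux-only via os.sched_setaffinity; pid 0 means "the calling process".
import os

def limit_to_cores(pid, n_cores):
    """Pin a process to cores 0..n_cores-1 and return the new core set."""
    os.sched_setaffinity(pid, set(range(n_cores)))
    return os.sched_getaffinity(pid)

# Hypothetical sweep over a running game with process id GAME_PID:
# for n in range(1, 7):
#     limit_to_cores(GAME_PID, n)
#     input(f"Pinned to {n} core(s); note FPS/GPU usage, press Enter...")
```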


==========================
==========================
Game: Call of Duty: Advanced Warfare
Toggle 3D Vision to OFF:
60% CPU usage
71% GPU usage
171 FPS

Toggle 3D Vision to ON:
25% CPU usage
26% GPU usage
48 FPS

Summary:
2D Performance 100%
3D Performance (48fps / 171fps) x 100 = 28%

CPU usage drop in 3D Vision = [(2D 60% CPU usage) - (3D 25% CPU usage)] / (2D 60% CPU usage) x 100 = 58% CPU usage drop caused by 3D Vision


100-28 = 72% performance drop due purely to this CPU bottleneck caused by 3D Vision.


Individual core usage analysis by setting the core affinity in task manager:

Intel Xeon x5660 @ 4.4GHz, 6 physical cores, Hyper Threading OFF.
1 cores 3D Vision = 26 fps, GPU1 @ 14% + GPU2 @ 12%, Total GPU Usage = 13%
2 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%
3 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%
4 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%
5 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%
6 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%

1 cores 2D = 51 fps, GPU1 @ 25% + GPU2 @ 24% (toggled off), Total GPU Usage = 25%
2 cores 2D = 101fps, GPU1 @ 41% + GPU2 @ 43% (toggled off), Total GPU Usage = 42%
3 cores 2D = 142fps, GPU1 @ 64% + GPU2 @ 65% (toggled off), Total GPU Usage = 65%
4 cores 2D = 171fps, GPU1 @ 71% + GPU2 @ 71% (toggled off), Total GPU Usage = 71%
5 cores 2D = 171fps, GPU1 @ 71% + GPU2 @ 71% (toggled off), Total GPU Usage = 71%
6 cores 2D = 171fps, GPU1 @ 71% + GPU2 @ 71% (toggled off), Total GPU Usage = 71%

==========================
==========================
Game: The Witcher 3
Toggle 3D Vision to OFF:
45% CPU usage
63% GPU usage
82 FPS

Toggle 3D Vision to ON:
26% CPU usage
58% GPU usage
41 FPS

Summary:
2D Performance 100%
3D Performance (41fps / 82fps) x 100 = 50%

CPU usage drop in 3D Vision = [(2D 45% CPU usage) - (3D 26% CPU usage)] / (2D 45% CPU usage) x 100 = 42% CPU usage drop caused by 3D Vision

100-50 = 50% performance drop due purely to this CPU bottleneck caused by 3D Vision.
==========================
==========================

Game: Far Cry 3
Toggle 3D Vision to OFF:
53% CPU usage
47% GPU usage
88 FPS

Toggle 3D Vision to ON:
38% CPU usage
49% GPU usage
35 FPS

Summary:
2D Performance 100%
3D Performance (35fps / 88fps) x 100 = 40%

CPU usage drop in 3D Vision = [(2D 53% CPU usage) - (3D 38% CPU usage)] / (2D 53% CPU usage) x 100 = 28% CPU usage drop caused by 3D Vision

100-40 = 60% performance drop due purely to this CPU bottleneck caused by 3D Vision.
==========================
==========================
Game: Deus Ex: Mankind Divided
Toggle 3D Vision to OFF:
74% CPU usage
70% GPU usage
66 FPS

Toggle 3D Vision to ON:
55% CPU usage
63% GPU usage
31 FPS

Summary:
2D Performance 100%
3D Performance (31fps / 66fps) x 100 = 47%

CPU usage drop in 3D Vision = [(2D 74% CPU usage) - (3D 55% CPU usage)] / (2D 74% CPU usage) x 100 = 26% CPU usage drop caused by 3D Vision

100-47 = 53% performance drop due purely to this CPU bottleneck caused by 3D Vision.
==========================
==========================
Game: Batman: Arkham Origins
Toggle 3D Vision to OFF:
31% CPU usage
30% GPU usage
72 FPS

Toggle 3D Vision to ON:
21% CPU usage
13% GPU usage
37 FPS

Summary:
2D Performance 100%
3D Performance (37fps / 72fps) x 100 = 51%

CPU usage drop in 3D Vision = [(2D 31% CPU usage) - (3D 21% CPU usage)] / (2D 31% CPU usage) x 100 = 32% CPU usage drop caused by 3D Vision

100-51 = 49% performance drop due purely to this CPU bottleneck caused by 3D Vision.
==========================

============================
XCOM 2
3D off, enabled in Nvidia control panel.
CPU usage 32.25%
GPU usage 47%
FPS 80

3D on
CPU usage 28.75%
GPU usage 61%
FPS 51

Summary
2D Performance 100%
3D performance (51 / 80) x 100 = 63.75%

CPU usage drop in 3D Vision = [(2D 32.25% CPU usage) - (3D 28.75% CPU usage)] / (2D 32.25% CPU usage) x 100 = 11% CPU usage drop caused by 3D Vision

100-63.75 = 36.25% performance drop due purely to this CPU bottleneck caused by 3D Vision.
============================
============================
MAFIA II (3d vision ready)
3D off (enabled in control panel)
CPU usage - 41%
GPU usage 22%
FPS - 100

3D enabled
CPU usage - 34%
GPU usage - 22%
FPS - 53

Summary
2D performance 100%
3D performance (53fps / 100fps) x 100 = 53%

CPU usage drop in 3D Vision = [(2D 41% CPU usage) - (3D 34% CPU usage)] / (2D 41% CPU usage) x 100 = 17% CPU usage drop caused by 3D Vision

100-53 = 47% performance drop due purely to this CPU bottleneck caused by 3D Vision.
========================

We really hope that you are able to replicate the problem, and hopefully fix the issue. If you require more information, please let us know!

Thank you Ray!

Kind regards,
-- Shahzad Ali
Cambridge
UK






Thanks for the update Shahzad, certainly a lot of data here to digest. Let me forward this to our quality lab and development team for review. I checked our database and I'm not finding any known bug related to 3D Vision and CPU overhead. I don't know enough about 3D Vision technology, so let me get these data to our quality lab and development team and get their input. If need be I will submit a bug to have our quality lab try to replicate the CPU overhead internally. I suspect if development has to investigate they will need a failing system to debug. Can you provide the link to the discussion forum for reference?

Best regards,
Ray

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#78
Posted 10/21/2016 03:34 AM   
Hi Ray,

Thank you for helping us look into this.

The discussion is on the nVidia 3D Vision forum here:

https://forums.geforce.com/default/topic/966422/3d-vision/3d-vision-cpu-bottelneck-gathering-information-thread-/

Please ignore the drama, we, the 3D Vision community are a passionate bunch of people :)

A failing system would be easy to setup: Install GTA5 on a system with high end graphics cards such as 1080 SLi etc, and a 6 core CPU clocked at ~2GHz. Everyone is experiencing this issue, and with the above system, it is most apparent due to the number of threads GTA5 uses combined with the number of physical cores being used on the CPU in and out of 3D Vision, and the "low" clock so that the CPU isn't brute forcing through the bottleneck :)

This is a problem in most / all games which run multiple threads of course.

Kind regards,
-- Shahzad


Hi Ray,

I have been reminded that I should mention that the problem occurs with both 'real' 3D Vision and 3D Vision Discover, but not in Compatibility Mode (nVidia's 'fake' 3D mode).

Kind regards,
-- Shahzad



Hello,

I had a discussion with development team regarding this thread and after some internal review and discussion they concluded this is not a bug and expected due to fact that 3D Vision must run with VSYNC on. Here is the response I received from development to this matter.

"Well if you look the thread the real issue they are pointing out is the fact that in some situations the CPU usage drops when 3D Vision is on AND there are cases where the GPU is NOT maxed out. However if you take into account the fact that 3D Vision must run with VSYNC on, there is always that issue where GPU usage maxes out and we get a missed frame. This makes the GPU usage LOOK LIKE it’s very low. Their first example of GTA5 shows this. 3DV off = 138FPS(98% GPU usage), 3DV on = 50fps(48% usage). The fact here is that GPU usage should have to go to ~2*98% to render 3D but obviously can’t, so you get a missed frame quite often and GPU usage reduction because of this waiting. And as a consequence the CPU is waiting around for that missed frame and also drops usage.
So this is all a consequence of the 3D Vision forced VSYNC.

The fact that overclocking the CPU can somewhat mitigate this issue is due to the fact that when you are on the very edge of missing a frame or not, a faster CPU can reduce the small overhead in the driver and possibly cause less missed frames. You could also say the overclocking the GPU could somewhat mitigate this issue(maybe even more than the CPU overclocking). This is not a bug but an attribute of having to run with VSYNC ON when 3D Vision runs."


Best regards,
Ray
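Development's "missed frame" argument can be illustrated with a toy model (my own sketch, not NVIDIA driver code): under forced VSYNC, a frame that is not ready at a refresh tick waits for the next one, so the delivered frame rate snaps down to an integer divisor of the refresh rate.

```python
# Toy model of VSYNC quantisation: each frame occupies a whole number
# of refresh periods, so the delivered FPS is refresh_hz / ceil(...).
import math

def vsync_fps(raw_fps, refresh_hz=120):
    """Displayed FPS when every frame must land on a refresh tick."""
    periods_per_frame = math.ceil(refresh_hz / raw_fps)
    return refresh_hz / periods_per_frame

assert vsync_fps(100) == 60   # a 100 FPS-capable pipeline snaps to 120/2
assert vsync_fps(50) == 40    # 50 FPS snaps to 120/3
assert vsync_fps(138) == 120  # faster than refresh: capped at 120
```

Under this model, a 120Hz display can only ever deliver 120, 60, 40, 30, 24, ... FPS; whether the measured results actually fit that pattern is exactly what the follow-up tests below try to establish.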


#79
Posted 10/21/2016 03:39 AM   
Hi Ray, Thank you again for looking into this.

We had considered that possibility but unfortunately, this is not the case - it doesn't seem to be that simple.

I would ask you to refrain from looking at the results posted by randoms on the forum thread that I linked to, as a lot of them have not followed testing instructions - most of their results are invalid (saturated GPU, FPS cap, etc). Please stick to the verified results that I have put together to reach conclusions. I think this is the source for some of the confusion - the development team are not looking at verified data. Contaminated data will of course lead to faulty conclusions :)

Please correct me if I am mistaken but my understanding from your email is that the conclusion from development is that taking only GTA5 as an example:

1. Because 3D Vision puts double the load on the GPU, when the 2D GPU usage is above 50%, it can't double usage so there are dropped frames, which causes CPU usage drop as the CPU has to wait.

2. This is due to VSync.

I admit GTA5 was a bad example because the core usage in 2D is being saturated. Please find updated GTA5 results where both 2D and 3D core usage are under 50%, so this is no longer an issue.



To prove that VSync and dropped frames are not the cause of the issue, it would be best to show you:

1. GTA5 GPU usage in 2D < 50%, and what happens to the usage statistics when 3D Vision is then toggled on. If development's conclusion is correct, the GPU usage will double while the CPU usage stays the same (will not drop), and neither will the FPS.

2. Perform the test with both VSync OFF and VSync ON, so that even 2D is limited by frame skipping, not just 3D.

Please see the results below:

2D VSync ON in game and Control Panel.

Game: Grand Theft Auto 5 UPDATED!
Toggle 3D Vision to OFF:
70% CPU usage
58% GPU usage
102 FPS

Toggle 3D Vision to ON:
46% CPU usage
34% GPU usage
41 FPS

Summary:
Taking 2D Performance as 100%
3D Performance (41fps / 102fps) x 100 = 40%

100-40 = 60% performance drop due purely to this CPU bottleneck caused by 3D Vision.

CPU usage drop in 3D Vision = [(2D 70% CPU usage) - (3D 46% CPU usage)] / (2D 70% CPU usage) x 100 = 34% CPU usage drop caused by 3D Vision



==========================

2D VSync OFF - No difference. VSync OFF in game and Control Panel.


Game: Grand Theft Auto 5 UPDATED!
Toggle 3D Vision to OFF:
70% CPU usage
58% GPU usage
102 FPS

Toggle 3D Vision to ON:
46% CPU usage
34% GPU usage
41 FPS

Summary:
Taking 2D Performance as 100%
3D Performance (41fps / 102fps) x 100 = 40%

100-40 = 60% performance drop due purely to this CPU bottleneck caused by 3D Vision.

CPU usage drop in 3D Vision = [(2D 70% CPU usage) - (3D 46% CPU usage)] / (2D 70% CPU usage) x 100 = 34% CPU usage drop caused by 3D Vision

==========
==========

Please also see [supplemental] data below for when the GPU is at <50% usage in 2D, so that it can double up to 100% usage in 3D if/when required. This should assist the development team further, as the results show the projected doubling of GPU usage is not happening as it should (it seems to be halving instead!):

Please see the results below:

2D VSync ON in game and Control Panel.

Game: Grand Theft Auto 5 UPDATED! supplemental
Toggle 3D Vision to OFF:
70% CPU usage
35% GPU usage <----- 2D GPU less than 50%
102 FPS

Toggle 3D Vision to ON:
48% CPU usage <----- CPU usage lowered! 3D Vision causing CPU bottleneck?!
18% GPU usage <----- 3D Vision GPU less than 50%; not doubling (more like halving!) - no frame skipping!
46 FPS

Summary:
Taking 2D Performance as 100%
3D Performance (46fps / 102fps) x 100 = 45%

100-45 = 55% performance drop due purely to this CPU bottleneck caused by 3D Vision.

CPU usage drop in 3D Vision = [(2D 70% CPU usage) - (3D 48% CPU usage)] / (2D 70% CPU usage) x 100 = 31% CPU usage drop caused by 3D Vision.



==========================

2D VSync OFF - No difference. VSync OFF in game and Control Panel.


Game: Grand Theft Auto 5 UPDATED! supplemental
Toggle 3D Vision to OFF:
70% CPU usage
35% GPU usage <----- 2D GPU less than 50%
102 FPS

Toggle 3D Vision to ON:
48% CPU usage <----- CPU usage lowered! 3D Vision causing CPU bottleneck?!
18% GPU usage <----- 3D Vision GPU less than 50%; not doubling (more like halving!) - no frame skipping!
46 FPS

Summary:
Taking 2D Performance as 100%
3D Performance (46fps / 102fps) x 100 = 45%

100-45 = 55% performance drop due purely to this CPU bottleneck caused by 3D Vision.

CPU usage drop in 3D Vision = [(2D 70% CPU usage) - (3D 48% CPU usage)] / (2D 70% CPU usage) x 100 = 31% CPU usage drop caused by 3D Vision.

==========================

The game seems to saturate the 6-core CPU at 70% - it looks like it's using 4 threads to saturate 4 cores.

When 3D is enabled, the CPU plummets from 70% usage to 48%, and the GPU plummets from 35% to 18%. The FPS drops from 102 fps to 46 fps.

Please let me know if I can do more tests to verify or rule out other suggested conclusions. I am happy to do the leg work so you clever folk can get to the bottom of this :)

I am testing and sending this email from my work computer to rule out a single system being at fault.

Kind regards,
-- Shahzad

Eur. Ing. Shahzad Ali. Chartered Engineer, MIET.





Thanks for the update. Let me get these data to development so they can analyze and respond.

Best regards,
Ray


#80
Posted 10/21/2016 03:43 AM   
Thanks for sharing, interested to see what they come back with!

#81
Posted 10/21/2016 04:41 AM   
Hey RAGEdemon, I can actually run "3D Vision" with Vsync off as I'm using a line-interlaced 3D monitor. Definitely get over 60fps and screen tearing galore... let me know if you would like me to do some testing, it's Friday night here now so I can get cracking tomorrow!

OS & Driver: Win 10 w/417.35
CPU & GPU: i7 4790k, Gigabyte 980Ti G1 Gaming
MB & RAM: Asrock Z97 Extreme4, GSkill Trident 16Gb DDR3 2400Mhz
Audio: Realtek HD, Steinberg UR44
Display: Acer XB271HUA w/3D Vision 2 Kit

#82
Posted 10/21/2016 06:03 AM   
NVicious said: Hey RAGEdemon, I can actually run "3D Vision" with Vsync off as I'm using a line-interlaced 3D monitor. Definitely get over 60fps and screen tearing galore... let me know if you would like me to do some testing, it's Friday night here now so I can get cracking tomorrow!


Thanks NVicious,

Interestingly, even with native 3D Vision, one can disable vSync in 3D Vision with the use of D3Doverrider.

The GPU is sync'd to the frames which means that 3D vision works perfectly. What is unsync'd is the frame to the GPU buffers. Tearing galore indeed, as the tear is seen at a different location by each eye, but perfect 3D otherwise!


#83
Posted 10/21/2016 06:12 AM   
Oh, can you help provide the full detail of the system configuration (CPU, GPU, RAM, driver version, etc... used to generate the data? Development is currently working with our performance lab to setup configuration to check these data.


Hi Ray,

No problem. Test system as under:

GPU: 2x GTX 1080 in SLi with HB Bridge, both @ PCIe x16 2.0
CPU: Intel Xeon x5660 6 core / 12 thread ( Intel i7-980x Extreme Edition Six Core Processor ) Hyperthreading Disabled.
RAM: 12GB 1600MHz CL9 DDR3 Triple Channel kit
Motherboard: Asus P6T Deluxe V2
Drivers: 372.90 WHQL

Game settings: Everything on stock except testing resolutions of 2560x1600 and 1280x800 (as in yesterday's email).

Games which use a high number of threads and a lot of CPU power seem to be the most affected, and the problem manifests itself best on systems with more cores (6 cores etc.), as in 2D the game can take advantage of all 6 cores, whereas in 3D Vision this ability is severely curtailed, leaving only enough work for 3 cores.

A word to the wise:
It has been reported that this is a problem on all CPUs, especially older AMD ones. Please remember that special attention must be paid to the CPU clock depending on the generation of the CPU under test. Older CPUs can be overclocked and still give the same problematic result, but newer CPUs need to be underclocked to ~2GHz for testing purposes, or else the CPU brute-forces the problem out of existence as it starts to saturate other areas of the system, such as the GPUs - but I'm sure you clever folk have already anticipated all of this :)

Please let me know if there is anything that I can help you guys with.

Kind regards,
-- Shahzad
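The CPU-usage drop described above follows directly from the core count: with a 6-core CPU and only enough work for 3 cores, overall utilisation cannot exceed 50%. A trivial sketch of that arithmetic (illustrative only, not a measurement):

```python
def max_overall_usage(busy_cores, total_cores):
    """Upper bound on overall CPU usage (%) when the workload only
    generates enough work to saturate `busy_cores` of the cores."""
    return busy_cores / total_cores * 100

# 2D: the game can load all 6 cores; 3D Vision: only ~3 have work
print(max_overall_usage(6, 6))  # -> 100.0
print(max_overall_usage(3, 6))  # -> 50.0
```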


#84
Posted 10/23/2016 12:33 AM   
Some good news:

--------
Hi Shahzad,

I checked the bug and did see our internal performance lab updated benchmark results comparing stereo On and Off with GTA V. The results does show CPU usage drop when stereo is enabled, so it would appear we may have replicated similar results. No further update yet on the results so still awaiting development team to analyze the data.


Best regards,
Ray

--------

nVidia seem to have replicated the issue, which is a big step forward. Let's hope it's fixable and something comes of this...


#85
Posted 10/23/2016 12:37 AM   
RAGEdemon said:Some good news:

--------
Hi Shahzad,

I checked the bug and did see our internal performance lab updated benchmark results comparing stereo On and Off with GTA V. The results does show CPU usage drop when stereo is enabled, so it would appear we may have replicated similar results. No further update yet on the results so still awaiting development team to analyze the data.


Best regards,
Ray

--------

nVidia seem to have replicated the issue, which is big step forward. Let's hope it's fixable and something comes of this...


Not fixable, mate, as it's not a bug ;) They said it themselves ;)
I really hope I'm wrong, though ;) But I can't see how they can fix it unless they rewrite the whole driver NOT to be BOUND around VSYNC.

This sounds like a timing issue, though ;) (Not a full dead-lock, but still.) One thing waits on another, while the "another" waits on the "one thing", until somebody decides to time out ;) I really hope they can fix it for everyone, though ;)

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

#86
Posted 10/23/2016 12:57 AM   
Apologies, but have not read through the thread in detail.

My recommendation is to not use GTA5 as the test case, because we don't know anything about their implementation, and it could be something R* does (like lock vsync, or mutex waits) that Nvidia has no control over. This is 3D Vision Direct.

The more interesting test case uses 3D Vision Automatic, which is what we mostly use.


I would suggest sorting through your test cases, and find the most compelling example out of them all. Only one.

When you add a lot of other games and details, you are actually lowering the chances they'll look at it. It's too noisy, too much data. I know you are trying to provide as much data and detail as possible, but that's not how the QA people work.

You can add data later, after you've gotten their attention for a single test case.


Find the most interesting test case that clearly is not affected by vsync. That will get past their first hurdle of assuming that vsync causes the problem.

And, BTW, thanks for pushing this.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#87
Posted 10/23/2016 01:10 AM   
My intuition says that GTA5 suffers from the same issue - it's not isolated. The GTA5 core issue has been widely replicated within the community, and is quite striking. The game itself is an AAA title which is very popular and is well optimised for multiple threads by a reputable developer. I believe this makes it the prime candidate to push for testing - the problem is reproducible, and nVidia have already reproduced it.

Perhaps they might come back with "it's a GTA5" issue only. If that happens, I can point to other games showing similar performance decreases.

After I sent supplemental data, nVidia acknowledged that it isn't a VSync issue as they had previously suggested. You can also disable VSync in 3D Vision using D3DOverrider as stated in my previous post, and the problem still presents itself.

It's quite obvious that no-one knows what is actually going on. There has been a lot of conjecture about what causes the problem, but attempts to isolate it (draw calls, VSync, CPU brand, GPU generation, driver revision, etc.) have all failed.

All we know is that there is a problem, and that nVidia have managed to replicate it. Let's wait and see what they come back with.


In the meantime, what we can do is test TriDef to see if it shows a similar CPU-utilisation decrease in 3D. This would be an eye-opener: if it shows the same results, then it would mean that it's intrinsically a stereo 3D problem and likely can't be fixed. If, on the other hand, the problem doesn't manifest itself, then we know it's a 3D Vision driver problem. If someone can post a test or two, that would go a long way.

Thank you in advance for any kind soul willing to give it the time.
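For anyone posting TriDef (or further 3D Vision) numbers, earlier posts in this thread summarised results as a percentage drop against the stereo-disabled baseline. A small helper reproducing that arithmetic (the function name is my own):

```python
def perf_drop(baseline_fps, test_fps):
    """Percentage performance drop relative to the
    3D-Vision-disabled (2D) baseline FPS."""
    relative = test_fps / baseline_fps * 100
    return round(100 - relative, 1)

# e.g. a 160 FPS 2D baseline falling to 124 FPS with stereo enabled
print(perf_drop(160, 124))  # -> 22.5
```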


#88
Posted 10/23/2016 01:51 AM   
RAGEdemon said:My intuition says that GTA5 suffers from the same issue - it's not isolated. The GTA5 core issue has been widely replicated within the community, and is quite striking. The game itself is a AAA title which is very popular and is well optimised for multiple threads by a reputable developer. I believe this makes it the prime candidate to push for testing - the problem is reproduceable, and nVidia have already reproduced it.

Perhaps they might come back with "it's a GTA5" issue only. If that happens, I can point to other games showing similar performance decreases.

After I sent supplemental data, nVidia have already acknowledged that it isn't a VSync issue as they had previously suggested. You can also disable VSync in 3D vision using D3Doverider as stated in my previous post, and the problem still presents itself.

It's quite obvious that no-one knows what is actually going on. There has been a lot of conjecture about what causes the problem but attempts to isolate the problem i.e. draw calls, vsync, CPU brand, GPU generation, driver revision etc have all failed.

All we know is that there is a problem, and that nVidia have managed to replicate it. Let's wait and see what they come back with.


In the mean time, what we can do is test Tri-def to see if it shows a similar problem with CPU utilisation decrease in 3D. This would be an eye opener as if it shows the same results, then it would mean that it's intrinsically a Stereo3D problem and likely can't be fixed. If on the other hand it doesn't manifest itself, then we know it's a 3D Vision driver problem. If someone can post a test or two, that would go a long way.

Thank you in advance for any kind soul willing to give it the time.


Awesome job!
Really glad that they finally were able to replicate it and they are investigating it!
Just don't let it get "swept under the rug" ;)) Nvidia is normally helpful and will fix these bugs if we find them. It might take a while though;)

Awesome job! I'm more interested in why this is happening, so if you can "extract" this info from them once they know what's going on, it would be awesome! It's for the "scholar" in me, mostly ;))

Thank you again for fighting this fight;)


#89
Posted 10/23/2016 10:34 AM   
@RAGEdemon - sorry to bump an old thread. Any further contact from Nvidia on this?

AMD FX-8350 4GHz
Gigabyte 990FXA-UD3 Rev 4.0
G-Skill PC3-10700- 16GB
Gigabyte Windforce GTX 1060 OC 6GB - 417.01
Creative Soundblaster Z
ViewSonic VX2268WM Black 22" 1680x1050 5ms 120Hz 3Dvision
Windows 10 x64 1709

#90
Posted 11/29/2016 07:56 PM   