[quote="D-Man11"]BTW RAGEdemon had a bug recently and Nvidia was able to reproduce it and are looking into it.
[quote="RAGEdemon"]Message from nVidia:
[b][i][color="green"]Response Via Email (Ray) 05/03/2016 03:23 PM
Hello,
Just wanted to update that our lab was able to replicate the failure and the bug has been escalated to our development team to investigate. Thanks again for bringing this issue to our attention, and we hope to have this fixed in a future driver update.
Best regards,
Ray[/color][/i]
[/b][/quote]
[url]https://forums.geforce.com/default/topic/927183/3d-vision/-364-72-has-been-released-hot-fix-driver-364-96-doom-quot-open-quot-beta/post/4871377/#4871377[/url][/quote]
Was that bug about the CPU bottleneck? I wasn't sure by reading the thread...
His problem was crashing when using DSR and 3D Vision with 970s in SLI.
I posted this here because it's an example of how he reported it through the official feedback channel, and that it's going to be fixed.
He talked about it in a few threads I think. Here's one of the posts, right before he reported it last week.
https://forums.geforce.com/default/topic/847809/3d-vision/list-of-3d-vision-problems/post/4867118/#4867118
GTA V uses 3D Vision Direct not Automatic, yes? So how can the 3-core bottleneck be due to the 3D Vision driver? Isn't the game developer responsible for rendering the stereoscopic views in Direct mode?
Inno3D RTX 2080 Ti iChill Black (330W Power Limit / +50 MHz Core / +750 MHz Memory)
Intel Core i9-9900X (4.6 GHz Core / 3.0 GHz Mesh)
Corsair Vengeance LPX 32 GB (4 x 8 GB) DDR4 CMK32GX4M4Z3200C16 (4000 MHz 18-20-20-40-1T)
MSI MEG X299 Creation
Asus ROG Swift PG27VQ / Dell S2417DG / 3D Vision 2 / Oculus Rift / Marantz SR6012 / LG OLED55B7T
Intel Optane 900P 280 GB / Tiered Storage Space (Samsung 950 PRO 512 GB / Seagate IronWolf Pro 10 TB)
Windows 10 Pro 64-bit
[quote="Monstieur"]GTA V uses 3D Vision Direct not Automatic, yes? So how can the 3-core bottleneck be due to the 3D Vision driver? Isn't the game developer responsible for rendering the stereoscopic views in Direct mode?[/quote]
Bingo! 3D Vision Direct has no problem with 1 or 1000 cores. 3D Vision Automatic, from what I saw, doesn't have this limitation either....
I think it is the damn game itself that limits utilisation to 3 cores in 3D Vision. (I am talking from the perspective of coding my wrapper to use both 3D Vision Automatic and 3D Vision Direct. Never ever did I hit the 3-core bottleneck :-s) Just check out Doom (2016) and you will see all cores being used.
Then again, look at Amnesia: The Dark Descent (using the exact same wrapper and 3D Vision Auto/Direct): you will always get one core MAXED.
Engine design...
Sure, 3D Vision Automatic adds quite a bit of overhead (translated into loss of FPS) due to all the heuristics it uses. But 3D Vision Direct adds ZERO overhead on top of that! The core limitation is not part of either mode, from what I tested and saw!
I think GTAV simply limits itself to 3 cores when 3D Vision = Active :) (for some reason...)
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
There does not seem to be any 3-core limit when I tested now. All CPU cores are used partially and the GPUs are also underutilized.
[url=http://imgur.com/xmiEnmz][img]http://i.imgur.com/xmiEnmzt.jpg[/img][/url]
I have spent more time doing benchmarks on another game in the hopes of ascertaining whether the problem was isolated to GTAV or whether it affected other games too.
In this case,
[b][u]Game: Call of Duty:Advanced Warfare.[/u][/b]
Configuration in signature.
Resolution = 2560x1600
VSync Off
[b]Results:[/b]
Intel Xeon X5660 @ [b]4.4GHz, 6 physical cores, Hyper Threading OFF.[/b]
1 core 3D Vision = 26 fps, GPU1 @ 14% + GPU2 @ 12%, Total GPU Usage = 13%
2 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%
3 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%
4 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%
5 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%
6 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%
1 core 2D = 51 fps, GPU1 @ 25% + GPU2 @ 24% (toggled off), Total GPU Usage = 25%
2 cores 2D = 101fps, GPU1 @ 41% + GPU2 @ 43% (toggled off), Total GPU Usage = 42%
3 cores 2D = 142fps, GPU1 @ 64% + GPU2 @ 65% (toggled off), Total GPU Usage = 65%
4 cores 2D = 171fps, GPU1 @ 71% + GPU2 @ 71% (toggled off), Total GPU Usage = 71%
5 cores 2D = 171fps, GPU1 @ 71% + GPU2 @ 71% (toggled off), Total GPU Usage = 71%
6 cores 2D = 171fps, GPU1 @ 71% + GPU2 @ 71% (toggled off), Total GPU Usage = 71%
[b]Conclusions:[/b]
[b]1.[/b] While Call of Duty:Advanced Warfare in 2D uses 4 cores to saturation, as soon as 3D is enabled, it drops down to only 2 cores being saturated. Adding more cores does not yield any more core usage, nor GPU usage.
[b]2.[/b] GPU usage scales well with CPU usage all the way up to 4 CPU core saturation, which indicates that there is no "GPU underutilisation" problem. I would recommend overclocking your CPU as high as possible for best results.
So, ladies and gents, from the results the conclusion again seems to be that the game becomes severely CPU-limited as soon as 3D kicks in, as it loses the ability to use more than 2 cores, thereby also severely degrading GPU scaling in the process.
The end result: 3D fps drops to ~32% of 2D fps (a 68% performance drop; a similar 65% drop was measured in GTAV, due purely to the CPU bottleneck).
Since I have shown the same problem in GTAV, it seems as though it is a 3D Vision Driver optimisation problem.
I would also conjecture that we have not seen this before as only recently have games started using more than 2 cores. The driver may well have been designed to only work with a small number of cores.
Link to my original GTAV findings here:
https://forums.geforce.com/default/topic/825678/3d-vision/gta-v-problems-amp-solutions-list-please-keep-gta-discussion-here-/post/4515030/#4515030
The interesting difference between COD:Advanced Warfare vs GTAV is that while GTAV used 6 cores and became limited to 3 cores in 3D Vision, COD:Advanced Warfare uses 4 cores and becomes limited to 2 cores in 3D Vision.
It would seem that [these] games only utilise half the CPU cores in 3D Vision compared to 2D.
If someone can recommend other 3D Vision games which use multiple cores well, I would be happy to benchmark them too. It would be interesting to see if they too have the same problem.
I wonder if it could be a draw-call limit issue of the kind DX12 purports to solve; i.e. could it be that with 3D Vision activated there are twice as many draw calls being issued, doubly burdening the main game thread and thereby halving the number of cores the game is able to use?
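The conjecture above can be sketched with a toy model (all timings invented purely for illustration, not measured): assume draw-call submission serializes on one render thread while the rest of the frame work spreads across the available cores, and that stereo submits one draw stream per eye, doubling the serial part.

```python
# Toy model: frame time is set by whichever is slower, the serial draw
# submission or the parallel remainder of the frame. All numbers made up.

def fps(cores, submit_ms=4.0, parallel_ms=24.0, stereo=False):
    serial = submit_ms * (2 if stereo else 1)    # 3D doubles submissions
    frame_ms = max(serial, parallel_ms / cores)  # slowest part sets the pace
    return 1000.0 / frame_ms

# 2D keeps scaling to 6 cores (24/6 = 4 ms), but in 3D the 8 ms serial
# part dominates from 3 cores onward -- half the core count, qualitatively
# matching the benchmarks in this thread.
for c in (1, 2, 3, 4, 6):
    print(f"{c} cores  2D: {fps(c):6.1f} fps   3D: {fps(c, stereo=True):6.1f} fps")
```

Under these made-up numbers the 3D curve plateaus at 3 cores while the 2D curve plateaus at 6, so a doubled serial submission cost alone would reproduce the "half the cores" pattern.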
============
@Monstieur: I'm afraid that's not how you measure core usage, friend. :)
The OS will hop the game threads from core to core many times a second. You have to set the affinity to various combinations and watch carefully for core saturation and GPU usage to ascertain how many cores are actually being used. Readings from cores being "partially used" are meaningless.
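A minimal sketch of that affinity method: pin the process to k cores, run a fixed batch of CPU-bound work, and compare throughput as k grows. This uses Linux-only `os.sched_setaffinity`; on Windows (as in this thread) you would set affinity via Task Manager or `SetProcessAffinityMask` instead. The busy-loop and all names here are my own stand-ins, not anything from the games being tested.

```python
import os
import time
from multiprocessing import Pool

def spin(n):
    # CPU-bound busy work; a stand-in for one "frame" of game logic.
    s = 0
    for i in range(n):
        s += i * i
    return s

def throughput_with_cores(cores, jobs=8, n=200_000):
    """Restrict this process (workers inherit the mask on fork) to
    `cores` and return completed jobs per second."""
    os.sched_setaffinity(0, cores)
    t0 = time.perf_counter()
    with Pool(len(cores)) as pool:
        pool.map(spin, [n] * jobs)
    return jobs / (time.perf_counter() - t0)

if __name__ == "__main__":
    avail = sorted(os.sched_getaffinity(0))
    for k in (1, 2):
        if k > len(avail):
            break
        rate = throughput_with_cores(set(avail[:k]))
        print(f"{k} core(s): {rate:.1f} jobs/s")
```

If throughput stops rising as you add cores while the GPU also sits idle, you have found the saturation point; that is exactly what the per-core fps tables in this thread are showing.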
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
CPU core bottleneck also confirmed for [i][u][b]The Witcher 3[/b][/u][/i]:
Configuration in signature.
Resolution = 2560x1600
VSync Off
[b]Results:[/b]
Intel Xeon X5660 @ [b]4.4GHz, 6 physical cores, Hyper Threading OFF.[/b]
1 core 3D Vision = 13 fps, GPU1 @ 26% + GPU2 @ 24%, Total GPU Usage = 25%
2 cores 3D Vision = 27 fps, GPU1 @ 38% + GPU2 @ 37%, Total GPU Usage = 38%
3 cores 3D Vision = 38 fps, GPU1 @ 53% + GPU2 @ 54%, Total GPU Usage = 54%
4 cores 3D Vision = 41 fps, GPU1 @ 58% + GPU2 @ 58%, Total GPU Usage = 58%
5 cores 3D Vision = 41 fps, GPU1 @ 58% + GPU2 @ 58%, Total GPU Usage = 58%
6 cores 3D Vision = 41 fps, GPU1 @ 58% + GPU2 @ 58%, Total GPU Usage = 58%
1 core 2D = 18fps, GPU1 @ 19% + GPU2 @ 21% (toggled off), Total GPU Usage = 20%
2 cores 2D = 34fps, GPU1 @ 36% + GPU2 @ 31% (toggled off), Total GPU Usage = 34%
3 cores 2D = 61fps, GPU1 @ 46% + GPU2 @ 48% (toggled off), Total GPU Usage = 47%
4 cores 2D = 73fps, GPU1 @ 55% + GPU2 @ 55% (toggled off), Total GPU Usage = 55%
5 cores 2D = 78fps, GPU1 @ 60% + GPU2 @ 60% (toggled off), Total GPU Usage = 60%
6 cores 2D = 82fps, GPU1 @ 63% + GPU2 @ 62% (toggled off), Total GPU Usage = 63%
Results show that The Witcher 3 is limited to 3 cores in 3D Vision / 3D Vision Discover.
There is no such limit in CM mode.
The GPUs are nowhere near saturated. If there were no 3D Vision CPU bottleneck, the results show that one would easily be getting 82 fps, not 41 fps, in 3D Vision. That's a 50% decrease due to the CPU alone.
This isn't looking good, fellas. It is my humble opinion that we ought to:
1. Independently verify these findings - there could be something screwy with my heavily overclocked system, after all.
2. If the findings are independently verified, get to the bottom of this issue sooner rather than later, as it potentially affects all games; it'll only get worse going into the future :-(
3. With the monstrous GPU specs nowadays, once this CPU bottleneck is fixed there should be no reason why every one of us can't play games in 3D at the same FPS as in 2D.
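For what it's worth, the per-core gains can be tallied mechanically from the tables above; a small sketch with the fps columns transcribed (the variable names are mine):

```python
# fps columns transcribed from the Witcher 3 tables above, core counts 1..6.
fps_3d = [13, 27, 38, 41, 41, 41]
fps_2d = [18, 34, 61, 73, 78, 82]

# Gain from each added core: the 3D gains collapse after the third core,
# while 2D keeps gaining all the way to six.
gain_3d = [b - a for a, b in zip(fps_3d, fps_3d[1:])]
gain_2d = [b - a for a, b in zip(fps_2d, fps_2d[1:])]
print("3D gains per added core:", gain_3d)  # [14, 11, 3, 0, 0]
print("2D gains per added core:", gain_2d)  # [16, 27, 12, 5, 4]

# The headline figure: the 3D ceiling is exactly half the 2D ceiling.
drop = 1 - fps_3d[-1] / fps_2d[-1]
print(f"3D Vision loses {drop:.0%} of the CPU-bound fps")  # 50%
```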
GTAV uses 3 cores, correct?
But Advanced Warfare uses only 2 cores; what would be the cause of that, if you have any ideas? I thought the idea was that it was using 3 cores.
I play games at 2560x1440, and so far The Witcher 3 has been GPU-bound (it's always at 100% with a Pascal Titan X), but I haven't gotten back to Novigrad yet.
I will repeat the tests you have done, doing the CPU affinity thing, in the games I have (not Call of Duty).
Preliminary conjecture in my post above:
[quote="RAGEdemon"]The interesting difference between COD:Advanced Warfare vs GTAV is that while GTAV used 6 cores and became limited to 3 cores in 3D Vision, COD:Advanced Warfare uses 4 cores and becomes limited to 2 cores in 3D Vision.
It would seem that [these] games seem to only utilise half the CPU cores in 3D vision compared to 2D.
[/quote]
Regarding The Witcher 3: try the central Novigrad square at the lowest gfx and resolution settings, so the Titan XP isn't bottlenecking as you suggest :)
It would also help immensely if people would list their specs in their sigs.
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
I am not having any issues running The Witcher 3 on 4 cores in 3D (Single + 3D Surround), but I am running SLI as well...
However, once 3D is enabled I do see lower core utilization, but it is still there!
Then again, even in 2D the game doesn't eat up my CPU anyway...
Not sure what to say...
Have you tried to see if there is a difference between 3D Vision and Anaglyph mode as well?
[quote="helifax"]I am not having any issue running Witcher 3 on 4 core in 3D (Single + 3D Surround) but I am running SLI as well...
However, once 3D is enabled I do see a smaller Core utilization, but still there!
Then again, even in 2D the game doesn't eat up my CPU anyway...
Not sure what to say...
Have you tried to see if there is a difference between 3D Vision and Anaglyph Mode as well?[/quote]
Thanks for looking into this, helifax.
I can confirm the same problem with 3D Vision Discover.
CPU usage is 30% with 3D Vision on, and 42% with 3D Vision toggled off.
Interestingly, in CM mode CPU usage remains @ 42%.
I'll have to check my TW3 percentages.
In Novigrad (the most CPU-demanding area), in 2D, I get 100% CPU usage on all 4 cores, to such an extent that the game stutters because the system doesn't get what it needs. 74-85fps.
In 3D, 37-45fps, with CPU usage never reaching 100%. I think it hovers around 75% in all cores (so it uses 3 cores, I guess). I'll check it tonight.
In both cases I'm CPU limited, at 1080p (at 1440p I'm GPU limited).
Not to derail so ignore if not relevant, but is this potential issue at all related to my problem I have had recently with certain games in 3d mode? https://forums.geforce.com/default/topic/961616/3d-vision/stuttering-low-fps-in-3dvision-with-only-certain-games/4/
I would also recommend testing the newest Deus Ex: MD, as it seems to be a very multithread-heavy game that scales well up to at least 8 full cores, and it has a great 3D Vision mode. Not sure if it is 3D Direct or 3D Automatic, or if that even matters.
Have updated the Witcher 3 post above with benchmarks, showing it limited to 3 cores in 3D Vision. Conclusion: 50% drop in performance due purely to the 3D Vision CPU bottleneck :(
[quote="terintamel"]Not to derail so ignore if not relevant, but is this potential issue at all related to my problem I have had recently with certain games in 3d mode? https://forums.geforce.com/default/topic/961616/3d-vision/stuttering-low-fps-in-3dvision-with-only-certain-games/4/
I would also recommend testing the newest Deus Ex: MD as it seems to be a very mutlithread heavy game and will scale up well to at least full 8 cores and has a great 3dVision mode. Not sure if it is 3d Direct or 3d Automatic or if that even matters.
[/quote]
It's most probable, as you are on an FX-8350. AMD CPUs have much weaker IPC than Intel CPUs, and with the core bottleneck you would indeed get the symptoms you report. That's a huge 50% reduction in performance compared to Intel!
[img]http://www.kitguru.net/wp-content/uploads/2016/02/cinebench-single.png[/img]
The Witcher 3 screenshots. Keep in mind that fps were variable even standing still. Moving Geralt caused fps drops in all situations, going down to 60fps in 2D without 3Dmigoto. I also have a mod that increases LOD. It eats a few fps, but no more than 5 in 2D.
1- 1080p, 2D, 3Dmigoto disabled (82.5fps):
[img]http://u.cubeupload.com/masterotaku/witcher311080p2D.jpg[/img]
2- 900p, 3D, with 3Dmigoto enabled (46.2fps):
[img]http://u.cubeupload.com/masterotaku/witcher32900p3D.jpg[/img]
3- 900p, 2D after disabling 3D ingame, 3Dmigoto enabled (76fps) (I just noticed that I'm in a different place. But the CPU usage is the important part):
[img]http://u.cubeupload.com/masterotaku/witcher33900p3Dto2D.jpg[/img]
4- 900p, 3D, 3Dmigoto disabled (53.1fps):
[img]http://u.cubeupload.com/masterotaku/witcher34900p3Dno3Dm.jpg[/img]
5- 900p, 2D after disabling 3D ingame, 3Dmigoto disabled (83.8fps):
[img]http://u.cubeupload.com/masterotaku/witcher35900p3Dto2Dn.jpg[/img]
I hope that an overclocked i7 Cannonlake will be able to do 60fps in 3D perfectly next year...
My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com
(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)
In this case,
Game: Call of Duty:Advanced Warfare.
Configuration in signature.
Resolution = 2560x1600
VSync Off
Results:
Intel Xeon x5660 @ 4.4GHz, 6 physical cores, Hyper Threading OFF.
1 cores 3D Vision = 26 fps, GPU1 @ 14% + GPU2 @ 12%, Total GPU Usage = 13%
2 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%
3 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%
4 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%
5 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%
6 cores 3D Vision = 48 fps, GPU1 @ 25% + GPU2 @ 26%, Total GPU Usage = 26%
1 cores 2D = 51 fps, GPU1 @ 25% + GPU2 @ 24% (toggled off), Total GPU Usage = 25%
2 cores 2D = 101fps, GPU1 @ 41% + GPU2 @ 43% (toggled off), Total GPU Usage = 42%
3 cores 2D = 142fps, GPU1 @ 64% + GPU2 @ 65% (toggled off), Total GPU Usage = 65%
4 cores 2D = 171fps, GPU1 @ 71% + GPU2 @ 71% (toggled off), Total GPU Usage = 71%
5 cores 2D = 171fps, GPU1 @ 71% + GPU2 @ 71% (toggled off), Total GPU Usage = 71%
6 cores 2D = 171fps, GPU1 @ 71% + GPU2 @ 71% (toggled off), Total GPU Usage = 71%
Conclusions:
1. While Call of Duty:Advanced Warfare in 2D uses 4 cores to saturation, as soon as 3D is enabled, it drops down to only 2 cores being saturated. Adding more cores does not yield any more core usage, nor GPU usage.
2. GPU usage scales well with CPU usage all the way up to 4 CPU core saturation, which indicates that there is no "GPU underutilisation" problem. I would recommend overclocking your CPU as high as possible for best results.
So, ladies and gents, from the results, again, the conclusion seems to be that the game becomes severely CPU limited as soon as 3D kicks in as it loses the ability to use more than 2 cores, thereby also severely degrading GPU scaling in the process.
The end result: 3D fps drops to ~32% of 2D fps (68% performance drop, Similar 65% drop measured in GTAV, due purely to the CPU bottleneck).
Since I have shown the same problem in GTAV, it seems as though it is a 3D Vision Driver optimisation problem.
I would also conjecture that we have not seen this before as only recently have games started using more than 2 cores. The driver may well have been designed to only work with a small number of cores.
Link to my original GTAV findings here:
https://forums.geforce.com/default/topic/825678/3d-vision/gta-v-problems-amp-solutions-list-please-keep-gta-discussion-here-/post/4515030/#4515030
The interesting difference between COD:Advanced Warfare vs GTAV is that while GTAV used 6 cores and became limited to 3 cores in 3D Vision, COD:Advanced Warfare uses 4 cores and becomes limited to 2 cores in 3D Vision.
It would seem that [these] games seem to only utilise half the CPU cores in 3D vision compared to 2D.
If someone can recommend other 3D Vision games which use multiple cores well, I would be happy to benchmark them too. It would be interesting to see if they too have the same problem.
I wonder if it could be a draw-call limit issue of the kind DX12 purports to solve, i.e. could it be that with 3D Vision activated there are twice as many draw calls, doubly burdening the main game thread and thereby halving the number of cores that can effectively be used?
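To illustrate the conjecture with a toy model (my own illustration, not measured data - the function name and all numbers are made up): if stereo submission doubles the draw calls fed through a single submission thread, the CPU-bound frame-rate ceiling halves regardless of how many cores are available.

```python
# Toy model: CPU-bound fps ceiling when one thread submits all draw calls.
# All figures below are hypothetical, purely to show the halving effect.
def cpu_bound_fps(draw_calls, cost_per_call_ms, stereo=False):
    """Frame rate achievable if draw-call submission is the bottleneck."""
    calls = draw_calls * (2 if stereo else 1)  # stereo = one pass per eye
    frame_time_ms = calls * cost_per_call_ms
    return 1000.0 / frame_time_ms

mono_fps = cpu_bound_fps(5000, 0.002)                # 2D ceiling: 100 fps
stereo_fps = cpu_bound_fps(5000, 0.002, stereo=True) # 3D ceiling: 50 fps
```

If something like this is what's happening, extra cores can't help: the doubled work lands on the one thread that issues the draw calls.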
============
@Monstieur: I'm afraid that's not how you measure core usage, friend. :)
The OS constantly hops the game threads from core to core. You have to set the affinity to various core combinations and watch carefully for core saturation and GPU usage to ascertain how many cores are actually being used. Results from cores being "partially used" are meaningless.
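The tests in this thread were done on Windows (Task Manager's "Set affinity" does the job there). As a minimal sketch of the same idea in code, here is the Linux-only stdlib equivalent, `os.sched_setaffinity`; the `game_pid` variable is a placeholder, not anything from the thread:

```python
import os

def set_affinity(pid, cores):
    """Pin process `pid` (0 = this process) to `cores`; return the new mask."""
    os.sched_setaffinity(pid, cores)
    return os.sched_getaffinity(pid)

# Measurement idea: pin the game to 1, 2, 3, ... cores in turn and
# record FPS plus per-core and GPU saturation at each step.
# set_affinity(game_pid, {0})        # 1 core
# set_affinity(game_pid, {0, 1})     # 2 cores
# set_affinity(game_pid, {0, 1, 2})  # 3 cores
```

On Windows the same thing can be scripted with psutil's `Process.cpu_affinity()`.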
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
Configuration in signature.
Resolution = 2560x1600
VSync Off
Results:
Intel Xeon x5660 @ 4.4GHz, 6 physical cores, Hyper Threading OFF.
1 cores 3D Vision = 13 fps, GPU1 @ 26% + GPU2 @ 24%, Total GPU Usage = 25%
2 cores 3D Vision = 27 fps, GPU1 @ 38% + GPU2 @ 37%, Total GPU Usage = 38%
3 cores 3D Vision = 38 fps, GPU1 @ 53% + GPU2 @ 54%, Total GPU Usage = 54%
4 cores 3D Vision = 41 fps, GPU1 @ 58% + GPU2 @ 58%, Total GPU Usage = 58%
5 cores 3D Vision = 41 fps, GPU1 @ 58% + GPU2 @ 58%, Total GPU Usage = 58%
6 cores 3D Vision = 41 fps, GPU1 @ 58% + GPU2 @ 58%, Total GPU Usage = 58%
1 cores 2D = 18fps, GPU1 @ 19% + GPU2 @ 21% (toggled off), Total GPU Usage = 20%
2 cores 2D = 34fps, GPU1 @ 36% + GPU2 @ 31% (toggled off), Total GPU Usage = 34%
3 cores 2D = 61fps, GPU1 @ 46% + GPU2 @ 48% (toggled off), Total GPU Usage = 47%
4 cores 2D = 73fps, GPU1 @ 55% + GPU2 @ 55% (toggled off), Total GPU Usage = 55%
5 cores 2D = 78fps, GPU1 @ 60% + GPU2 @ 60% (toggled off), Total GPU Usage = 60%
6 cores 2D = 82fps, GPU1 @ 63% + GPU2 @ 62% (toggled off), Total GPU Usage = 63%
Results show that The Witcher 3 is limited to 3 cores in 3D Vision / 3D Vision Discover.
There is no such limit in CM mode.
The GPUs are nowhere near saturated. If there were no 3D Vision CPU bottleneck, the results show one would easily be getting 82fps, not 41fps, in 3D Vision. That's a 50% decrease due to the CPU alone.
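The plateau can be checked mechanically - a small sketch of my own using the fps figures posted above:

```python
# Witcher 3 fps by core count, from the benchmark results above.
fps_3d = {1: 13, 2: 27, 3: 38, 4: 41, 5: 41, 6: 41}
fps_2d = {1: 18, 2: 34, 3: 61, 4: 73, 5: 78, 6: 82}

def gain(fps):
    """FPS gained by each additional core."""
    return {n: fps[n] - fps[n - 1] for n in range(2, 7)}

# 3D gains collapse after 3 cores (cores 5 and 6 add 0 fps),
# while 2D keeps gaining all the way through 6 cores.
```

The per-core gains make the 3-core wall in 3D obvious, while 2D shows no such wall.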
This isn't looking good, fellas. It is my humble opinion that we ought to:
1. Independently verify these findings - there could be something screwy with my heavily overclocked system, after all.
2. If the findings are verified, get to the bottom of this issue sooner rather than later, as it potentially affects all games; it'll only get worse going into the future :-(
3. Remember that with the monstrous GPU specs nowadays, once this CPU bottleneck is fixed, there should be no reason why every one of us can't play games in 3D at the same FPS as they run in 2D.
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
But Advanced Warfare uses only 2 cores - what would be the cause of that, if you have any ideas? I thought the idea was that it was using 3 cores.
I play games at 2560x1440, and so far The Witcher 3 has been GPU bound (it's always at 100% with a Pascal Titan X), but I haven't gotten back to Novigrad yet.
I will repeat the tests you have done, using the CPU affinity method, in the games I have (not Call of Duty).
I'm ishiki, forum screwed up my name.
7700k @4.7 GHZ, 16GBDDR4@3466MHZ, 2080 Ti
Regarding The Witcher 3: try the central Novigrad square at the lowest graphics and resolution settings, so the Titan XP isn't the bottleneck as you suggest :)
It would also help immensely if people would list their specs in their sigs.
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
However, once 3D is enabled I do see lower core utilisation, but it's still there!
Then again, even in 2D the game doesn't eat up my CPU anyway...
Not sure what to say...
Have you tried to see if there is a difference between 3D Vision and Anaglyph Mode as well?
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com
(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)
Thanks for looking into this helifax.
I can confirm the same problem with 3D Vision Discover.
CPU usage is 30% with 3D Vision on, and 42% with 3D Vision toggled off.
Interestingly, in CM mode CPU usage remains at 42%.
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
In Novigrad (the most CPU-demanding area), in 2D, I get 100% CPU usage on all 4 cores, to the extent that the game stutters because the system doesn't get what it needs. 74-85fps.
In 3D, 37-45fps, with CPU usage never reaching 100%. I think it hovers around 75% across all cores (so it uses 3 cores, I guess). I'll check it tonight.
In both cases I'm CPU limited, at 1080p (at 1440p I'm GPU limited).
CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: Gainward Phoenix 1080 GLH
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com
I would also recommend testing the newest Deus Ex: MD, as it seems to be a very multithread-heavy game that scales well up to a full 8 cores, and it has a great 3D Vision mode. Not sure if it is 3D Direct or 3D Automatic, or if that even matters.
AMD FX-8350 4GHz
Gigabyte 990FXA-UD3 Rev 4.0
G-Skill PC3-10700- 16GB
Gigabyte Windforce GTX 1060 OC 6GB - 417.01
Creative Soundblaster Z
ViewSonic VX2268WM Black 22" 1680x1050 5ms 120Hz 3Dvision
Windows 10 x64 1709
That's most probable, as you are on an FX-8350. AMD CPUs have much weaker IPC than Intel CPUs, so combined with the core bottleneck you would indeed get the symptoms you report. That's a huge 50% reduction in performance compared to Intel!
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
1- 1080p, 2D, 3Dmigoto disabled (82.5fps):
2- 900p, 3D, with 3Dmigoto enabled (46.2fps):
3- 900p, 2D after disabling 3D ingame, 3Dmigoto enabled (76fps) (I just noticed that I'm in a different place, but the CPU usage is the important part):
4- 900p, 3D, 3Dmigoto disabled (53.1fps):
5- 900p. 2D after disabling 3D ingame, 3Dmigoto disabled (83.8fps):
I hope that an overclocked i7 Cannonlake will be able to do 60fps in 3D perfectly next year...
CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: Gainward Phoenix 1080 GLH
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com