[quote="nimis"]
Almost the same setup; he/she has a UD3 board, mine is the high-end UD7. I only have Witcher 3 from that list, and it runs well, with the same CPU usage in 2D and 3D. I use adaptive sync.[/quote]
I don't have Witcher 3, so I cannot attest to how my system would perform in that game. Also, in all my tests I make sure to run with Vsync off, so I don't think running with adaptive Vsync would make a difference to my results.
[quote="RAGEdemon"]I've started a thread here:
https://forums.geforce.com/default/topic/966422/3d-vision/3d-vision-cpu-bottelneck-gathering-information-thread-/
Any suggestions welcome.
It's important that we all chip in to this, as the problem affects us all, if not now, then definitely in time.[/quote]
@RAGEdemon - Awesome job with the thread. I will add information from other games as I have time to test.
I recommend updating the instructions to also include Processor, Card, Driver, Game DX version, and OS. Also test with the 3D Vision driver disabled in the Control Panel.
While the thread is a good idea, I think your earlier approach, where you showed each core's usage and then set affinity, is a better way to do the testing.
In Mankind Divided, for instance, I don't get any CPU bottleneck: I see 95-99% usage on all CPU cores and 80-99% usage on the GPU, in 3D or in 2D.
But you obviously got a bottleneck, and you have a 6-core CPU.
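By the way, the affinity testing can be scripted instead of clicked through Task Manager each run. Windows' `start /affinity` switch takes a hex bitmask where bit i enables logical core i; a small sketch (the helper names here are my own, not from any tool mentioned in the thread):

```python
def affinity_mask(num_cores: int) -> int:
    # Bitmask with the low num_cores bits set: bit i enables logical core i.
    if num_cores < 1:
        raise ValueError("need at least one core")
    return (1 << num_cores) - 1

def start_with_cores(exe_path: str, num_cores: int) -> str:
    # Build a Windows 'start /affinity <hexmask>' command line for the game exe.
    # The empty "" is the window title argument 'start' expects before a quoted path.
    return f'start /affinity {affinity_mask(num_cores):X} "" "{exe_path}"'
```

For example, `affinity_mask(4)` is `0xF`, so `start /affinity F "" "game.exe"` launches the game pinned to cores 0-3; repeat with 1, 2, 3... cores and note the FPS each time.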
@RAGEdemon - I am still not 100% certain my issue is due to a CPU core limit bug. Here is why: on two of the games that prompted me to bring up my issue (Far Cry 3 / Arkham Origins), I was NOT able to reproduce the stuttering with 3D Vision disabled in the Control Panel by limiting the cores via Affinity.
In both games the FPS was under 10 on 1 core only, but once I enabled the second core both games saw the FPS shoot up, and I could not reproduce any stutter at all.
Quote from other thread:
[quote="terintamel"]
I assume you wanted me to test with 3D Vision disabled and then simulate running the game with different active cores to see if I could cause it to stutter?
I loaded up FC3 in 2D mode and then ran it enabling one core at a time, recording the average framerate and watching for stutter. To my surprise, I was not able to cause the game to stutter at all no matter how many cores I had active, except that the FPS on 1 core was unusable.
Cores active - Avg FPS
1 - 5 fps
2 - 38 fps
3 - 69 fps
4 - 70 fps
5 - 80 fps
6 - 80 fps
7 - 82 fps
8 - 83 fps
What does this tell you? It seems to tell me my FC3 stutter issues are not caused by a "weak" CPU.
I then loaded the game up with 3D Vision enabled but off in game. I ran the same tests, and all disabling cores did was lower my FPS; it did not lessen or worsen the stutter.
Tried 3D again, this time with every setting on LOW, no Vsync or AA, and a resolution of 1440x900. Framerate greatly improved. Stutter just as bad.[/quote]
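The quoted core-scaling table can be reduced to a per-core speedup ratio to make the point concrete: the game scales hard up to 2-3 cores and barely at all after that, which argues against a raw CPU-power limit. A quick sketch using those measured averages:

```python
# Avg FPS measured in FC3 (2D) with 1..8 cores enabled, from the table above.
fps = [5, 38, 69, 70, 80, 80, 82, 83]

# Relative gain from each additional core (fps with n+1 cores / fps with n cores).
gains = [round(b / a, 2) for a, b in zip(fps, fps[1:])]
# Core 1->2 is a huge jump; from 3 cores on, each extra core adds only a few percent.
print(gains)
```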
However, I do believe the issue could still be driver/CPU related, as if some thread important to 3D Vision is getting stalled, which then stalls the whole rendering pipeline and leads to stutter. I also still believe the bug was introduced in drivers after 337.88.
I am taking it as good news that Nvidia's response to my ticket is to now escalate it to a Level 2 technician. Progress :)
[quote="terintamel"]
I just got a response from Nvidia asking me to try older drivers. I went back to the oldest driver the 1060 allows and still got the same issue. Below is my response to Nvidia.
=========================
I tried the oldest Windows 10 driver that the GTX 1060 can use and the issue still persists. In an effort to make this issue clearer and easier for Nvidia to reproduce, I want to focus for now on just one game from my list: Batman: Arkham Origins. A game that has no business performing the way it does just from enabling the 3D Vision driver.
The reason I want to focus on Arkham Origins is that the hard stutter happens on my current system, and I know with 100% certainty it happened on my old system (on any driver after 337.88). That might make it a good way to prove that the issue is not solely caused by my current system and is in fact a driver bug, possibly introduced after 337.88.
Here are my thoughts. I ran Arkham Origins in 3D Vision at 1440x900 with all settings on high (no AA) in Windows 8, on a Phenom II X4 955BE CPU with a GTX 660 Ti and a driver below 337.88. The recommended CPU for the game is a Phenom II X4 965; my FX-8350 has multicore performance well above the 965, and its single-core performance is also moderately better.
I then upgraded to Windows 8.1 and moved to a new motherboard and CPU (FX-8350) but kept all the other components (RAM, sound, 660 Ti), and on any driver 337.88 or lower I still got NO hard stutter at all when gliding through the city in Arkham Origins.
I then upgraded to a driver above 337.88 and got the EXACT same STUTTER as I am seeing now in Win 10 on a GTX 1060 with driver 368.81 or higher.
Even if I drop my in-game settings to everything off and a resolution of 1280x720, the hard stutter remains, but only if 3D Vision is enabled (even if 3D is off in game).
Quick example -
1280x720 all settings to lowest value
With 3D Vision enabled, but off in game: standing on a rooftop I get 154.0 fps. Gliding between roofs (my video example) I drop as low as 22 fps with 10% GPU usage.
With 3D Vision disabled: standing on a rooftop I get 173.9 fps. Gliding between roofs (my video example) I drop as low as 71 fps with 44% GPU usage.
Here are some examples, including video footage showing the issue. Pay attention to CPU usage per core, framerate, and GPU usage.
1280x720 all settings to lowest (Trying to eliminate any CPU or GPU bottleneck)
Dx11 3dvision Disabled
Gliding
71.7 fps 30% GPU
Average Core usage 31.25%
https://www.youtube.com/watch?v=oeM9ANxhuII&feature=youtu.be
Dx11 3dvision Enabled but 3d off in game
Gliding
36.7 fps 13% GPU
Average Core usage 20.62%
https://www.youtube.com/watch?v=ujm7gse-_6c&feature=youtu.be
I think this is something Nvidia should be able to reproduce, if you so choose. I can even send you my save game files for this game.
I am no expert, but an idea that has been floating around the 3D Vision forums is that the 3D Vision driver has a serious multithreading issue: something in the driver prevents the CPU from feeding the GPU fast enough, which would lead to stutter, FPS drops, and lower-than-expected GPU usage. The stutter will be more pronounced on CPUs with lower instructions per clock, like AMD's, and might not be noticed as much on Intel CPUs with their vastly superior IPC.
If interested I can link to the forum thread.
===============================[/quote]
[quote="BlueSkyDefender"]terintamel, what power supply are you running with your setup? Also, how old is it?[/quote]
A Corsair CX650, I believe. It is around 4 years old. Keep in mind I had no issues with this power supply on my current system when I had a 660 Ti and a 650 Ti (dedicated PhysX) instead of the single 1060.
[quote="terintamel"][quote="BlueSkyDefender"]terintamel, what power supply are you running with your setup? Also, how old is it?[/quote]
A Corsair CX650, I believe. It is around 4 years old. Keep in mind I had no issues with this power supply on my current system when I had a 660 Ti and a 650 Ti (dedicated PhysX) instead of the single 1060.[/quote]
Yeah, I wonder what's going on with your system.
Here are my system specs:
CPU: 1055T running near 4GHz
MOBO: MSI krate 970a, with Komputer RAM, 2 sticks of 4GB so 8GB total
GPU: MSI Armor GTX 1070
PSU: Corsair HX 1050W
So I tested with your same settings in the same game and I don't get what you get. I don't think it's CPU related, since your CPU is better than mine.
My stutter in Arkham Origins is now gone.
[url]https://forums.geforce.com/default/topic/966422/3d-vision/3d-vision-cpu-bottelneck-gathering-information-thread-/post/4982099/#4982099[/url]
Ignore...
Problem I found wasn't related to 3D Vision driver.
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
I have just installed MSI Afterburner to see what happens, out of curiosity. I lowered the resolution to 1280x720 with everything in the Nvidia CP at default (so the configuration in the game decides).
Using 1280x720 @ 60fps cap gives me a surprising result:
- Nvidia 3D Vision OFF: 47% GPU usage and 62% average usage across 6 CPU cores (Xeon X5670 @ 4GHz). As a result I get a stable 60 FPS.
- Nvidia 3D Vision ON: 56% GPU usage and 48% average usage across 6 CPU cores. I get 44 FPS.
Using 1280x720 @ unlimited cap gives me:
- Nvidia 3D Vision OFF: 54% GPU usage and 55% average usage across 6 cores. I get 70 FPS.
- Nvidia 3D Vision ON: 65% GPU usage and 47% average usage across 6 CPU cores. I get 44 FPS.
So, 1st question: WHY DOES THE SYSTEM NEVER USE 100% OF THE RESOURCES TO GET MORE FPS? I mean, with an unlimited cap, isn't the system supposed to apply all possible resources to the game to give me as many FPS as possible? Why am I getting 44 FPS while using only 56% GPU and 47% CPU, with neither of them working at 100% (and thus supposedly creating a bottleneck for the other)?
2nd question: is there anything we can do to solve the problem? I have not read all the posts about this problem, and I have only tested with Witcher 3.
Still using Windows 7 64-bit, an MSI GTX 1070, and a Xeon X5670 CPU (still not convinced to upgrade... why would I, if apparently only 50% of the resources are being used?).
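On the "why never 100%" question: one plausible explanation (my assumption, not an established answer in this thread) is that an average across cores hides one saturated thread. If the game or driver serializes its work on a single thread, that core pegs at 100% while the average stays below 50%, yet the whole pipeline is still CPU-limited. Toy numbers:

```python
# Hypothetical per-core usage: one render/driver thread pegged, the rest lightly loaded.
cores = [100, 35, 30, 25, 20, 15]  # six logical cores, percent busy

average = sum(cores) / len(cores)
print(average)      # -> 37.5, the "CPU usage" a monitoring overlay would report
print(max(cores))   # -> 100, the core that is actually the bottleneck
```

This is why per-core graphs in Afterburner are more useful than the overall CPU percentage for diagnosing this kind of stall.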
OK, I'll paste the same thing I wrote in the gathering-information thread, in case somebody here knows what is happening.
Sorry... is there any final conclusion about what happens with the CPU cut when activating 3D?
My opinion (I don't have a lot of knowledge about how CPU/GPU load works):
1) I have read several times that stereo ON = double the GPU load and a 0% increase in CPU load. If that were correct, the FPS in a game played in 3D without any kind of CPU bottleneck should be exactly 1/2, but we know that does not happen; usually we lose less than that, depending on the game. So that claim seems not to be true, and seems to depend on many other things.
2) I thought I could tell a GPU is fully stressed when MSI Afterburner shows it working at 100% usage (and that seems to be right), and I assumed the same applied to the CPU. But I am reading today that that is not true: it depends on the game and on how the CPU handles the different threads. So a CPU can be the cause of a bottleneck even while MSI Afterburner tells me it is only at 55% usage (I really don't understand any of this).
3) Testing The Witcher 3, there is something weird I don't understand. Using 1280x720 @ unlimited cap gives me:
- Nvidia 3D Vision OFF: 54% GPU usage and 55% average usage across 6 cores. I get 70 FPS.
- Nvidia 3D Vision ON: 65% GPU usage and 47% average usage across 6 CPU cores. I get 44 FPS.
And the question, for me obviously, is: what is happening with the CPU and GPU that they only reach 54% and 47% while giving me just 44 FPS? Is it true that the CPU bottleneck in this scenario sits at only 47% (with 3D) and 55% (with 2D)? There must be something else, because if I raise the game's resolution a bit, the GPU is still not stressed to 100% and I get fewer FPS. Theoretically that should not happen, because higher resolution only adds GPU load, and the CPU would be causing exactly the same bottleneck as before.
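One way to reason about point 3 is in frame times instead of percentages: 44 FPS means roughly 22.7 ms per frame, and at 65% GPU usage the GPU is only busy about 14.8 ms of that, so around 8 ms of every frame the GPU is idle, waiting on something (presumably the CPU/driver). A sketch of that arithmetic:

```python
def frame_ms(fps: float) -> float:
    # Frame budget in milliseconds for a given frame rate.
    return 1000.0 / fps

fps, gpu_usage = 44.0, 0.65    # Witcher 3 numbers with 3D Vision ON, from above
total = frame_ms(fps)          # total time per frame
gpu_busy = total * gpu_usage   # portion of the frame the GPU is actually working
stall = total - gpu_busy       # portion the GPU spends idle, waiting to be fed
print(round(total, 1), round(gpu_busy, 1), round(stall, 1))  # -> 22.7 14.8 8.0
```

Seen this way, "only 65% GPU usage" at a fixed 44 FPS is exactly what a per-frame CPU/driver stall would look like.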
AMD FX-8350 4GHz
Gigabyte 990FXA-UD3 Rev 4.0
G-Skill PC3-10700- 16GB
Gigabyte Windforce GTX 1060 OC 6GB - 417.01
Creative Soundblaster Z
ViewSonic VX2268WM Black 22" 1680x1050 5ms 120Hz 3Dvision
Windows 10 x64 1709
I'm ishiki; the forum screwed up my name.
7700k @4.7 GHZ, 16GBDDR4@3466MHZ, 2080 Ti
Definitely something funky going on with the drivers or the Pascal architecture.
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
- Windows 7 64bits (SSD OCZ-Vertez2 128Gb)
- "ASUS P6X58D-E" motherboard
- "MSI GTX 660 TI"
- "Intel Xeon X5670" @4000MHz CPU (20.0[12-25]x200MHz)
- RAM 16 Gb DDR3 1600
- "Dell S2716DG" monitor (2560x1440 @144Hz)
- "Corsair Carbide 600C" case
- Labrador dog (cinnamon edition)