3D Vision CPU Bottleneck: Gathering Information thread.
Actually, yes.
[color="green"]Hi Shahzad,
[u][b]Our performance lab ran through a similar test and confirmed that we are observing a 10%-20% reduction in overall CPU usage with stereo enabled.[/b][/u] However, we are not observing an FPS drop as low as what your results show with stereo enabled. We always get more than 50% of the FPS compared to stereo disabled. We used an almost identical hardware configuration in our testing, matching CPU, RAM, etc.
[u][b]The reduced CPU usage is definitely something we will need to analyze further[/b][/u], but development is more concerned with the FPS drop. They will need your help with more specific information so they can try to replicate the FPS drop scenario. Development would like to focus on GTA V, since that is the title your results were captured in:
1. Please confirm the exact system configuration used to capture the GTA V results.
2. Please provide the exact in-game settings as well as the NVIDIA Control Panel settings used for collecting the GTA V performance data.
3. Please provide the exact GTA V location (level, map) where the performance data was collected. A video may help us find the exact location in house. With this info we will try to play at the same location and compare the FPS.
4. Please use the GPUView tool to capture an event trace log while in the failing state. The log will be very helpful to development in analyzing why FPS is reduced with stereo enabled. More on how to use GPUView to capture an event log here: http://nvidia.custhelp.com/app/answers/detail/a_id/3507/~/generating-an-event-trace-log-for-gpuview
Thanks in advance for the details.
Best regards,
Ray
[/color]
[color="orange"]Hi Ray,
Thank you kindly for getting back to us.
I have provided the data below, but before that we would like to make a clarification:
From your comments, our understanding is that the development team are implying that 'it is perfectly normal for 3D Vision to cause a 50% drop in FPS'. They see nothing wrong with this, and so are trying to ascertain scenarios where the FPS drop is larger than 50% for further investigation.
Please note that the point of contention isn't necessarily that 3D Vision makes the FPS drop by more than 50% (as seems to be the implication of the development team's queries) - that is an oversimplification - we understand perfectly well that 3D Vision puts double the strain on the GPU.
Our contention is that 3D Vision unfortunately causes CPU and GPU usage to drop dramatically compared to 2D, which causes a huge drop in performance. We contend that if CPU and GPU usage did not drop when 3D Vision is toggled ON, the game FPS would be far greater. In scenarios where the GPU isn't bottlenecked in either 2D or 3D, toggling 3D Vision on should leave CPU usage the same as in 2D, while GPU usage should roughly double, to account for rendering the second view. In such scenarios, the 2D and 3D Vision FPS should be similar.
The development team say that they do not observe more than a 50% drop in FPS. But do they not also observe a dramatic decrease in GPU usage (alongside the confirmed drop in CPU usage)? I hope they can acknowledge that if GPU and CPU usage did not drop so strikingly (GPU usage should in fact increase while CPU usage stays the same), the 3D Vision FPS would be a lot higher.
In this context, the development team's apparent position that 'it is normal for 3D Vision to cause a 50% drop in FPS' is unfortunately not accurate :(
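The argument above amounts to a min() model of frame rate. The sketch below is a toy illustration with made-up capacity numbers (not measured data); `expected_fps` and its caps are hypothetical names for the idea, not anything from NVIDIA's tooling:

```python
def expected_fps(cpu_fps_cap, gpu_fps_cap, stereo=False):
    """FPS is capped by the slower of the CPU and GPU per-frame work.

    Ideally, stereo doubles only the GPU cost; the CPU cost is unchanged.
    """
    gpu_cap = gpu_fps_cap / 2 if stereo else gpu_fps_cap
    return min(cpu_fps_cap, gpu_cap)

# Hypothetical caps: the CPU can feed 95 FPS; the GPUs could render 250 FPS.
print(expected_fps(95, 250))               # 2D: 95 (CPU-bound)
print(expected_fps(95, 250, stereo=True))  # ideal 3D: 95, still CPU-bound
# An observed 38 FPS in 3D would therefore imply extra CPU-side stereo cost.
```

Under this model, any 3D FPS well below the 2D figure while the GPUs sit idle points at added per-frame CPU work, which is exactly the contention.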
Regarding your questions:
1.
Exact system configuration attached in file "SHAHZAD-PC.txt".
GPU: 2x GTX 1080 in SLi with HB Bridge, both @ PCIe x16 2.0
Note 1: The CPU is OC'd to 4.4 GHz, but this makes no difference to the points in contention.
Note 2: The CPU is a 6-core, 12-thread Xeon, which you might find difficult to get hold of. It is identical in every practical way to an i7 980X, which you should be able to use in its place.
2.
All GTA V settings at default. Resolution: 2560x1600 @ 120 Hz, VSync ON, 3D Vision ON.
NVIDIA Control Panel settings: DSR 4.00x, texture filtering "High Quality". SLi enabled via the High Bandwidth bridge.
3.
After the prologue, you start the game as the protagonist Franklin (the black gentleman), in an alley. If you run straight ahead for 5 seconds, you are standing on a highway. Stand on the striped lines between the outside and middle traffic lanes and face left so you are viewing the long highway directly at oncoming traffic - traffic on both lanes should pass either side of you.
4.
Please find the file Merged.zip (Size: ~500 MB) on the following google drive link:
https://drive.google.com/drive/folders/0B--vBaCOZKVeTjFZaDE1TldHdVU?usp=sharing
The loading of the game takes a while but once in, I take ~10 seconds to run to the test spot with 3D Vision ENABLED. The 3D FPS here is 38 FPS and GPU usage is 37% on each GPU. I wait for ~10 seconds and then toggle 3D Vision OFF.
The 2D FPS is 95 FPS and the GPU usage has increased to 58% on each GPU. I wait for ~10 seconds and then toggle 3D Vision ON.
The FPS again jumps down to 38 FPS and GPU usage is back down to 37% on each GPU. I wait for ~10 seconds and then exit the game.
If there is anything else that I can help you guys with, please let me know.
Kind regards,
-- Shahzad
[/color]
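Assuming GPU utilisation scales roughly linearly with frame rate (only an approximation; clocks and boost states muddy it), the measurements quoted above already show the GPUs are not the limiter. A quick sketch (`gpu_saturated_fps` is an illustrative helper, not a real tool):

```python
def gpu_saturated_fps(observed_fps, gpu_usage_pct):
    # Extrapolate the FPS the GPUs could deliver if driven to 100% usage.
    return observed_fps * 100.0 / gpu_usage_pct

print(round(gpu_saturated_fps(38, 37)))  # ~103 FPS possible in 3D
print(round(gpu_saturated_fps(95, 58)))  # ~164 FPS possible in 2D
# In both modes the GPUs have ample headroom, so something else caps the FPS.
```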
[color="green"]
Thanks for the update and additional details. I will add these to the bug report and let development review/comment.
Best regards,
Ray
[/color]
As we have seen, it is quite a challenge to bring even the professionals up to speed on performance metrics and their meaning. I think the problem is that people are used to dealing with kid gamers, not us grownup professionals with a technical background in various engineering disciplines who just happen to have gaming as a hobby.
Not only do we have to explain the problem in detail, but we have to also explain what the data means, and more importantly, how to interpret that information correctly.
They are clever folk. We just need to ensure that they don't write off the problem as something it has proven not to be, as is the tendency.
Some progress is being made however, for which I am glad.
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
A further update:
[color="orange"]Hi Ray,
We would like to check in. Have you guys made any progress on the '3D Vision causing low CPU and GPU usage' investigation in the last month? :)
Kind regards,
-- Shahzad.[/color]
[color="green"]Hi Shahzad,
Unfortunately I don't have much to update. [u][b]The last update from the developer assigned to investigate was that internal performance data did confirm multiple causes for low FPS with stereo enabled (high cross-GPU transfer time; a wait on the CPU thread until the copy completes; the game thread spending more time on the CPU in the stereo case)[/b][/u], and that they will begin looking into these cases. It appears they have begun their investigation, but there is nothing yet in terms of details of the cause or any possible solution.
Best regards,
Ray
[/color]
I don't believe the cross-GPU copy issue to be a huge deal, as it is an SLi-only issue, whereas we have witnessed the problem in both SLi and non-SLi setups. I am using a High Bandwidth SLi bridge with PCIe 2.0 @ x16 on both cards, and the PCIe bus performance metrics show that it is nowhere near saturated during testing (it actually remains close to 10% usage, IIRC) - if they can fix this, then great. The second problem, the "game thread spending more time on the CPU in the stereo case", matches up with the hypothesis in the first post of this thread, however, which is intriguing. Let's hope something is found, and ultimately fixed...
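For what it's worth, the ~10% PCIe figure is roughly consistent with a back-of-envelope estimate. The sketch below uses assumed numbers (~8 GB/s effective for PCIe 2.0 x16, a 32-bit colour buffer, one eye's framebuffer copied per output frame at the observed 38 FPS), so treat it as an illustration rather than a measurement:

```python
# Back-of-envelope PCIe load for a cross-GPU framebuffer copy (all assumed).
width, height, bytes_per_pixel = 2560, 1600, 4
frame_bytes = width * height * bytes_per_pixel   # 16,384,000 bytes (~16.4 MB)
traffic = frame_bytes * 38                       # bytes/second across the bus
pcie_bw = 8e9                                    # effective bytes/sec, assumed
print(f"{traffic / 1e6:.0f} MB/s -> {100 * traffic / pcie_bw:.1f}% of the bus")
```

That lands in the high single digits, close to the ~10% observed.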
[quote="RAGEdemon"]A further update:
[color="orange"]Hi Ray,
We would like to check in. Have you guys made progress over the '3D Vision causing low CPU and GPU usage' investigation in the last month? :)
Kind regards,
-- Shahzad.[/color]
[color="green"]Hi Shahzad,
Unfortunately I don't have much to update. [u][b]The last update from the developer assigned to investigate was that internal performance data did confirm multiple causes for low FPS with stereo enabled (high cross-GPU transfer time; a wait on the CPU thread until the copy completes; the game thread spending more time on the CPU in the stereo case)[/b][/u], and that they will begin looking into these cases. It appears they have begun their investigation, but there is nothing yet in terms of details of the cause or any possible solution.
Best regards,
Ray
[/color]
I don't believe the cross-GPU copy issue to be a huge deal, as it is an SLi-only issue, whereas we have witnessed the problem in both SLi and non-SLi setups. I am using a High Bandwidth SLi bridge with PCIe 2.0 @ x16 on both cards, and the PCIe bus performance metrics show that it is nowhere near saturated during testing (it actually remains close to 10% usage, IIRC) - if they can fix this, then great. The second problem, the "game thread spending more time on the CPU in the stereo case", matches up with the hypothesis in the first post of this thread, however, which is intriguing. Let's hope something is found, and ultimately fixed...
[/quote]
All SLI bridges transfer data on that bus at 1 GB/s (or 2 GB/s for SLI HB). It doesn't matter if it is newer or older; only the pixel clock differs.
[url]https://en.wikipedia.org/wiki/Scalable_Link_Interface[/url]
Lots of other data is transferred over the PCIe lanes. A "transfer" doesn't necessarily mean any "external" bus is used. It could be "inside the GPU", as GPUs are designed to run in parallel. Just an idea ;) I don't know exactly what they are referring to, as their description is very vague and could mean lots of things. But it definitely sounds like a multi-threading lock issue (somewhere)...
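At the bandwidths quoted above, a per-frame framebuffer copy is not free either way. A rough sketch (illustrative only; it ignores protocol overheads and assumes a full 2560x1600 32-bit eye buffer is pushed per frame):

```python
# Time to move one eye's framebuffer across the bridge, per the figures above.
frame_bytes = 2560 * 1600 * 4                   # one 32-bit eye, ~16.4 MB
bridge_bw = {"SLI": 1e9, "SLI HB": 2e9}         # bytes/sec, as quoted above
copy_ms = {name: 1000 * frame_bytes / bw for name, bw in bridge_bw.items()}
for name, ms in copy_ms.items():
    # Compare against the ~10.5 ms frame budget at 95 FPS.
    print(f"{name}: {ms:.1f} ms per frame")
```

So if a naive full-frame copy really did go over the bridge every frame, it alone could eat most of a frame budget, which is presumably why the cross-GPU transfer time showed up in NVIDIA's profiling.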
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
I hope nvidia does something regarding this issue, but it is very unlikely.
Not expecting anything good from them lately. It seems they dropped the project to make 3D Vision work with VR headsets, so I guess 3D Vision is dead to them; my feeling is that things are just going to get worse with time. My hope is on VR now to push devs and games to support native S3D in SBS.
Is anyone running a single GTX 1070 who could give me some FPS results in some games to compare with my system?
I want to be sure that my low FPS is related to my CPU before I go ahead and do an upgrade, and I wanted to know how big the gap is from my system to one with a better CPU. (I still need to try overclocking the memory and CPU, but it is just a pain...)
If anyone can post results for any games, I would really appreciate it, as I can try to test the same games here.
At the moment I am playing Quantum Break, and the FPS keeps dropping below 30 at 720p (I don't mind 30, but below it is annoying to the point of considering 2D or SuperDepth3D).
BF1 had some drops below 30 too, but very few, and it was very playable.
Another game I have just started is Just Cause 3, and the FPS also drops under 30. It doesn't matter whether the game runs at 720p or 1080p. The thing that makes me angry is that in 2D the FPS is never below 60 and varies between 70 and 120 in this game.
I am starting to think that FPS in 3D Vision depends on the game and on luck; it doesn't matter whether it is a new game or an old game in some cases. It is a roulette for me, but new games obviously have a higher chance of bad performance.
I just finished the DLC of Rise of the Tomb Raider in side by side (not 3D Vision) and it was perfect: 60 FPS with some drops to 50, and I get similar performance in 2D to the benchmarks of systems with much better processors that I have compared against. So if it wasn't for 3D Vision, I would never consider a CPU upgrade, as the gain in 2D or in games with native 3D is too little to justify it.
EVGA GTX 1070 FTW
Motherboard MSI Z370 SLI PLUS
Processor i5-8600K @ 4.2 | Cooler SilverStone AR02
Corsair Vengeance 8GB 3000Mhz | Windows 10 Pro
SSD 240gb Kingston UV400 | 2x HDs 1TB RAID0 | 2x HD 2TB RAID1
TV LG Cinema 3D 49lb6200 | ACER EDID override | Oculus Rift CV1
Steam: http://steamcommunity.com/id/J0hnnieW4lker Screenshots: http://phereo.com/583b3a2f8884282d5d000007
Easily test it yourself mate, with instructions from the first post [MSI Afterburner OSD section].
CPU = massively serial - game work is very difficult to split across more than one thread.
GPU = massively parallel - the opposite of a CPU in most regards.
If your GPU is running at <90% when you experience unacceptable FPS, then it's your CPU bottlenecking.
I can practically guarantee that this will be the case, as your CPU isn't great.
Perhaps you might want to wait for Zen/Kaby Lake release in January before choosing a CPU?
Worst case scenario, it'll bring all current gen CPU prices down.
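The "GPU under 90% means CPU bottleneck" rule of thumb above can be checked with a small script. A minimal sketch, assuming `nvidia-smi` is installed and on the PATH; the 90% threshold is the heuristic from this post, not an NVIDIA figure, and the helper names are mine:

```python
import subprocess

def gpu_utilisation():
    # One utilisation sample per GPU, e.g. [37, 37] on the SLI rig above.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True).stdout
    return [int(line) for line in out.splitlines() if line.strip()]

def likely_cpu_bound(samples, threshold=90):
    # Rule of thumb from the post above: every GPU well under saturation
    # while FPS is unacceptable points at the CPU as the limiter.
    return all(u < threshold for u in samples)

if __name__ == "__main__":
    usage = gpu_utilisation()
    verdict = "likely CPU-bound" if likely_cpu_bound(usage) else "GPU-bound"
    print(usage, "->", verdict)
```

Run it while the game is in the slow spot; MSI Afterburner's OSD will give you the same numbers on screen, per the first post.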
You can also try overclocking your CPU as an experiment to see if it helps your game scenarios. If you are truly CPU bound and not something else, it should move the needle.
Another possibility is to use the new VRMark to compare your current system to a possible future system. The reason VRMark is interesting is because it's a dual-view benchmark and quite a lot closer to a 3D Vision type benchmark. 3DMark and the other 2D stuff is not interesting. VR tends to be CPU bound.
You can run the free version, and compare to online results to get a feel for how much difference it might make.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
[quote="RAGEdemon"]Easily test it yourself mate, with instructions from the first post [MSI Afterburner OSD section].
CPU = massively serial - very difficult to do more than one thread.
GPU = massively parallel, opposite of a CPU in most regards.
If your GPU is running at <90% when you experience unacceptable FPS, then it's your CPU bottlenecking.
I can practically guarantee that this will be the case, as your CPU isn't great.
Perhaps you might want to wait for Zen/Kaby Lake release in January before choosing a CPU?
Worst case scenario, it'll bring all current gen CPU prices down.
[/quote]
I asked a friend to test BF1 in 3D, as he also has a GTX 1070 and a better processor. The FPS for him was around 70 to 100, better than mine at around 30 to 45. I will take your advice and wait for the release of the new processors; perhaps I can get a better price on an i7 6700.
For now, I will try an overclock to see if I can get a few more FPS, so I can just make it to a minimum of 30 FPS in recent games. In case the overclock doesn't make any difference, I will have to put the new titles at the bottom of my games playlist :(
Update: Not much progress so far...
[color="orange"]Hi Ray,
We hope that you guys are making good progress in the investigation! Is there any news on this front?
Happy Holidays!
Kind regards,
-- Shahzad
Cambridge
UK[/color]
[color="green"]
Hi Shahzad,
Sorry for the delay in getting back to you; I've been out of the office for the Holidays. I checked the status of the bug and unfortunately don't see any update since my last reply. The bug is still assigned to a developer to investigate, but there has been no update yet. I'm afraid I don't have much visibility into developer schedules or priorities, so I'm sorry I don't have more info.
Best regards,
Ray
[/color]
It's worrying that we're talking about "priorities". I am sure nVidia devs have their hands full, and 3D Vision isn't even on nVidia's radar any more. At least they have recognised it as a pretty major bug and are actively trying to fix it. If this does get fixed, we can expect huge performance increases across all games, assuming of course that GPU saturation isn't the bottleneck in a given game. Let's keep our fingers crossed... :)
I have a little favor to ask, RAGEdemon, because you are in contact with Nvidia. Can you make them know about the G-Sync + ULMB trick?
I opened a support ticket a month ago, but either the ticket got forgotten at level 2 support, or they are ignoring me, or it didn't go where it should have.
I have already commented about the trick in the two most recent driver threads, but not even users were curious, except for one person who didn't comment back after an answer.
The info about the trick is here: https://www.reddit.com/r/nvidia/comments/553q20/gsync_ulmb_at_the_same_time_proof_and_howto/
No offense masterotaku, but I would advise RAGEdemon not to dilute the support he's getting on his matter with another. I know all too well how support staff are always looking for an easy way out of difficult matters, so what I would foresee is him raising this matter, and then them somehow dropping his issue, probably by providing some sort of generic response to your inquiry and claiming they've resolved his by doing so (and, in effect, not resolving anything). I would suggest that you follow his example and be diligent in following up on your own inquiry until it gets the attention you feel it deserves.
3D Gaming Rig: CPU: i7 7700K @ 4.9Ghz | Mobo: Asus Maximus Hero VIII | RAM: Corsair Dominator 16GB | GPU: 2 x GTX 1080 Ti SLI | 3xSSDs for OS and Apps, 2 x HDD's for 11GB storage | PSU: Seasonic X-1250 M2| Case: Corsair C70 | Cooling: Corsair H115i Hydro cooler | Displays: Asus PG278QR, BenQ XL2420TX & BenQ HT1075 | OS: Windows 10 Pro + Windows 7 dual boot
Sorry about that. I didn't mean to derail this thread, which is about a very important issue that affects me like all other 3D Vision users. I'll try to press Nvidia through the support ticket instead of putting other people under pressure. It feels weird that something this huge (G-Sync+ULMB. The biggest monitor user discovery since G-Sync was created?) gained so little attention over the internet.
Again, sorry for this and I hope the CPU usage thing gets solved or has a real reason to be so demanding (I prefer it to be a bug that can be fixed).
Thanks for the input DJ-RK.
No worries masterotaku. As DJ-RK says, I would try to contact them again. There is only so much that support can do except escalate a new feature request to the higher ups, who then have to decide what to do. Perhaps you would like to create a thread on this forum where we can discuss your excellent findings?
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
Sorry, I got bored on page 3 reading an argument and didn't read any further. Is there any final conclusion about what causes the CPU usage cut when activating 3D?
My opinion (I don't have much knowledge of how CPU/GPU load works):
1) I have read several times that stereo ON = double the GPU load and 0% increase in CPU load. If that were correct, the FPS in a game (played in 3D) without any kind of CPU bottleneck should be exactly half, but we know that doesn't happen; we usually lose less than that, depending on the game. So the claim doesn't seem to be true, or it depends on many other things.
2) I thought I could tell a GPU was fully stressed if MSI Afterburner showed it working at 100% usage (and that seems to be right), but I also thought I could judge CPU stress the same way. I'm reading today that this isn't true: it depends on the game and how it spreads work across threads. That means a CPU can be the cause of a bottleneck even while MSI Afterburner reports only 55% CPU usage (I really don't understand any of this).
3) Testing The Witcher 3, there is something weird I don't understand. Using 1280x720 with an unlimited frame cap gives me:
- NVIDIA 3D Vision OFF: 54% GPU usage and 55% average usage across 6 cores. I get 70 FPS.
- NVIDIA 3D Vision ON: 60% GPU usage and 47% average usage across 6 CPU cores. I get 44 FPS.
The question, for me obviously, is what is happening with the CPU and GPU that they sit at only 54% and 47% while giving me just 44 FPS. Is it true that the CPU bottleneck in this scenario sits at only 47% (with 3D) and 55% (with 2D)? There must be something else, because if I raise the game resolution a bit, the GPU is still not stressed to 100% and I get fewer FPS. Theoretically that shouldn't happen, because higher resolution only adds GPU load, and the CPU should be causing exactly the same bottleneck as before.
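Regarding point 2, the arithmetic below sketches how a single saturated thread can hide behind a low average CPU figure (hypothetical numbers, not readings from this thread):

```python
# Sketch: why an "average CPU usage" figure can hide a one-thread bottleneck.
# Hypothetical per-core readings for a 6-core CPU whose main game thread
# saturates one core while the other cores idle along.
per_core = [100, 55, 40, 35, 30, 20]  # percent; core 0 runs the main thread

average = sum(per_core) / len(per_core)
print(f"average CPU usage: {average:.0f}%")    # ~47% -- looks far from busy
print(f"busiest core:      {max(per_core)}%")  # 100% -- the real limiter

# A tool reporting only the average would show ~47%, yet frames cannot be
# issued faster than the saturated core allows, so the CPU is the bottleneck.
```

This is why an average of 55% (2D) or 47% (3D) is perfectly compatible with a CPU-bound game.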
Individual core usage among the six cores of my Xeon X5670 is dancing around all the time, so I always focus on the average CPU figure. I have another simple question: has anybody checked whether there is any recommendation about enabling or disabling Hyper-Threading in the BIOS when using NVIDIA 3D Vision? I have it disabled now, but I don't know if that is a good idea.
Another thing: I used the "trick" to force the system to use all the cores. Supposedly it is a big performance gain, but couldn't it be an obstacle for 3D Vision in some games? I mean, maybe using only one CPU core gives more performance in some scenarios (I know it sounds strange, but I would like to be sure, because the first thing I don't understand is why Windows 7 does not use all the cores by default).
Hi Shahzad,
Our performance lab ran a similar test and confirmed that we are observing a 10% - 20% reduction in overall CPU usage with stereo enabled. However, we are not observing an FPS drop as low as your results show with stereo enabled; we always get more than 50% of the stereo-disabled FPS. We used an almost identical hardware configuration in our testing, matching CPU, RAM, etc.
The reduced CPU usage is definitely something we will need to analyze further, but development is more concerned with the FPS drop. They will need your help with more specific information so they can try to replicate the FPS drop scenario. Development would like to focus on GTA V:
1. Please confirm the exact system configuration used to capture the GTA V results.
2. Please provide the exact in-game settings as well as the NVIDIA Control Panel settings used when collecting the GTA V performance data.
3. Please provide the exact GTA V location (level, map) where the performance data was collected. A video may help us find the exact spot in-house. With this info we will try to play at the same location and compare FPS.
4. Please use the GPUView tool to capture an event trace log while in the failing state. The log will be very helpful to development in analyzing why FPS is reduced with stereo enabled. More on how to use GPUView to capture an event log here: http://nvidia.custhelp.com/app/answers/detail/a_id/3507/~/generating-an-event-trace-log-for-gpuview
Thanks in advance for the details.
Best regards,
Ray
Hi Ray,
Thank you kindly for getting back to us.
I have provided the data below, but before that we would like to make a clarification:
From your comments, our understanding is that the development team are implying that 'It is perfectly normal for 3D Vision to cause a 50% drop in FPS'. They see nothing wrong with this, so are trying to ascertain scenarios where the FPS drop is larger than 50% for further investigation.
Please note that the point of contention isn't simply that 3D Vision makes the FPS drop by more than 50% (as your queries from the development team seem to imply); that is an oversimplification. We understand perfectly well that 3D Vision puts double the strain on the GPU.
Our contention is that 3D Vision unfortunately causes CPU and GPU usage to drop dramatically compared to 2D, which causes a huge drop in performance. We contend that if CPU and GPU usage did not drop when 3D Vision is toggled on, the game's FPS would be far greater. Our understanding is that, in scenarios where the GPU is not the bottleneck in either 2D or 3D, toggling 3D Vision on should leave CPU usage the same as in 2D while roughly doubling GPU usage (to account for rendering two views). In such scenarios, the 2D and 3D Vision FPS should be similar.
The development team say that they do not observe more than a 50% drop in FPS. Do they not observe a dramatic decrease in GPU usage (as well as the confirmed drop in CPU usage)? I hope they can acknowledge that indeed if the GPU usage (and CPU usage) did not drop so strikingly (GPU usage should in fact increase while CPU usage stays the same), the 3D Vision FPS would be a lot higher.
In this context, the apparent understanding of the development team that 'It is normal for 3D Vision to cause a 50% drop in FPS' is unfortunately not accurate :(
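To put this expectation into simple frame-time terms, here is a toy model (illustrative numbers only, not measurements from our tests): each frame costs roughly max(CPU time, GPU time), and stereo should double only the GPU term.

```python
# Toy model: a frame is ready when both its CPU work and GPU work are done,
# so frame time ~= max(cpu_ms, gpu_ms). Numbers are purely illustrative.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 10.0  # per-frame CPU cost (game logic, draw calls)
gpu_ms = 6.0   # per-frame GPU cost for a single view

fps_2d = fps(cpu_ms, gpu_ms)            # CPU-bound: 100 FPS
fps_3d_ideal = fps(cpu_ms, 2 * gpu_ms)  # stereo doubles only the GPU term
print(fps_2d, fps_3d_ideal)             # 100.0 and ~83.3

# In a CPU-bound scene the ideal stereo penalty is well under 50%, because
# only the GPU term doubles. Dropping far below half of fps_2d therefore
# implies extra CPU-side cost or stalls introduced by the stereo path.
```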
Regarding your questions:
1.
Exact system configuration attached in file "SHAHZAD-PC.txt".
GPU: 2x GTX 1080 in SLi with HB Bridge, both @ PCIe x16 2.0
Note 1: CPU is being OC'd to 4.4 GHz but this doesn't make any difference to the points in contention.
Note 2: The CPU is a 6-core, 12-thread Xeon which you might find difficult to get hold of. It is identical in every practical way to an i7 980X, which you should be able to use in its place.
2.
All GTA5 settings at default. Resolution: 2560x1600, 120Hz, VSync ON, 3D Vision ON.
NVIDIA CP settings: DSR 4.00x, texture filtering "High Quality". SLI enabled via High Bandwidth bridge.
3.
After the prologue, you start the game as the protagonist Franklin (the black gentleman), in an alley. If you run straight ahead for 5 seconds, you are standing on a highway. Stand on the striped lines between the outside and middle traffic lanes and face left so you are viewing the long highway directly at oncoming traffic - traffic on both lanes should pass either side of you.
4.
Please find the file Merged.zip (Size: ~500 MB) on the following google drive link:
https://drive.google.com/drive/folders/0B--vBaCOZKVeTjFZaDE1TldHdVU?usp=sharing
The loading of the game takes a while but once in, I take ~10 seconds to run to the test spot with 3D Vision ENABLED. The 3D FPS here is 38 FPS and GPU usage is 37% on each GPU. I wait for ~10 seconds and then toggle 3D Vision OFF.
The 2D FPS is 95 FPS and the GPU usage has increased to 58% on each GPU. I wait for ~10 seconds and then toggle 3D Vision ON.
The FPS again jumps down to 38 FPS and GPU usage is back down to 37% on each GPU. I wait for ~10 seconds and then exit the game.
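For what it's worth, those usage figures can be turned into a rough estimate of untapped GPU headroom (a back-of-the-envelope sketch, assuming GPU usage scales roughly linearly with frame rate):

```python
# Back-of-the-envelope: the frame rate each state implies the GPUs could
# sustain at 100% usage, if usage scaled linearly with frame rate.
def implied_fps_at_full_load(fps, gpu_usage_pct):
    return fps * 100.0 / gpu_usage_pct

print(implied_fps_at_full_load(95, 58))  # 2D run:  ~164 FPS worth of 2D frames
print(implied_fps_at_full_load(38, 37))  # 3D run:  ~103 FPS worth of stereo frames

# Even granting that each stereo frame costs roughly double the GPU work,
# 38 FPS at 37% usage leaves the GPUs far from saturated -- the missing
# performance is being lost somewhere other than raw GPU throughput.
```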
If there is anything else that I can help you guys with, please let me know.
Kind regards,
-- Shahzad
Thanks for the update and additional details. I will add these to the bug report and let development review/comment.
Best regards,
Ray
As we have seen, it is quite a challenge to bring even the professionals up to speed on performance metrics and their meaning. I think the problem is that people are used to dealing with kid gamers, not us grownup professionals with a technical background in various engineering disciplines who just happen to have gaming as a hobby.
Not only do we have to explain the problem in detail, but we have to also explain what the data means, and more importantly, how to interpret that information correctly.
They are clever folk. We just need to ensure that they don't write off the problem as something it has proven not to be, as is the tendency.
Some progress is being made however, for which I am glad.
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
Hi Ray,
We would like to check in. Have you guys made progress over the '3D Vision causing low CPU and GPU usage' investigation in the last month? :)
Kind regards,
-- Shahzad.
Hi Shahzad,
Unfortunately I don't have much to update. The last update from the developer assigned to investigate was that internal performance data did confirm multiple causes for low FPS with stereo enabled (high cross-GPU transfer time; the CPU thread waiting until the copy completes; the game thread spending more time on the CPU in the stereo case), and that they would begin looking into these cases. It appears they have started their investigation, but there is nothing yet in terms of details of the cause or a possible solution.
Best regards,
Ray
I don't believe the cross-GPU copy issue to be a huge deal, as it is an SLI-only issue, whereas we have witnessed the problem on both SLI and non-SLI setups. I am using a High Bandwidth SLI bridge with PCIe 2.0 @ x16 on both cards, and the PCIe bus performance metrics show that it is nowhere near saturated during testing (it actually stays close to 10% usage, IIRC). If they can fix this, then great. The second problem, the game thread spending more time on the CPU in the stereo case, matches the hypothesis in the first post of this thread, however, which is intriguing. Let's hope something is found, and ultimately fixed...
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
All SLI bridges transfer data at 1 GB/s (or 2 GB/s for SLI HB); it doesn't matter whether the bridge is newer or older, only the pixel clock differs.
https://en.wikipedia.org/wiki/Scalable_Link_Interface
Lots of other data is transferred over the PCIe lanes. A "transfer" doesn't necessarily mean an "external" bus is used; it could happen "inside the GPU", as GPUs are designed to run in parallel. Just an idea ;) I don't know exactly what they are referring to, as their description is very vague and could mean many things, but it definitely sounds like a multi-threading lock issue (somewhere)...
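As a sanity check on those bandwidth figures, here is what a naive full-frame cross-GPU copy would cost at the resolution used in the GTA V test above (a sketch only; the driver may route transfers over PCIe, compress them, or not copy whole frames at all):

```python
# Cost of shipping one uncompressed eye image between GPUs at the quoted
# bridge bandwidths. Assumes a 2560x1600, 32-bit framebuffer; whether the
# driver actually performs such a copy every frame is an open question.
width, height, bytes_per_pixel = 2560, 1600, 4
frame_bytes = width * height * bytes_per_pixel  # 16,384,000 bytes (~16 MB)

for name, bytes_per_sec in [("SLI", 1e9), ("SLI HB", 2e9)]:
    ms = frame_bytes / bytes_per_sec * 1000.0
    print(f"{name}: {ms:.1f} ms per frame copied")

# At 120 Hz the entire frame budget is ~8.3 ms, so a 16.4 ms (SLI) or
# 8.2 ms (SLI HB) copy would dominate frame time -- which would be
# consistent with the "high cross GPU transfer time" finding above.
```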
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com
(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)
Not expecting anything good from them lately. It seems they dropped the project to make 3D Vision work with VR headsets, so I guess 3D Vision is dead to them; my feeling is that things are just going to get worse with time. My hope is in VR now, to push devs and games to support native S3D in SBS.
Is anyone running a single GTX 1070 who could give me some FPS results in a few games to compare with my system?
I want to be sure that my low FPS is related to my CPU before I go ahead and upgrade, and I'd like to know how big the gap is between my system and one with a better CPU. (I still need to try overclocking the memory and CPU, but it's just a pain...)
If anyone can post results for any games I would really appreciate it, as I can try to test the same games here.
At the moment I am playing Quantum Break and the FPS keeps dropping below 30 at 720p (I don't mind 30, but below that it is annoying to the point of considering 2D or SuperDepth3D).
BF1 had some drops below 30 too, but very few, and it was very playable.
Another game I have just started is Just Cause 3, and the FPS also drops under 30 there; it doesn't matter whether the game runs at 720p or 1080p. The thing that makes me angry is that in 2D the FPS is never below 60 and varies between roughly 70 and 120 in this game.
I am starting to think that FPS in 3D Vision depends on the game and on luck, whether it is a new game or an old one; it is a roulette for me, though new games obviously have a higher chance of bad performance.
I just finished the DLC of Rise of the Tomb Raider in side-by-side (not 3D Vision) and it was perfect: 60 FPS with some drops to 50, and my 2D performance is similar to benchmarks of systems with a much better processor. So if it weren't for 3D Vision I would never consider a CPU upgrade, as the gain in 2D or in games with native 3D is too little to justify it.
EVGA GTX 1070 FTW
Motherboard MSI Z370 SLI PLUS
Processor i5-8600K @ 4.2 | Cooler SilverStone AR02
Corsair Vengeance 8GB 3000Mhz | Windows 10 Pro
SSD 240gb Kingston UV400 | 2x HDs 1TB RAID0 | 2x HD 2TB RAID1
TV LG Cinema 3D 49lb6200 | ACER EDID override | Oculus Rift CV1
Steam: http://steamcommunity.com/id/J0hnnieW4lker
Screenshots: http://phereo.com/583b3a2f8884282d5d000007
CPU = massively serial; it is very difficult to spread work across more than one thread.
GPU = massively parallel, the opposite of a CPU in most regards.
If your GPU is running at <90% when you experience unacceptable FPS, then it's your CPU bottlenecking.
I can practically guarantee that this will be the case, as your CPU isn't great.
Perhaps you might want to wait for Zen/Kaby Lake release in January before choosing a CPU?
Worst case scenario, it'll bring all current gen CPU prices down.
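That rule of thumb can be written down as a tiny helper (just a sketch of the heuristic above; the readings would come from MSI Afterburner or similar, and the 90% threshold is the same rough cut-off):

```python
# Sketch of the rule of thumb above: unsatisfactory FPS with the GPU far
# from saturation points at the CPU (or a driver stall) as the limiter.
def likely_bottleneck(gpu_usage_pct, fps, target_fps, gpu_threshold=90.0):
    if fps >= target_fps:
        return "none (target met)"
    if gpu_usage_pct >= gpu_threshold:
        return "GPU"
    return "CPU or driver stall"

# Example using the GTA V figures from earlier in the thread:
print(likely_bottleneck(gpu_usage_pct=37, fps=38, target_fps=60))  # CPU or driver stall
print(likely_bottleneck(gpu_usage_pct=98, fps=45, target_fps=60))  # GPU
```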
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
Another possibility is to use the new VRMark to compare your current system to a possible future system. The reason VRMark is interesting is because it's a dual-view benchmark and quite a lot closer to a 3D Vision type benchmark. 3DMark and the other 2D stuff is not interesting. VR tends to be CPU bound.
You can run the free version, and compare to online results to get a feel for how much difference it might make.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
I asked a friend to test BF1 in 3D, as he also has a GTX 1070 and a better processor. The FPS for him was around 70 to 100, better than mine at around 30 to 45. I will take your advice and wait for the release of the new processors; perhaps I can get a better price on an i7 6700.
For now, I will try an overclock to see if I can get a few more FPS, so I can at least reach a minimum of 30 FPS in recent games. If overclocking doesn't make any difference, I will have to put the new titles at the bottom of my games playlist :(
EVGA GTX 1070 FTW
Motherboard MSI Z370 SLI PLUS
Processor i5-8600K @ 4.2 | Cooler SilverStone AR02
Corsair Vengeance 8GB 3000Mhz | Windows 10 Pro
SSD 240gb Kingston UV400 | 2x HDs 1TB RAID0 | 2x HD 2TB RAID1
TV LG Cinema 3D 49lb6200 | ACER EDID override | Oculus Rift CV1
Steam: http://steamcommunity.com/id/J0hnnieW4lker
Screenshots: http://phereo.com/583b3a2f8884282d5d000007
Hi Ray,
We hope that you guys are making good progress in the investigation! Is there any news on this front?
Happy Holidays!
Kind regards,
-- Shahzad
Cambridge
UK
Hi Shahzad,
Sorry for the delay in getting back to you, I've been out of the office for the Holidays. I checked the status on the bug and don't see any update since my last reply unfortunately. The bug is still assigned to a developer to investigate but there have been no update yet. I'm afraid I don't have much visibility into developer schedule or priorities so sorry I don't have more info.
Best regards,
Ray
It's worrying that we're talking about "priorities". I am sure Nvidia's devs have their hands full and 3D Vision isn't even on their radar any more. At least they have recognised it as a pretty major bug, and are actively trying to fix it. If this does get fixed, we can expect huge performance increases across all games, assuming of course that GPU saturation isn't the bottleneck in a given game. Let's keep our fingers crossed... :)
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.