3D Vision CPU Core Bottleneck
That's amazing, bo3b. I don't think we knew you were personally in contact with the 3D Vision team at nVidia.

I'd like to do the 3 core test with older drivers as you suggested, but unfortunately I am locked to recent drivers, since only the latest releases support the 9XX series.

It is promising that they are not aware of such a problem.

The way I see it, we as a community are up against 3 factors which haunt us:

1. Game compatibility
2. Hardware compatibility
3. Hardware scaling

We have discussed point 3.

Anything that they might be able to do towards remedying point 2, to lift arbitrary limitations imposed on the 3D Vision driver, would be a godsend!

Since they no longer wish to support or license hardware, could they, for example, be open to removing the limitations in the driver that restrict which hardware it is compatible with, so that it's like the old days when you could (attempt to) use it with anything? This would allow us to try 3D Vision with the latest and greatest projectors and monitors out there, without any of the resolution / refresh rate / Stereo3D Method limitations that even 3D Play imposes.

If not, can they maybe sell an "open version" of the driver? I'm sure there are many people out there who would be willing to pay a fair amount to be able to play on the hardware they want.

It's just a more open version of their 3D Play structure.

It's a win-win, given that they can't get any more money from either the manufacturers or the users for 3D Vision anyway. At this point, arbitrary software limitations are just a hindrance to us all. They have already accepted this in proposing the free-for-all VR support now on the horizon from both AMD and nVidia.

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#16
Posted 05/17/2015 12:27 AM   
Maybe Helix is still the one true god of 3D (of 3Dvision's Old Testament, at least), but you, bo3b, are definitely its prophet.

That's really encouraging news. It makes perfect sense that there would be a few engineers who would be interested; Nvidia is after all a huge and diverse company, and the actions of a juggernaut like that won't always correlate with the desires of individuals within it.

Those engineers are still being paid a salary though, so one assumes that the amount of time they can spend on 3Dvision might be limited, since their time is Nvidia's money. Which is why I suggested you bolster their efforts with outside help on a volunteer basis. Even if it's not easier to fix from inside, just being able to work together would surely be a plus.

But you guys are the ones with the tech know-how, and you're the one with the existing relationship, so you obviously know better than me about how to progress from here. Good luck, and thanks! :)

VolnaPC.com - Tips, tweaks, performance comparisons (PhysX card, SLI scaling, etc)

#17
Posted 05/17/2015 02:07 AM   
[quote="bo3b"][quote="terintamel"]Well if you have contact with the 3d vision devs can you ask them about making these fixes? Especially the Batman Arkham series issues with drivers after 337.88. This is what is keeping me from upgrading to later drivers. Both are Unreal engine games. [url]https://forums.geforce.com/default/topic/809465/poor-3dvision-performance-in-all-batman-arkham-games/#4453515[/url] [url]https://forums.geforce.com/default/topic/814169/tron-evolution-sub-30fps-in-3dmode/#4471672[/url][/quote] I'd be happy to. I want to be sure to have high quality bug reports though, so can you test this on 350.12 and let me know if you still see it? 350.12 is a different branch and seems to be better all around for 3D so far. The very first question they'll ask will be 'did you try the latest driver'. Also did you file a bug here? [url]http://www.nvidia.com/object/driverqualityassurance.html[/url] BTW, relative to the 3 core CPU problem, it would be awesome if someone could do the 3 core experiment on older drivers. If this is something introduced, that is really helpful info. GTA5 would be a good test case, because it runs on old drivers.[/quote] Just to make sure I understand the 3 core issue. You say with the latest drivers if I run a game in 3dvision it will only utilize 3 cores? I just want to understand as I can do a test with older drivers as I have a 660ti. I just tested Arkham Origins and on 337.88 (Last working driver for Batman in 3d for me) and I utilize all cores in this game. In 3d mode I most definitely see worse performance when forcing my CPU to use 3 cores vs 8 cores (FX 8350). Did I do the test right, or is this not a good game to test with? I plan on testing the latest drivers to see if Batman/Tron 3d vision issues still exist, and if the 3 Core Arkham Origins test is valid I will test it again on the latest drivers. Update: I also just tested Far Cry 3 and Metro LL Redux and they both use all CPU cores in 3D mode fine on 337.88. Performance is better in 3d mode when all cores are set as active vs on 3 cores. I will update again after testing with the latest driver unless I am told my testing method is flawed.
bo3b said:
terintamel said: Well if you have contact with the 3d vision devs can you ask them about making these fixes? Especially the Batman Arkham series issues with drivers after 337.88. This is what is keeping me from upgrading to later drivers. Both are Unreal engine games.

https://forums.geforce.com/default/topic/809465/poor-3dvision-performance-in-all-batman-arkham-games/#4453515
https://forums.geforce.com/default/topic/814169/tron-evolution-sub-30fps-in-3dmode/#4471672

I'd be happy to. I want to be sure to have high quality bug reports though, so can you test this on 350.12 and let me know if you still see it? 350.12 is a different branch and seems to be better all around for 3D so far. The very first question they'll ask will be 'did you try the latest driver'.

Also did you file a bug here? http://www.nvidia.com/object/driverqualityassurance.html


BTW, relative to the 3 core CPU problem, it would be awesome if someone could do the 3 core experiment on older drivers. If this is something introduced, that is really helpful info. GTA5 would be a good test case, because it runs on old drivers.


Just to make sure I understand the 3 core issue: you're saying that with the latest drivers, if I run a game in 3D Vision it will only utilize 3 cores? I just want to be clear, since I have a 660 Ti and can do a test with older drivers.

I just tested Arkham Origins on 337.88 (the last working driver for Batman in 3D for me), and it utilizes all cores. In 3D mode I most definitely see worse performance when forcing my CPU to use 3 cores vs 8 cores (FX 8350). Did I do the test right, or is this not a good game to test with?

I plan on testing the latest drivers to see if Batman/Tron 3d vision issues still exist, and if the 3 Core Arkham Origins test is valid I will test it again on the latest drivers.

Update:

I also just tested Far Cry 3 and Metro LL Redux, and they both use all CPU cores fine in 3D mode on 337.88. Performance is better in 3D mode when all cores are active vs on 3 cores. I will update again after testing with the latest driver, unless I am told my testing method is flawed.
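For anyone repeating the 3 core test, here is a minimal sketch of scripting the affinity change instead of clicking through Task Manager each time, assuming Python with the third-party psutil library is available. The process name below is just a placeholder, not necessarily the game's real exe name:

[code]
# Minimal sketch of the 3-core affinity test, using the third-party
# psutil library (pip install psutil). Run it after the game has started;
# may need an elevated prompt. "BatmanOrigins.exe" is a placeholder name.
import psutil

TARGET = "BatmanOrigins.exe"  # hypothetical example; substitute the real exe
CORES = [0, 1, 2]             # the three cores to restrict the game to

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.cpu_affinity(CORES)  # same effect as Task Manager's "Set affinity"
        print("Pinned PID %d to cores %s" % (proc.pid, CORES))
[/code]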

AMD FX-8350 4GHz
Gigabyte 990FXA-UD3 Rev 4.0
G-Skill PC3-10700- 16GB
Gigabyte Windforce GTX 1060 OC 6GB - 417.01
Creative Soundblaster Z
ViewSonic VX2268WM Black 22" 1680x1050 5ms 120Hz 3Dvision
Windows 10 x64 1709

#18
Posted 05/18/2015 03:59 AM   
[quote="terintamel"]In 3d mode I most definitely see worse performance when forcing my CPU to use 3 cores vs 8 cores (FX 8350).[/quote]does performance improve when you set affinity to core 0, 2, 4 rather than 0, 1, 2?
terintamel said: In 3d mode I most definitely see worse performance when forcing my CPU to use 3 cores vs 8 cores (FX 8350).
Does performance improve when you set affinity to cores 0, 2, 4 rather than 0, 1, 2?
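For anyone scripting this rather than using Task Manager, here is a sketch of the affinity bitmasks the two core sets correspond to. The core-to-module pairing is the commonly reported FX layout (cores 0/1, 2/3, 4/5, 6/7 share a module), not something confirmed in this thread:

[code]
# Affinity bitmasks for the two test cases (bit n = core n).
# On FX chips, cores 0/1, 2/3, 4/5 and 6/7 are reported to pair up into
# modules, so 0,2,4 gives each thread its own module while 0,1,2 makes
# two of them share one.
mask_adjacent = (1 << 0) | (1 << 1) | (1 << 2)  # cores 0,1,2 -> 0x07
mask_spread   = (1 << 0) | (1 << 2) | (1 << 4)  # cores 0,2,4 -> 0x15
print(hex(mask_adjacent), hex(mask_spread))     # 0x7 0x15
[/code]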

NVIDIA TITAN X (Pascal), Intel Core i7-6900K, Win 10 Pro,
ASUS ROG Rampage V Edition 10, G.Skill RipJaws V 4x 8GB DDR4-3200 CL14-14-14-34,
ASUS ROG Swift PG258Q, ASUS ROG Swift PG278Q, Acer Predator XB280HK, BenQ W710ST

#19
Posted 05/18/2015 07:32 AM   
Could it be limited to Intel chips?

I just did the test on Metro LL (not Redux) and Far Cry 4, and dropping to three cores doesn't affect fps or GPU usage in either of these games in 3D or 2D, so I can't see the 3 core limit.


Terintamel, you don't have GTA 5 to check, do you?

I'm going to try to do more testing later this week, in a more ordered way, with a spreadsheet and everything! I have GTX 980s though, so I can't check older drivers. I'd be very interested if anyone else with an AMD CPU could do the test too!

#20
Posted 05/18/2015 07:35 AM   
@terintamel: the key thing to look out for is how much of each core is being used. For example, even if 8 cores get some activity, but it's only about 15% per core, that could mean it's effectively single-core. I believe the CPU is able to spread what is essentially a single-core operation across multiple cores.
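To make that distinction visible, a quick sketch that logs per-core load once a second while the game runs (psutil again; the one-second interval is arbitrary):

[code]
# Print per-core CPU usage once per second, so "8 cores at ~15% each"
# is easy to tell apart from "3 cores pegged". Stop with Ctrl+C.
import psutil

while True:
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    total = sum(per_core)  # ~100 total means about one core's worth of work
    print(" ".join("%5.1f" % p for p in per_core), "| sum %6.1f" % total)
[/code]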

VolnaPC.com - Tips, tweaks, performance comparisons (PhysX card, SLI scaling, etc)

#21
Posted 05/18/2015 11:49 AM   
@T_L_T - I do not have GTA V as I am not a fan of the GTA series.

Did some more tests with Far Cry 3

Driver 337.88
AMD FX-8350
Windows 8.1 64bit

All cores - 3d
Core 0-7 CPU usage avg 40-50%
GPU 94%
41fps

Cores 0-2
CPU usage avg 70-90%
GPU 86%
37fps

Cores 0,2,4
Core 0 - 91%
Core 2 - 3%
Core 4 - 90%
28fps

Metro LL Redux Benchmark tool
All Cores - 3d
Core 0-7 CPU usage avg 40-50%
Min fps 15
Max fps 110
Avg fps 38

Cores 0-2
Core CPU usage avg 88%
Min fps 0.21 (3 major pauses for a few seconds where fps went to near 0)
Max fps 81
Avg fps 31

I will test with the latest drivers to see if it is a driver issue or a Driver+Intel CPU issue.

AMD FX-8350 4GHz
Gigabyte 990FXA-UD3 Rev 4.0
G-Skill PC3-10700- 16GB
Gigabyte Windforce GTX 1060 OC 6GB - 417.01
Creative Soundblaster Z
ViewSonic VX2268WM Black 22" 1680x1050 5ms 120Hz 3Dvision
Windows 10 x64 1709

#22
Posted 05/18/2015 11:08 PM   
Updated to 350.12 - CPU core usage in 3D mode still appears to use all cores. Far Cry 3 test results same as above on 337.88.
Metro LL benchmark would lock up on loading in 3D mode, but the game itself loads fine and uses all cores.

So it must be an Intel+Nvidia issue?

AMD FX-8350 4GHz
Gigabyte 990FXA-UD3 Rev 4.0
G-Skill PC3-10700- 16GB
Gigabyte Windforce GTX 1060 OC 6GB - 417.01
Creative Soundblaster Z
ViewSonic VX2268WM Black 22" 1680x1050 5ms 120Hz 3Dvision
Windows 10 x64 1709

#23
Posted 05/18/2015 11:42 PM   
[quote="bo3b"][quote="terintamel"]Well if you have contact with the 3d vision devs can you ask them about making these fixes? Especially the Batman Arkham series issues with drivers after 337.88. This is what is keeping me from upgrading to later drivers. Both are Unreal engine games. [url]https://forums.geforce.com/default/topic/809465/poor-3dvision-performance-in-all-batman-arkham-games/#4453515[/url] [url]https://forums.geforce.com/default/topic/814169/tron-evolution-sub-30fps-in-3dmode/#4471672[/url][/quote] I'd be happy to. I want to be sure to have high quality bug reports though, so can you test this on 350.12 and let me know if you still see it? 350.12 is a different branch and seems to be better all around for 3D so far. The very first question they'll ask will be 'did you try the latest driver'. Also did you file a bug here? [url]http://www.nvidia.com/object/driverqualityassurance.html[/url] BTW, relative to the 3 core CPU problem, it would be awesome if someone could do the 3 core experiment on older drivers. If this is something introduced, that is really helpful info. GTA5 would be a good test case, because it runs on old drivers.[/quote] Loaded 350.12 - Batman Arkahm Orgins issues returned. Stalling, pausing, hitching, and low GPU usage at times. Tron Evolution - Still the same. Sub 30fps in 3d mode. Not a CPU core bug as I get 60fps in 2d mode with only 3 cores enabled.
bo3b said:
terintamel said: Well if you have contact with the 3d vision devs can you ask them about making these fixes? Especially the Batman Arkham series issues with drivers after 337.88. This is what is keeping me from upgrading to later drivers. Both are Unreal engine games.

https://forums.geforce.com/default/topic/809465/poor-3dvision-performance-in-all-batman-arkham-games/#4453515
https://forums.geforce.com/default/topic/814169/tron-evolution-sub-30fps-in-3dmode/#4471672

I'd be happy to. I want to be sure to have high quality bug reports though, so can you test this on 350.12 and let me know if you still see it? 350.12 is a different branch and seems to be better all around for 3D so far. The very first question they'll ask will be 'did you try the latest driver'.

Also did you file a bug here? http://www.nvidia.com/object/driverqualityassurance.html


BTW, relative to the 3 core CPU problem, it would be awesome if someone could do the 3 core experiment on older drivers. If this is something introduced, that is really helpful info. GTA5 would be a good test case, because it runs on old drivers.


Loaded 350.12 -

Batman Arkham Origins issues returned: stalling, pausing, hitching, and low GPU usage at times.
Tron Evolution - still the same: sub-30fps in 3D mode. Not a CPU core bug, as I get 60fps in 2D mode with only 3 cores enabled.

AMD FX-8350 4GHz
Gigabyte 990FXA-UD3 Rev 4.0
G-Skill PC3-10700- 16GB
Gigabyte Windforce GTX 1060 OC 6GB - 417.01
Creative Soundblaster Z
ViewSonic VX2268WM Black 22" 1680x1050 5ms 120Hz 3Dvision
Windows 10 x64 1709

#24
Posted 05/18/2015 11:44 PM   
Or.. the drivers are fucked up as hell...
Found various bugs in 3D Vision and 2D/3D Surround with the 350/352 branch...

Nvidia, it seems, isn't what it used to be... really disappointed overall...

Edit:
@terintamel: I get the exact same results in Batman AO as in Witcher 3 with 350.12 and 352.86...yuck...

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

#25
Posted 05/19/2015 12:43 AM   
Is there a guide on what drivers work best for which games/GPUs in 3D Vision? For example, with a 660 Ti should I upgrade drivers or stay on 337.88?

AMD FX-8350 4GHz
Gigabyte 990FXA-UD3 Rev 4.0
G-Skill PC3-10700- 16GB
Gigabyte Windforce GTX 1060 OC 6GB - 417.01
Creative Soundblaster Z
ViewSonic VX2268WM Black 22" 1680x1050 5ms 120Hz 3Dvision
Windows 10 x64 1709

#26
Posted 05/20/2015 10:51 PM   
I'd be a little more tolerant of the situation, but a lot of this seems to have coincided with the introduction of GeForce Experience. I'm not sure I know anyone who has that installed on their computer anymore. It's just a bunch of buggy bloatware in many people's opinion (mine included). So it's pretty disappointing to see that it has diverted their attention away from the meat and potatoes they used to excel at.

#27
Posted 05/20/2015 11:23 PM   
[quote="terintamel"]Is there a guide on what drivers work best for what games/GPUs in 3dvison? For example with a 660ti should I upgrade drivers or stay on 337.88?[/quote] Not exactly a full guide, but it's something: https://forums.geforce.com/default/topic/777954/suggested-driver-to-use-/
terintamel said: Is there a guide on what drivers work best for which games/GPUs in 3D Vision? For example, with a 660 Ti should I upgrade drivers or stay on 337.88?

Not exactly a full guide, but it's something:

https://forums.geforce.com/default/topic/777954/suggested-driver-to-use-/

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#28
Posted 05/21/2015 08:38 AM   
Dying Light as an example: I get a steady 100+ fps at 1440p, and I mean closer to 120.

Now once I enable 3D Vision, regardless of whether I'm at 1440p or 1080p, I get dips to around 30 fps.

This means in some areas (a large chunk of the city on screen), at 1440p for example, I get roughly a 75% performance drop comparing 3D Vision against non-3D in the same spot. I would expect the normal 50% drop, but not 75%. This is the same at 1440p and 1080p. Without 3D Vision the performance is much, MUCH better.

Heck, even at 4K I never dip below 60 fps (far from it), so why does this dip so low with 3D Vision?

Almost suggests an issue with core utilization, no?
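Spelling out the arithmetic already in this post (nothing new, just the numbers):

[code]
# The drop described above: ~120 fps in 2D vs ~30 fps with 3D Vision on.
fps_2d, fps_3d = 120.0, 30.0
drop = 1 - fps_3d / fps_2d  # 0.75, i.e. a 75% drop
expected_3d = fps_2d * 0.5  # the "normal" 50% stereo cost would be ~60 fps
print("drop = %.0f%%, expected ~%.0f fps at a 50%% cost" % (drop * 100, expected_3d))
[/code]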

ASUS VG278H - 3D Vision 2 - Driver 358.87 - Titan X SLI@1519Mhz - i7-4930K@4.65GHz - 16GB RAM - Win7x64 - Samsung SSD 850 PRO (256GB) and Samsung EVO 850 (1TB) - Full EK Custom Waterloop - Project Milkyway Galaxy (3D Mark Firestrike Hall of Famer)

G-Pat on Helixmod

#29
Posted 06/06/2015 11:22 AM   
[quote=""]Dying light as an example. I get a steady 100+ fps in 1440p and i mean closer to 120. Now once I enable 3d vision, regardless if im at 1440p or at 1080p i get dips to around 30 fps. This means in some areas (large chunk of city on screen), at 1440p for example, i get roughly a 75% performance drop comparing 3d vision against non-3d in the same spot. I would expect the normal 50% drop but not 75%. This is the same for 1440p and 1080p. Without 3d vision the performance is much MUCH better. Heck, even at 4k i never dip below 60 fps, far from it, so why does this dip so low with 3d vision? Almost suggests an issue with core utilization, no?[/quote] You got two things going on in case of 1440p - obvious is that higher res drops fps a lot but the most annoying is that SLI is not working at all at this resolution or better explained on ROG Swift. Since this monitor have issues with SLI support you won`t be getting it at 1080p as well. Roughly that will come down to 70% less then average gameplay.
said: Dying Light as an example: I get a steady 100+ fps at 1440p, and I mean closer to 120.

Now once I enable 3D Vision, regardless of whether I'm at 1440p or 1080p, I get dips to around 30 fps.

This means in some areas (a large chunk of the city on screen), at 1440p for example, I get roughly a 75% performance drop comparing 3D Vision against non-3D in the same spot. I would expect the normal 50% drop, but not 75%. This is the same at 1440p and 1080p. Without 3D Vision the performance is much, MUCH better.

Heck, even at 4K I never dip below 60 fps (far from it), so why does this dip so low with 3D Vision?

Almost suggests an issue with core utilization, no?

You've got two things going on in the case of 1440p. The obvious one is that the higher resolution drops fps a lot, but the most annoying one is that SLI is not working at all at this resolution, or, better put, on the ROG Swift. Since this monitor has issues with SLI support, you won't be getting it at 1080p either. Roughly, that will come down to 70% less than average gameplay.