3D Vision severe FPS drop issues
RAGEdemon said: When 3D is enabled and toggled on, the GPU usage stays the same (40-60%), but the FPS drops to about 50-65%. This is strange, because GPU usage should increase at least a little. I have in fact seen it decrease.

This leads me to conjecture that enabling 3D somehow puts extra load onto the main game thread, thereby CPU-bottlenecking the entire game. It might explain the low GPU usage as well as the downward GPU FPS spikes...

I'll continue scratching my head and wait for more input :)

As far as I can tell, this is definitely the case for AC4. It's severely CPU bound, and anything that perturbs the CPU will cause frame rate drops.

That's what happened with the prior bug. It was using 3.7% of the CPU, which would not seem to be that significant a hit, but it translated directly into a drop from 60 to 40, as the CPU was unable to keep the pipeline full.
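
A quick back-of-envelope sketch of that effect (illustrative numbers only, assuming a quad core with no Hyper-Threading and a main thread that is already saturated):

[code]
# Illustrative only: how a small share of *total* CPU becomes a big hit
# when it lands on an already-saturated main game thread.

cores = 4                                        # assumed quad core, no HT
base_fps = 60.0
frame_budget_ms = 1000.0 / base_fps              # ~16.7 ms per frame at 60 FPS

extra_total_cpu = 0.037                          # "3.7% of the CPU" overall...
extra_on_main_thread = extra_total_cpu * cores   # ...is ~15% of one core

# If the main thread was already using its whole budget, every frame
# now takes that much longer and the frame rate falls accordingly:
new_frame_ms = frame_budget_ms * (1.0 + extra_on_main_thread)
print(round(1000.0 / new_frame_ms, 1))           # ~52 FPS on paper; vsync and
                                                 # frame pacing can push it lower
[/code]

The exact numbers depend on the CPU, but the point stands: a few percent of total CPU is a much bigger slice of the one core the game actually lives on.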

Multi-core CPUs are still basically useless for gaming, and AC4 only uses two threads as far as I can tell.

The other part of the equation is that enabling 3D also takes extra CPU time on the main game thread, which will also slow down the frame rate.

So, check for anything and everything that might impact the primary game thread.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#16
Posted 01/16/2015 04:14 AM   
RageDemon,
Just to chime in briefly.
I've experienced exactly what you do in AC4, both with my GTX580 SLI, and with my new GTX980 SLI.
Both with the old fix version, and with the new optimised one.

This made me abandon AC4 as soon as the story was finished, despite it being one of the most impressive games (with the fix - big credit to Mike and Co).

I've suffered the same symptoms with FarCry3 when I replayed it recently with the 980s, using Helix's fix.
Constant dips when driving, or when reaching certain zones (as Pirate has described it, as if reaching the borders of a "game zone" that needs to be loaded).
Not to mention the stutter-fest that was Watch Dogs - but that was even in 2D.

I felt pretty disgusted, so I took to the net to get some info. The common element was Ubisoft. And the common theme in all the comments on the net is that Ubisoft game engines have a big issue with this zone loading/streaming technique. Something they do ends up overloading the CPU thread, like you put it.

You seem to say that you have this issue even with COD:AW, so I don't know.
And users such as Pirate claim that upgrading their CPU improved the issue. But I'm simply not willing to upgrade an i7-2600K just so that I can play Ubisoft games without stutter (not that it would be guaranteed).

I think, from all the input and tests done by Bo3b, Pirate, Helifax and others, there is definitely a resource toll imposed on the CPU in 3D Vision, so decent cards alone are no longer enough for 3D gaming when the games are badly optimised. Not to mention that the recent driver updates seem to make everything worse performance-wise, especially in SLI. I think the new crop of 9xx cards has yet to reach its potential, if drivers mature and Nvidia get their act together.

#17
Posted 01/16/2015 09:12 AM   
I'm running an i5-4690K, non-OC (for now). I still get occasional dips, but most of the time things are pretty smooth.

#18
Posted 01/16/2015 01:31 PM   
Thanks for the input Zappologist.

The interesting thing is that, clock for clock, there is no real difference in gaming performance between the oldest-gen i7 and the latest-gen i7. Granted, their pipelines are more efficient, but this hasn't really translated into tangible gains in games where the CPU is the bottleneck.

In theory, since only 2 cores are really being used, Pirateguybrush's i5-4690K (Boost to 3.9GHz) should be on par with my Xeon at 4.3GHz. I have tried disabling Hyper-Threading and limiting the affinity of AC4BF.exe to only 4 cores to simulate Pirateguybrush's setup. Unfortunately, none of these tests revealed any differences.
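
For anyone who wants to repeat the affinity part of that test without clicking through Task Manager, here is a minimal sketch (assuming Python with the psutil package installed, and that the game process is named AC4BF.exe):

[code]
# Minimal sketch: pin AC4BF.exe to the first four logical cores with psutil.
# Assumes psutil is installed (pip install psutil) and the game is running.
import psutil

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == "ac4bf.exe":
        proc.cpu_affinity([0, 1, 2, 3])       # restrict to logical cores 0-3
        print(proc.pid, proc.cpu_affinity())  # print the affinity that stuck
[/code]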

Perhaps it is something intrinsic to the architecture.

A quote from bo3b seems eerily true:
"It was using 3.7% of the CPU which would not seem to be that significant a hit, but it translated directly to drop from 60 to 40, as the CPU was unable to keep the pipeline full."

It's interesting that AC4's minimum specs list a quad core as the bare minimum, yet the game only ever uses 1.5 cores at most.

Fascinating, Captain... /:-|

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#19
Posted 01/16/2015 11:47 PM   
Not that I have any idea how to use it, but Nvidia has an SLI Profile tool that they released when they changed how profiles were stored. I think it works differently than Nvidia Inspector.


"Instead of editing NvApps.xml, we have created a simple tool that enables SLI customers to export their SLI profiles to a text file, edit them, and then import them back into the driver."


http://nvidia.custhelp.com/app/answers/detail/a_id/2625

#20
Posted 01/17/2015 05:53 AM   
RAGEdemon, I'm running SLI 970's too. I don't think I'm having the same issues but I don't have either of those 2 games. Check me out on Steam and see if we have some of the same games and I'll do some testing with you.
I'm "Naga64" - friend me :-)

GTX 1070 SLI, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64 - https://www.3dmark.com/fs/9529310

#21
Posted 01/17/2015 03:12 PM   
Friend invite sent :)

Upon further testing, although FPS in other games is comparatively quite low when 3D is engaged, the downward spikes only happen in AC4:BF... hmm...

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#22
Posted 01/18/2015 04:46 PM   
Yes. I'm seeing the same issue in many games. With the 3D Vision option turned on but 3D not enabled in the game, the GPU usage is almost halved.
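
If anyone wants hard numbers rather than eyeballing an overlay, a small logging sketch (assuming the nvidia-ml-py Python bindings, import name pynvml, and GPU index 0) can record utilisation for a 2D run and a 3D-enabled run to compare:

[code]
# Log GPU utilisation once a second so a 2D run and a 3D-enabled run
# can be compared side by side. Assumes pynvml (nvidia-ml-py) and GPU 0.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        print(f"GPU {util.gpu}%  memory controller {util.memory}%")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
[/code]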

#23
Posted 01/19/2015 06:58 AM   
In other games I am getting something similar to this... It seems it may not be 3D related in some games. I hope it clarifies the issue...

https://www.youtube.com/watch?v=XWffQrveWao

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#24
Posted 01/19/2015 04:51 PM   