GTA V - Problems & Solutions List (Please keep GTA discussion here)
@Laithan: Those images are too compressed, so it's not possible to verify your results. I don't accept that there is any cap on GPU utilization.
How are you deciding that your CPU is being used? Most monitoring software is stupid and shows a % of the maximum available, without taking into account cores. It might say CPU usage of 40%, but that doesn't mean anything relative to cores. If you have 12 cores/threads and it only actively uses 3 cores, it will look like 25% max usage. Plenty of headroom left, it would appear, but the problem is that 3 cores are running as fast as they can, the others are idle.
What really matters is cores. If you use CPU affinity, you can do the experiment that RageDemon did, where you can see that it stops scaling properly once you get to 4 cores or above.
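If anyone wants to repeat RageDemon's affinity experiment without clicking through Task Manager every run, here is a rough Python sketch using psutil. The process name and the one-minute dwell per step are just placeholders I picked, so adjust them for your own install and test routine.
[code]
# Rough sketch: pin a running game process to N cores, stepping N up one core
# at a time, so you can note the FPS at each step (requires: pip install psutil).
# GAME_EXE is an assumption - change it to whatever the process is called on your rig.
import time
import psutil

GAME_EXE = "GTA5.exe"  # placeholder process name

def find_game():
    for proc in psutil.process_iter(["name"]):
        name = proc.info["name"]
        if name and name.lower() == GAME_EXE.lower():
            return proc
    raise RuntimeError(GAME_EXE + " is not running")

def pin_to_cores(proc, n_cores):
    # Restrict the process to the first n_cores logical CPUs.
    proc.cpu_affinity(list(range(n_cores)))
    print("Affinity set to cores 0.." + str(n_cores - 1))

if __name__ == "__main__":
    game = find_game()
    for n in range(1, psutil.cpu_count(logical=True) + 1):
        pin_to_cores(game, n)
        time.sleep(60)  # play for a minute and note your FPS before the next step
[/code]
If the 3-core limit in 3D is real, the FPS you note at each step should stop improving around 3-4 cores even though more are available.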
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
So sounds like I am the only one that's had the fix completely break in terms of shadows after the most recent patches and driver updates. =/
i7-4790K CPU 4.8Ghz stable overclock.
16 GB RAM Corsair
ASUS Turbo 2080TI
Samsung SSD 840Pro
ASUS Z97-WS3D
Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)
Obutto R3volution.
Windows 10 pro 64x (Windows 7 Dual boot)
[quote=""]So sounds like I am the only one that's had the fix completely break in terms of shadows after the most recent patches and driver updates. =/[/quote]
I will give you the same recommendation I got:
Post your hardware setup in your signature and make sure to include which driver you are using. This will make it easier in general for people to help you. If you are too lazy to do it, don't expect too much help :P
That said, my shadows were broken too with the latest driver; I rolled back to 350.12. So yes, I had the exact same issue as you, and the solution was to uninstall the driver and roll back to an older one. That will fix your shadows issue.
ASUS VG278H - 3D Vision 2 - Driver 358.87 - Titan X SLI@1519Mhz - i7-4930K@4.65GHz - 16GB RAM - Win7x64 - Samsung SSD 850 PRO (256GB) and Samsung EVO 850 (1TB) - Full EK Custom Waterloop - Project Milkyway Galaxy (3D Mark Firestrike Hall of Famer)
[quote=""]I respect your opinion but I've shown it for all to see.
I have only 4 cores. [/quote]
Well, you've come to the wrong conclusion.
Your second test clearly demonstrates that it is CPU bound.
When the image becomes more complicated, if you are CPU bound, then you cannot increase either CPU usage or GPU usage, because the CPU is already running as fast as it can. That's why you don't see a bump in either. Thus, that extra complexity translates directly to lower frame rates.
You can easily prove this by overclocking your CPU to see it open up more headroom on your GPU for this test case. If you are overclocked already, go back to stock and note the drop in GPU utilization.
Your images can't be viewed by us; the overlay is too small and too fuzzy to read. It's a lot to ask us to just take your word for it.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
[quote=""]@Laithan: Those images are too compressed, so it's not possible to verify your results. I don't accept that there is any cap on GPU utilization.
How are you deciding that your CPU is being used? Most monitoring software is stupid and shows a % of the maximum available, without taking into account cores. It might say CPU usage of 40%, but that doesn't mean anything relative to cores. If you have 12 cores/threads and it only actively uses 3 cores, it will look like 25% max usage. Plenty of headroom left, it would appear, but the problem is that 3 cores are running as fast as they can, the others are idle.
What really matters is cores. If you use the Affinity, you can do the experiment that RageDemon did, where you can see that it stops scaling properly after you get to 4 cores or above.[/quote]
Hey bo3b, PX allows you to monitor utilisation for each CPU core (or thread) in the OSD (so does AB) - are you saying this is not correct?
Anyway, am I to conclude here that basically our CPUs are too slow(?), even though we have some of the fastest CPUs out there? You mentioned earlier that GTAV/Dying Light will use as many cores as you throw at it, and agreed, in non-3D these games run butter smooth with high fps at any resolution - but once we move into 3D, resolution becomes irrelevant. There will be horrible fps dips in CPU-intense areas and bad GPU scaling/utilisation at the same time.
I really don't have much headroom left to overclock my CPU any more for 24/7 use, or I am going to fry my poor CPU and will be forced to move to a 5960X, facing the exact same problem again and abusing my wallet while at it.
ASUS VG278H - 3D Vision 2 - Driver 358.87 - Titan X SLI@1519Mhz - i7-4930K@4.65GHz - 16GB RAM - Win7x64 - Samsung SSD 850 PRO (256GB) and Samsung EVO 850 (1TB) - Full EK Custom Waterloop - Project Milkyway Galaxy (3D Mark Firestrike Hall of Famer)
[quote=""][quote=""]So sounds like I am the only one that's had the fix completely break in terms of shadows after the most recent patches and driver updates. =/[/quote]
I will give you the same recommendation I got:
Post your hardware setup in your signature and make sure to include which driver you are using. This will make it easier in general for people to help you. If you are too lazy to do it, don't expect too much help :P
That said, my shadows were broken too with the latest driver; I rolled back to 350.12. So yes, I had the exact same issue as you, and the solution was to uninstall the driver and roll back to an older one. That will fix your shadows issue.[/quote]
For some reason I can never get my signature to work here.
That said, I wasn't really asking for help, and I'm not sure posting my specs would achieve much. It's obviously driver related. I was just curious if anyone else had the same issue.
So on that note, thank you. I guess I'll put this game back in the queue and try again later, rolling back the driver if necessary.
i7-4790K CPU 4.8Ghz stable overclock.
16 GB RAM Corsair
ASUS Turbo 2080TI
Samsung SSD 840Pro
ASUS Z97-WS3D
Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)
Obutto R3volution.
Windows 10 pro 64x (Windows 7 Dual boot)
Haven't heard or seen anyone mention this obscure little feature that made its debut in the latest R* update (.372).
It resides under the Advanced Graphics settings, which may help explain why it gets overlooked: either you have tried those options and maybe left one or two enabled, or you may be put off by the warning that accompanies this submenu. Either way, I would gamble it isn't entered often, and is utilized even less.
To the Point:
'Frame Scaling Mode'
A quick shuffle through the different sub-settings, along with its name, led me to the conclusion that it is an attempt at built-in 'down-sampling', somewhat like NV's own 'DSR' settings.
So, I finally had the stability (and the cojones) to play with this for a bit and eventually maxed it out at the x2.5 setting. Thinking I was in for a slideshow at best and a crash at worst, I pressed on and, to my surprise, it came back and asked me if I wanted to save my settings. I obliged, and you know what happened?
I had near PERFECT scaling on my SLI'd 970's, I mean high 80's to mid 90's consistently! (And yes, with 3D enabled.) The best part: it's more than playable, and it feels and LOOKS great.
Not sure if I have stumbled upon anything really helpful to you all, but it seems promising, check it out and report back.
I have a pic I am attaching of the settings
~Nutz
EDIT - Clearly this lends, at least, more evidence to the 'CPU' side of things being the prime issue (regarding performance) as it relates to 3D, SLI, and the combo of using them together.
EDIT # 2 - Looking at the pic again reminded me of something else I wanted to draw attention to. Since the last update, the memory reporting/tallying seems to be hooking the SYSTEM memory instead of the GPU's own video memory.
Prior to this update, the memory counter would accurately reflect the GPU subsystem memory only.
Once in a while, after a fresh install (GTAV) or significant settings changes, it would slip and report BOTH GPUs' memory 'stacked', so in my case NORMAL would be 4GB, but under those conditions it would report 8GB. You get used to it, but this latest update baffles me, as it reports 12GB, and in my rig the only memory that adds up to 12 is the system's.
One other thing, maybe related: the game NEEDS a paging file. I hate them personally, but sometimes it won't even finish loading if the page file is disabled, even with 12GB+ of system memory. Just try to put it on a separate drive from the game and your OS if possible.
Wow, this game is something else; the time dedicated to this is indeed of epic proportions. Imagine the games that will be releasing in the years to come: more features, bigger sizes, better graphics, Win 10, the Rift - all add up to more things that can go wrong. Well, I'm in it for the long haul; my wallet, on the other hand, has been boycotting my pants something fierce since those 970's burned a hole in it!
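On the memory-tally confusion above: if you want to double-check which pool the in-game counter is reporting, a quick sketch like this one (assuming psutil is installed and nvidia-smi is on your PATH) prints total system RAM next to each GPU's VRAM so you can compare the numbers against what the settings screen claims.
[code]
# Sanity check: print total system RAM next to each GPU's total VRAM,
# to compare against the in-game memory tally.
# Assumes nvidia-smi is on PATH and psutil is installed (pip install psutil).
import subprocess
import psutil

def main():
    ram_gb = psutil.virtual_memory().total / (1024 ** 3)
    print("System RAM : %.1f GB" % ram_gb)

    gpus = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    for i, line in enumerate(gpus.strip().splitlines()):
        print("GPU %d      : %s" % (i, line.strip()))

if __name__ == "__main__":
    main()
[/code]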
[quote=""][quote=""]I respect your opinion but I've shown it for all to see.
I have only 4 cores. [/quote]
Well, you've come to the wrong conclusion.
Your second test clearly demonstrates that it is CPU bound.
When the image becomes more complicated, if you are CPU bound, then you cannot increase either CPU usage, nor GPU usage, because it's already running as fast as it can. That's why you don't see a bump in either. Thus, that extra complexity translates directly to lowered frame rates.
You can easily prove this by overclocking your CPU to see it open up more headroom on your GPU for this test case. If you are overclocked already, go back to stock and note the drop in GPU utilization.
Your images can't be viewed by us, the overlay is too small and too fuzzy to read. It's asking a lot to ask us to just take your word for it.[/quote]
Ok, this is my last post here; your personal charge against me isn't warranted. I am here to contribute and you are here to criticize. I also don't appreciate you questioning my integrity. I never did that to you. [i]If you can't handle a difference of opinion like an adult[/i], perhaps you should go back to an Etch A Sketch.
So guess what, I'll bite one last time and [b][i]dumb things down for you[/i][/b], as there is clear evidence it is needed. I do apologize to the forum guests who have their time WASTED by this inadequate comprehension and/or overactive Bo3b ego.
Dumbed down [color="green"]just for you[/color]
Bo3b:
(1) Those are called stereo screenshots. They are taken through ALT-F1, built right into the NVIDIA driver.
(2) The screenshots are ORIGINAL and NOT MODIFIED. They CANNOT get any more original.
(3) There is a program called NVIDIA 3D Vision Photo viewer. Maybe you should try using the correct viewer?
(4) You are being lazy. If you don't want to view/analyze my evidence, then don't make a comment about it.
(5) Your "Theory" about multiple cores blah blah blah, you're lecturing someone that knows all that crap and more, and that IT DOESN'T APPLY TO ME, or this siutation AT ALL. Useless jaw flapping.. I have 4 cores, no hyperthreading. I don't need your lectures and attempts to show everyone how smart you are.. Save it for someone else...
(6) Oh and I bet you didn't bother to test this yourself.. You'll see it clearly.
(7) I also stated that I can get 60fps @ 4k, full SLI scaling, full gpu utilization, NO 3D, WITH EVERYTHING MAXXED OUT. Did you selectively not read that part? [b][color="orange"]If I was CPU bound, why am I able to peg 60fps @ 4k?[/color][/b]
(8) [b][u]DON'T speak for others here in this thread[/u][/b]. Speak for YOURSELF. [b]How the hell did [color="green"]YOU[/color] become [color="orange"]WE[/color]?!?[/b]
In case you missed it, re-read #7 above. [b][size="L"]CASE CLOSED[/size][/b]
Wow. Didn't see that coming. I've read Bo3b's post three times to find out what it is all about. I didn't find anything other than information, communication, suggestions. Where is the supposed attack on Laithan?
[quote=""]baseless rant directed at Bo3b[/quote]
Please check your tone, and understand who you are addressing. This is a community where we help each other, not talk down to each other. And right at or near the top of the list of people who help this community is the person you decided to talk down to.
And you may have your opinion, but no one will care about it if that is how you choose to conduct yourself on our forums. And, of course, any game which exhibits the behavior you present in your evidence is a game that is CPU-limited. As Bo3b suggested, please either remove your overclock, or add one, and you'll see for yourself.
3D Vision Surround | Driver 359.00 | Windows 7
GTX 980 SLI | i7 3770K @ 4.2 GHz | 16 GB RAM
3x ASUS VG248QE w/ G-SYNC
@Laithan:
Hmm.. you know... I think you have some real problems there my friend...
Also, all your 7/8 statements are false :)) and clearly show NO understanding of how software works...
But the most hilarious ones are these two:
"
(1) Those are called stereo screenshots. They are taken through ALT-F1, built right into the NVIDIA driver.
(2) The screenshots are ORIGINAL and NOT MODIFIED. They CANNOT get any more original.
"
For 1: Big thank you for "enlightening" us mortal fools, as if we had no idea about it...
For 2: Did you know that a JPS is actually a SIDE-BY-SIDE JPG FILE? And that JPEG files always USE COMPRESSION? (It is not a lossless format like BMP, for example.) There is also a registry setting that determines the compression ratio. -_- (Do your homework.)
If you do... then it shows you are a "snob". If you don't, then you are just a retarded and misinformed kid.
SO GO BACK TO SQUARE ZERO AND WATCH YOUR TONE :)
(See we can use the same language as you.. you little @~!"~) :)
As for your tiny knowledge...You can see the CPU bottleneck plain and clear...
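For what it's worth, since a JPS is just a side-by-side JPEG, anyone who wants a closer look at the overlay in one of those stereo screenshots can split the file into its two halves with a few lines of Python (Pillow). The filename below is only a placeholder, and I'm saving the halves as PNG purely so no further JPEG compression gets added on top.
[code]
# Sketch: split a .jps stereo screenshot (side-by-side JPEG) into its two
# halves for closer inspection (requires: pip install Pillow).
from PIL import Image

def split_jps(path):
    img = Image.open(path)            # a .jps is ordinary JPEG data, so PIL opens it
    w, h = img.size
    half_a = img.crop((0, 0, w // 2, h))
    half_b = img.crop((w // 2, 0, w, h))
    half_a.save("half_a.png")         # PNG so no extra lossy compression is added
    half_b.save("half_b.png")
    print("Split %s: %dx%d -> two %dx%d halves" % (path, w, h, w // 2, h))

if __name__ == "__main__":
    split_jps("screenshot.jps")       # placeholder filename
[/code]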
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
@Laithan: yes, sorry you took offense. I'm not on some ego trip; I just care about facts and data, and you are not supplying proper data for your test. I genuinely expect people, including myself, to be open to new ideas and possibilities - but only with proof. You-said, I-said I couldn't care less about. Prove me wrong with data and I will accept it. I try very hard to back up my points with data, and I expect the same of everyone here.
I wasn't being clear with my criticism of your pictures. You put them up on TinyPic.com, which is not a good choice. It shreds the image to where it is unviewable. I did in fact try to look at your images natively, and the interesting part is the overlay, right?
Here's what your images look like at 100%:
[img]http://sg.bo3b.net/gtav/laithan%20fuzz.JPG[/img]
I defy anyone to make out anything meaningful from these shredded images. The best choice for images of this form is to just attach them to your post using the forum paperclip icon (top right AFTER you've made a post).
Please step back and listen to my argument for why I think your conclusion is wrong. You can of course believe whatever you wish, but your facts actually support a 3 core CPU scaling problem [i]IN 3D[/i].
Not in 2D, 2D is not of interest to us here. There is perfect SLI scaling in 2D, and that is the crux of the problem, it changes in 3D to using only 3 cores (that we can tell).
I use the "we", because this is not about me, it's about what other people have found and verified as well, including the key discovery from RageDemon. Using "I" for something I did not find would be pretty arrogant.
Please keep in mind that when I criticize your test or your data, that is not a personal attack. I think that you have made a misleading post and come to the wrong conclusion, and I like to make sure that other readers don't get misled for things I think are wrong.
You are specifically calling into question our conclusion that there is a 3 core limit in 3D, and your data does not back that up. If you can prove me wrong, I will revise my posts.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
[quote=""]hey bo3b, PX allows you to monitor utilisation for each cpu core (or thread) in the OSD (so does AB) - are you saying this is not correct?
Anyway, so am I concluding here that basically our cpu's are too slow(?), barring we have some of the fastest CPU's out there? You mentioned earlier that GTAV/Dying Light will use as many cores as you throw at it, and agreed in non 3D these games run butter smooth with high fps at any resolution - but once we move into 3d, resolution becomes irrelevant. There will be horrible fps dips in cpu intense area's and bad gpu scaling/utilisation at the same time.
I really dont have much headroom left to overclock my cpu any more for 24/7 use or I am going to fry my poor cpu and will be forced to move to 5960x facing the exact same problem, again, and abusing my wallet while at it.[/quote]
Yes, that's the crux of it, even the best CPU you can buy today isn't all that different from what you could get 3 years ago. The time of big performance wins is over, and they are trying to cheat it now by using multiple cores. This makes the developer job a lot harder, and leads to weird problems like this no-scaling when in 3D. Probably something in the 3D driver is preventing it from using all cores available.
I'm not familiar with PX, but the problem with the overlays, and the Windows Task Manager in particular, is that they show averages across all cores without giving you the ability to see what a particular process is using. And the OS will bounce the active threads around all the cores, so it LOOKS like it's using all cores, but that is misleading.
When I run GTA5 on my 4 core system with no hyperthreading, it shows as using 80% of maximum CPU. Seems to imply there is more headroom, but it's actually 3 cores running flat out, one core idling for background junk. So only 75% is real, 3 cores. If you have 12 cores/threads, it will show up as 25% of maximum, with the same problem.
The only way I presently know to tell is to set the affinity and see the results. If you set affinity to only 3 cores, you'll not see any difference in performance. This was what RageDemon discovered.
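If you want to see how much that single averaged number hides, here is a minimal Python/psutil sketch that prints the overall percentage next to the per-core percentages. On a 4-core chip with three cores pegged you should see roughly 75-80% overall while three individual cores sit near 100%.
[code]
# Minimal sketch: compare the averaged CPU% (what most overlays report)
# against per-core utilisation, sampled once per second (pip install psutil).
import psutil

try:
    while True:
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        overall = sum(per_core) / len(per_core)
        cores = "  ".join("c%d:%5.1f%%" % (i, p) for i, p in enumerate(per_core))
        print("overall %5.1f%%  |  %s" % (overall, cores))
except KeyboardInterrupt:
    pass
[/code]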
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
Hey guys. Finally decided to make an account so I can post here. I've been using 3D Vision since 2013, and thankfully Helix fixes have always worked for every game I've played so far. I find it very ironic that this game, which natively supports 3D Vision and is actually advertised as 3D Vision Ready with a rating of "excellent", is the first game where I can't get 3D Vision to run correctly.
Not sure I can bring anything new to the table, but so far what I've noticed is that a fresh install of drivers without touching anything else gives the best results, so long as you do not touch any in-game settings or tweak anything in the game profile itself. The game will work for about 30 minutes to an hour and then crash, with the event log showing nvwgf2umx.dll crashed with a 0xC0000005. Once it crashes, all shaders and shadows are broken. No number of restarts or tweaks will fix it: enabling/disabling 3D, SLI or no SLI, etc. The only thing that seems to fix the issue with the shadows and shaders is a fresh install of the NVIDIA drivers. After a fresh install everything works FLAWLESSLY with max settings (except the advanced settings, which I leave all off for performance reasons). The only issue is the crash after a few minutes of playing.
I have never experienced any significant performance drops or the "3D disable" issue. I tried reading back through this thread but was wondering if I missed a fix for nvwgf2umx.dll crashing. I have yet to try the MP3 profile and uninstalling NV Experience; will do this next.
Thanks for taking the time to read this long post of mine. Rig is in the sig.
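In case it helps anyone compare notes on the nvwgf2umx.dll crash, here is a rough sketch (Windows only, assumes wevtutil is on PATH) that pulls the most recent Application-log errors and prints the ones mentioning that DLL, so you can check whether you're hitting the same 0xC0000005 fault. The blank-line split of the text output is a crude heuristic, not an official parser.
[code]
# Sketch: list recent Application-log error events that mention nvwgf2umx.dll.
# Windows only; assumes the built-in wevtutil tool is on PATH.
import subprocess

def main():
    result = subprocess.run(
        ["wevtutil", "qe", "Application",
         "/q:*[System[(Level=2)]]",   # Level=2 == Error entries
         "/c:50", "/rd:true", "/f:text"],
        capture_output=True, text=True, check=True,
    )
    # Crude split of the plain-text output into individual event blocks.
    hits = [b for b in result.stdout.split("\n\n") if "nvwgf2umx.dll" in b]
    print("%d recent error event(s) mention nvwgf2umx.dll" % len(hits))
    for block in hits:
        print(block)
        print("-" * 60)

if __name__ == "__main__":
    main()
[/code]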
PROJECT PROMETHEUS:
Intel Core i5 3570K @ 4.6GHz - Corsair H100 Liquid Cooling
Palit JetStream GTX680/EVGA Superclocked GTX680 in SLI
Asus P8Z77-V
Crucial M4 128GB SATA3 SSD/WD Caviar Green 3TB SATA3 HDD
Corsair Vengeance 2x4GB @ 1600
Seasonic M12II 850W Bronze Certified
Asus VG278H 1920x1080 120Hz 3D Vision 2
CM Storm Trooper
CM Storm Quickfire Pro (MX Black Switch)
CM Storm Spawn
Logitech Z623/Fiio E10 DAC/Superlux HD681 EVO/Superlux HD661