GTA V - Problems & Solutions List (Please keep GTA discussion here)
I still don't understand the percentage figures you're talking about (never mind, forget it), and I'm not sure I followed everything you said, but I can confirm that, after testing, [i]Multi-display/Mixed-GPU acceleration[/i] was useless just like you said, and that [i]Maximum pre-rendered frames[/i] at its default setting makes the game run better (an increase of 2-3 minimum fps), so thank you.
Now, to clarify once more: I tested in 3D at the beginning, but not anymore with these settings; that's right, and yes, I should have said so earlier. But as I already said, I figured that if these settings work well in 2D, they could also help in 3D, and that may well be the case. Note also that I used these settings before the patch, so their effectiveness is verified, at least in 2D; a lot of people have already confirmed it.
So our discussion isn't useless, is it?
Now, finally, to sum up and be totally clear, here are the only settings worth applying in the NVIDIA Control Panel if you want better performance in 2D (and maybe in 3D), plus better texture quality without any performance loss:
__________________________________________________
[color="green"][size="M"]Power Management mode: [b]Prefer maximum performance[/b]
Threaded optimization: [b]On[/b]
Anisotropic filtering: [b]16x[/b]
Texture filtering: [b]High quality[/b]
Vertical sync: [b]Use global setting (application-controlled)[/b][/size][/color]
__________________________________________________
As for Vsync, that only applies if you don't have a G-SYNC screen, of course.
And for a properly working Vsync, one last time:
__________________________________________________
[color="orange"][size="M"]Disable and enable again the in-game Active Vsync at every game start.[/size][/color]
__________________________________________________
Intel Core i7-4930K | GeForce GTX Titan X Superclocked (SLI) | G.Skill Trident X DDR3-2400 CL10 32 Go | ASUS Rampage IV Extreme | ASUS Xonar Phoebus | ASUS VG248QE (144 Hz)
@nolankotulan: OK, thanks, I'll give those specific settings a try later today. Yeah, any test is still worthwhile as long as it's comparable to others, because even negative results show what we don't have to worry about. I just wouldn't have run it multiple times.
I'll also try leaving vsync on for a test, and also your off-then-back-on test. If it is dropping to 30 fps because of some bug, that should definitely show up in the MIN.
The 99% piece is only relevant to people using the benchmark tool built into the game; I'm writing this out here for other people to review as well. For anybody running the benchmark, pass 4, I'm suggesting that the 99th percentile is the most interesting and most reliable number.
As a visual example, here is the benchmark result:
[code]Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 9.599676, 78.716019, 48.277714
Time in milliseconds(ms). (Lower is better). Min, Max, Avg
Pass 0, 12.703895, 104.170181, 20.713491
Frames under 16ms (for 60fps):
Pass 0: 327/5367 frames (6.09%)
Frames under 33ms (for 30fps):
Pass 0: 5251/5367 frames (97.84%)
Percentiles in ms for pass 0
50%, 20.00
75%, 24.00
80%, 25.00
85%, 26.00
90%, 28.00
91%, 28.00
92%, 28.00
93%, 29.00
94%, 29.00
95%, 30.00
96%, 30.00
97%, 31.00
98%, 33.00
99%, 36.00 <-- This one
=== SYSTEM ===
Windows 7 Home Premium 64-bit (6.1, Build 7601)
DX Feature Level: 11.0
Intel(R) Core(TM) i5-4670K CPU @ 3.40GHz (4 CPUs), ~3.4GHz
8192MB RAM
NVIDIA GeForce GTX 690, 2052MB, Driver Version 350.12
Graphics Card Vendor Id 0x10de with Device ID 0x1188
=== SETTINGS ===
Display: 1280x720 (FullScreen) @ 119Hz VSync OFFNVSTEREO
Tessellation: 0
LodScale: 1.000000
PedLodBias: 0.200000
...
[/code]
Note the percentiles in ms. I'm saying that I think the 99% number here, 36 ms, is the best representation of MIN from this benchmark. Nearly all frames were faster than 36 ms, and 1000/36 ≈ 28 fps.
(It says Pass 0, but that's because I disabled the others.)
The caveat for all these numbers, of course, is that this is my machine, only one set of hardware. That's why I've provided the settings, the technique, and which number to look at, in case someone else wants to add their hardware results.
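If anyone wants to sanity-check that arithmetic, or apply it to their own per-frame dump, here is a minimal Python sketch of the same idea; the frame times in it are made up for illustration, not taken from the run above:
[code]
import math

def percentile_ms(frame_times_ms, pct):
    """Nearest-rank percentile: the smallest frame time that pct% of frames are at or below."""
    ordered = sorted(frame_times_ms)
    rank = max(1, math.ceil(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

# Hypothetical frame times (ms) standing in for a real benchmark dump.
frame_times_ms = [14.2, 16.8, 20.1, 22.5, 25.0, 27.3, 30.4, 33.1, 36.0, 41.7]

p99 = percentile_ms(frame_times_ms, 99)
print("99th percentile frame time: %.1f ms" % p99)
print("Practical MIN estimate:     %.0f fps" % (1000.0 / p99))
# For the benchmark output above: 1000 / 36 ms = ~28 fps.
[/code]
The point is that one freak slow frame out of 5,367 can't drag this number down the way it drags down the raw Min.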
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
I have a feeling this could be graphics-card related, because after setting the options I described in an earlier post here, I have yet to see the 3D break.
Using GTX 780 SLI.
i7-4790K CPU 4.8Ghz stable overclock.
16 GB RAM Corsair
ASUS Turbo 2080TI
Samsung SSD 840Pro
ASUS Z97-WS3D
Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)
Obutto R3volution.
Windows 10 pro 64x (Windows 7 Dual boot)
Rockstar have implemented cheat protection. "All injectors, including ScriptHook cheat, are verified through memory when you connect to GTAO. Any failed checks will result in a ban."
So if anyone tries 3d migoto on this one, you can play singleplayer fine, but don't try and go online without removing it. 3d migoto is a wrapper rather than an injector so it might not trigger their anticheat, but still better safe than sorry.
https://www.reddit.com/r/pcgaming/comments/34ky0g/rockstar_now_issuing_bans_for_using_fov_mod_in/
Although it's not as if we have (or will likely need) a 3dmigoto fix for this one, just giving you all a heads up.
After applying the latest update today (2nd May), I have been playing the game in 3D with everything set to high for over 4 hours without a single issue. I haven't checked the frame rate, but everything is as smooth as butter.
Before this latest update the game would only play for around 10 minutes before the 3D broke.
Fingers crossed that this update has really addressed all the issues, because this game in 3D with the graphics maxed out is by far the best out there. It's almost photo-realistic.
Now it keeps crashing during the bullion heist. Really annoying, since it's a very long mission, and each time it crashes I'm right back at the beginning.
I hate the save system in this game. Checkpoints are way too far apart for a game jam packed with bugs.
And the latest update has made zero difference.
Hi, I have noticed a strange thing while trying to test my OC stability. I set up a 10-iteration benchmark (on pass 4) and the framerate gets worse with each new iteration.
I have done the test 3 times (don't compare the figures between the 3 runs; CPU and GPU were set differently each time):
Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 8.274319, 79.250854, 48.673489
Pass 1, 19.104586, 198.549500, 47.510914
Pass 2, 2.327339, 75.364395, 47.201565
Pass 3, 18.310551, 77.111816, 46.860889
Pass 4, 19.334860, 71.862122, 46.282482
Pass 5, 18.710052, 77.659996, 45.896805
Pass 6, 19.118412, 156.030640, 45.469990
Pass 7, 18.935085, 193.879013, 45.192516
Pass 8, 17.472658, 65.274292, 44.333344
Pass 9, 18.856533, 201.542023, 43.399326
Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 5.707793, 265.698730, 49.062191
Pass 1, 18.752640, 260.627563, 47.403358
Pass 2, 18.996603, 253.752747, 47.562878
Pass 3, 23.152124, 69.413490, 46.640488
Pass 4, 19.486277, 71.179680, 46.427822
Pass 5, 6.964149, 256.152924, 46.526970
Pass 6, 18.044312, 205.736710, 46.035938
Pass 7, 2.516500, 236.855804, 45.867451
Pass 8, 18.352591, 166.925308, 44.789360
Pass 9, 18.917986, 117.748550, 44.289425
Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 8.717572, 71.694328, 51.182423
Pass 1, 8.151101, 74.096054, 49.872566
Pass 2, 18.490429, 74.780365, 49.840721
Pass 3, 19.850199, 74.936859, 49.617954
Pass 4, 20.692564, 77.065781, 49.091450
Pass 5, 16.665167, 240.174377, 48.653721
Pass 6, 21.086842, 72.685562, 48.114239
Pass 7, 18.944452, 165.434296, 47.584011
Pass 8, 19.800438, 160.755219, 46.477692
Pass 9, 20.585236, 227.846436, 45.944942
CPU and memory usage was more or less constant between iterations.
Tomorrow I'm receiving a 980 SLI :-)
Let's see how it compares with my current Titan SLI!
And my Titans don't OC very well, so I should get even more from the 980s...
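To put a rough number on that drift, here is a small Python sketch that fits an ordinary least-squares line to the Avg column of the first run above; nothing in it is game-specific:
[code]
# Quantify the per-iteration slowdown, using the Avg fps column
# from the first of the three runs above.
avg_fps = [48.673489, 47.510914, 47.201565, 46.860889, 46.282482,
           45.896805, 45.469990, 45.192516, 44.333344, 43.399326]

n = len(avg_fps)
xs = list(range(n))
mean_x = sum(xs) / n
mean_y = sum(avg_fps) / n

# Ordinary least-squares slope: average fps change per additional benchmark pass.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, avg_fps))
         / sum((x - mean_x) ** 2 for x in xs))

print("Average change per pass:  %.2f fps" % slope)   # about -0.5 fps per pass here
print("Drift over the 10 passes: %.1f fps" % (slope * (n - 1)))
[/code]
Eyeballing the other two runs, the drift looks about the same, roughly half an fps lost per pass, even though the Min and Max columns swing all over the place.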
Added another 30 data points to the spreadsheet for performance testing: [url=https://docs.google.com/spreadsheets/d/14HuIxQpcU2gekkXW6s2esojYJ5BSKDhoAJfOtwKknbA/edit#gid=2115501553]GTA 5 3D Performance[/url]
The short story is that only one thing improved performance: overclocking the CPU.
[code]Convenient data table, 3D only, SLI 2, driver 350.12, 3 runs per test:
27.0 28.6 29.4 CP settings, vsync on, Adaptive on.
24.4 27.8 26.3 CP settings, vsync off, adaptive On
27.0 27.8 27.0 CP default, vsync off.
26.3 27.0 29.4 CP default, vsync on.
23.8 24.4 25.0 CP default, vsync on at 30 cap
27.8 26.3 26.3 new nolankotulan CP, vSync on
27.8 26.3 27.0 new nolankotulan CP, manual vsync off, then back on.
27.8 26.3 27.8 Reddit settings+smooth
23.3 25.6 25.6 CP default, vsync off, CPU stock clock 3.8GHz max.[/code]
My conclusions (last column/3rd run is perhaps best to use):
1) @nolankotulan: I followed your settings from post 446, and cannot reproduce any improvement. Being overly generous, we might say it adds 5% to MIN, but realistically it's all within the variance of the benchmark. Note the 29.4 best-case value for both CP default and your earlier CP settings.
2) @nolankotulan: I also did three runs of your manual vsync off, then back on test, and cannot reproduce any performance improvement. Possibly there is a stability benefit, but I see no performance bump. I ran tests back to back, changing one parameter at a time for comparison, so compare to the previous row with just vsync on.
3) Trying to make sure that the benchmark test at 99% is valid, I tested with vsync set to cap at 30, line 7 above. Note that it made a measurable difference in MIN of perhaps 10%.
4) Not shown in the table above, but shown on spreadsheet is that the Reddit parameters from the [url=http://i.imgur.com/hrQgXpi.png]Img[/url] did in fact improve the [u]2D[/u] performance. But does nothing for 3D.
5) These tests were all done with CPU overclocked to 4.2GHz Max for 4 cores. Dropping back to stock clocks is line 11, and we can see a perhaps 10% drop in MIN. Normally we'd expect better than a simple 10% bump, which suggests CPU is a limiting factor, but not a bottleneck.
6) CPU overclock and driver 350.12 are so far the [i]only [/i]things tested that measurably improve MIN performance in 3D.
Anyone: Please let me know if you think the test is flawed, or I've made a mistake.
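For anyone who wants to run the same comparison on their own numbers, here is a short Python sketch of how I'd weigh each configuration against the run-to-run noise; the values are copied from the table above and the labels are just shorthand:
[code]
# Compare each configuration's mean MIN (fps from the 99th-percentile time)
# against the CP-default baseline and against the spread between repeat runs.
results = {
    "CP settings, vsync on, adaptive": [27.0, 28.6, 29.4],
    "CP default, vsync off":           [27.0, 27.8, 27.0],
    "CP default, vsync on":            [26.3, 27.0, 29.4],
    "CP default, vsync on, 30 cap":    [23.8, 24.4, 25.0],
    "CP default, stock 3.8GHz CPU":    [23.3, 25.6, 25.6],
}

baseline = sum(results["CP default, vsync off"]) / 3.0

for name, runs in results.items():
    mean = sum(runs) / len(runs)
    spread = max(runs) - min(runs)   # run-to-run variance of the benchmark itself
    delta = 100.0 * (mean - baseline) / baseline
    print("%-33s mean %.1f  spread %.1f  vs baseline %+.1f%%" % (name, mean, spread, delta))
[/code]
Run it and the CP-settings gain (about +4%, or ~1 fps) sits within the up-to-3 fps spread between repeat runs, while the 30-cap and stock-clock rows land roughly 9-10% below baseline, which is where conclusions 1, 3 and 5 come from.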
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
[quote="cyrilp"]Hi i have noticed a strange thing while trying to test my oc stability. I have set a 10 iterations benchmark (on pass 4) and the framerate is geting worse at each new iteration.
I have done the test 3 times (don't compare the figures between the 3 times, CPU and GPU were set differently each time)
... data
CPU and memory usage was more or less constant between iterations[/quote]
It's hard to reconcile those numbers. The Average is drifting down, but the Min and Max both get better, or perhaps just swing wildly.
Since you probably still have the benchmark files on hand, please snip out the numbers for the 99% level in Percentiles section. It's my theory that this is the best parameter for MIN. You can get all 10 from the single last file.
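If it saves anyone some manual scrolling, here is a rough Python sketch that pulls those percentile rows out of a benchmark results file. The file name is just a placeholder (point it at wherever your copy writes its results); the line format is taken from the output posted earlier in the thread:
[code]
import re

# Placeholder path -- substitute the actual benchmark results file on your machine.
BENCHMARK_FILE = "benchmark_results.txt"
WANTED = "99%"          # the percentile row to extract

current_pass = None
with open(BENCHMARK_FILE) as f:
    for line in f:
        header = re.match(r"Percentiles in ms for pass (\d+)", line)
        if header:
            current_pass = header.group(1)
            continue
        if current_pass is not None and line.strip().startswith(WANTED):
            ms = float(line.split(",")[1])
            print("pass %s: %s = %.0f ms (~%.0f fps)" % (current_pass, WANTED, ms, 1000.0 / ms))
[/code]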
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
Bo3b, it might be worth disabling the NVIDIA Streamer Service, as described in the thread I linked earlier. I wouldn't expect a big difference, but some people are reporting a couple of frames of gain.
http://www.reddit.com/r/pcmasterrace/comments/344z7f/nvidia_usersraise_your_fps_and_save_cpu_usage_by/
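For reference, here is a sketch of doing that from a script instead of services.msc. The service name below is an assumption (it's what GeForce Experience installs the Streamer Service as, to my understanding), so check the exact name in services.msc first, and run it from an elevated prompt:
[code]
import subprocess

# Assumed service name for the NVIDIA Streamer Service -- verify in services.msc,
# since it can differ between driver / GeForce Experience versions.
SERVICE = "NvStreamSvc"

subprocess.run(["sc", "stop", SERVICE], check=False)                          # stop it if it's running
subprocess.run(["sc", "config", SERVICE, "start=", "disabled"], check=False)  # stop it auto-starting
# To undo: sc config NvStreamSvc start= demand
[/code]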
No Steam service for my case, as it's the R* version. In any case, I disabled those Steam streaming/broadcast things, and did not install GeForce Experience.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
As of today, May 3rd 2015, my own gameplay experience in 3D with GTA V has significantly improved since its initial release, thanks to the last couple of patches. Have they cured the glitches that have been mentioned in this thread? No, not for me. However, the time that elapses before they eventually occur appears to be significantly longer now. Of course, now that I have stated this again, I'll have to expect a glitch just a few minutes in when I next play, like before. Be that as it may, it would be useful to gauge how players are faring at the moment. Some were able to discover the right combination that works for them straight after release. Is that still the case? Others were having significant problems at the start. Once again, is that still the case now?
Is anybody nearer to seeing a reason as to why these problems are still occurring? How much of an impact does anyone think that the following factors might be having?
1. Using a 900 Series card, or, using a pre-900 Series card.
2. Using a single card, or using SLI.
3. Updating to a newer/newest driver.
4. The version of Direct X being used.
5. Regularly updating MS Windows (Which I do not do).
Any thoughts would be helpful, just to see if any sort of pattern emerges.
Below is my latest reply to Rockstar, dated 1.5.2015, although it is important to state that there was a further patch from Rockstar since then.
"Hi there,
I've been advised by Arnold G., a Tier 1 TSA, to contact somebody at Rockstar regarding the prevailing issues with stereoscopic 3D that I've already raised initially through Zendesk. NVIDIA drivers ranging from 320.XX up to 350.12 are not providing a stable stereo 3D platform for GTA V. The 331.82 driver I currently run is older and more stable; it perhaps does not optimise frame rates anywhere near as well as newer NVIDIA drivers, but it has been a very reliable driver that let me get every game I wanted working properly in 3D.
As of today's update (May 1st 2015), the following issues with respect to 3D Vision 2 stereoscopy appear not to have been resolved so far. These are:
1. Brief flickering, or a change of focus in 3D, usually lasting under a second before self-correcting.
2. Shadows permanently rendering incorrectly at 2D screen depth, sometimes 2-3 minutes after loading, but sometimes well over an hour into gameplay. These incorrectly rendered shadows do not correct themselves in-game and cannot be corrected in-game, making 3D viewing impossible.
3. The game crashes to the desktop on occasion, again possibly related to the problems GTA V currently has rendering in 3D.
4. Grass cannot be rendered above Normal settings without causing problems. Text only showing in one eye is also an issue.
Point number 4 does NOT apply to me, simply because I am using an older driver which has better SLI functionality and predates the introduction of NVIDIA's comparatively inferior 'Compatibility Mode'. I am using Ultra grass settings with no problems, whereas many other 3D Vision 2 users cannot enable this setting with newer cards or up-to-date drivers.
The game's 3D visuals themselves, as I have been at pains to point out to your aforementioned colleagues, are of a truly phenomenal quality, before any rendering errors eventually do occur. Visually in 3D, the game is a masterpiece of programming, artwork and design. I can't speak highly enough about it.
Unfortunately, GTA V's in-game recording facility does not capture the rendering errors themselves, and in fact renders the shadows perfectly upon playback, so I have no visual proof. The only sign is that when a rendering glitch is actually occurring while recording, pressing Alt-F1 causes the game to crash to the desktop, confirming that all is not well.
I was hoping to get confirmation that certain individuals at Rockstar have themselves been able to get GTA V running correctly in stereoscopic 3D without any issues. We at the GeForce 3D forum on NVIDIA's website are truly struggling to see how that is possible, considering the universal issues that we are all experiencing. I stress that this is not in any way a criticism of Rockstar. We are all genuinely interested in how somebody at Rockstar is managing to get it working in 3D stereo.
I would be most grateful for any response.
Best regards,"
I intend keeping my ticket open in order to provide direct feedback. I will have to simply assume, although obviously I can't confirm this, that Rockstar technicians are monitoring this thread for reactions to patches.
[quote="bo3b"]
It's hard to reconcile those numbers. The Average is drifting down, but the Min and Max both get better, or perhaps just swing wildly.
Since you probably still have the benchmark files on hand, please snip out the numbers for the 99% level in Percentiles section. It's my theory that this is the best parameter for MIN. You can get all 10 from the single last file.[/quote]
Here you are:
Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 7.774801, 143.631271, 48.527340
Pass 1, 17.938318, 144.747345, 47.843216
Pass 2, 2.180034, 143.580078, 47.563763
Pass 3, 19.541544, 142.854507, 46.892235
Pass 4, 17.526812, 142.582733, 46.748287
Pass 5, 19.237089, 144.016403, 46.236950
Pass 6, 18.798553, 143.759415, 45.663731
Pass 7, 19.240650, 139.249664, 45.415791
Pass 8, 19.004305, 141.356918, 44.776154
Pass 9, 18.993664, 144.721344, 44.263580
Percentiles in ms for pass 0
90%, 26.00
Percentiles in ms for pass 1
90%, 25.00
Percentiles in ms for pass 2
90%, 25.00
Percentiles in ms for pass 3
90%, 26.00
Percentiles in ms for pass 4
90%, 26.00
Percentiles in ms for pass 5
90%, 26.00
Percentiles in ms for pass 6
90%, 26.00
Percentiles in ms for pass 7
90%, 26.00
Percentiles in ms for pass 8
90%, 26.00
Percentiles in ms for pass 9
90%, 27.00
Actually, I'm afraid one of my Titans seems to be behaving strangely; I'll check it again tomorrow when I install the 980 SLI.