[quote="Zappologist"]Yes Bo3b, I had indeed tried Black Flag with the most recent fix, before giving up on it and uninstalling.
It was still stuttering.
At this point, with SLI 980s, I'm willing to accept that I will probably not play any Ubisoft games before I upgrade the CPU and motherboard in the future (or before Ubi figures out what's wrong with their games)
It's ok, I'm beginning to be fine with it, especially with you guys fixing so many other games, and I have great performance in all non-Ubi games, no matter how demanding they are (Last Light in 3D is simply stunning, same as open world Shadow of Mordor).
But still, my upgrade to 980s is hands down the most disappointing upgrade I've ever done, in terms of performance gained. So I prefer to let my story be known from time to time, so that people may understand that just going to more powerful cards will not solve stuttering (and by no means do I have an antiquated PC build - 3DVision 2 on Asus VG278H 27", SLI GTX980 Strix, i7-2600K 3.70GHz, 8GB RAM, Win7 64bit on Intel SSD)[/quote]
OK, fair enough. It's weird though, because I'm not getting stutter in AC4 anymore with that fix. It's not high performance - I average 45 fps with SLI 760 - but it's a smooth experience. I also no longer get stutter in WatchDogs with it tuned. Changing the textures from Ultra to Medium made the biggest difference, and even though I have 4GB cards, going below High fixed the stutter during driving.
It's a good lesson that performance for games is definitely not all about the GPU. I used to think that, but the biggest change for me in 3D was to get a better CPU. I think that 3D hammers the CPU much, much harder than 2D, and the CPU is often an afterthought component in game builds.
For your rig there, I'd definitely say that CPU is next on your agenda. The 2600K is a great chip, but since it's possible to do better today, I think that would make the most difference in your game computer. I specifically went with an i5 because almost zero games use more than two threads, and I can overclock a bit better with no hyper-threading.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
Perfectly said, Bo3b.
I hope more and more that our findings will put the spotlight on the CPU. If your theory is right that the CPU is extra hammered by 3D (and that hyper-threaded CPUs are counter-productive in 3D gaming), then this would explain everything I'm experiencing currently.
If I wasn't so lazy, I'd buy the best i5 I could find for my Gigabyte Z68XP-UD3 and test your theory. But I could barely fit the 980s in my Antec case, and I'd have to play with thermal paste again :-)
Do you think it's worth it to put a new CPU (socket 1155) in my build, or are the latest i5s not even compatible with my motherboard? I've tried briefly to check the compatibility charts here - http://www.cpu-upgrade.com/mb-Gigabyte/GA-Z68XP-UD3(rev._1.3).html
But the latest i5 displayed is the i5-3570K, which costs 232 EUR on Amazon! Not at all cheap, and I wonder if it's really a big enough improvement on the i7-2600K to make the experiment worthwhile.
Interested in your thoughts. But it's not something I'll do lightly, since apart from some Ubisoft games my 3D performance is awesome. I think it could wait until the next full upgrade.
On the other hand (I play mind games with myself, it seems), users such as you, Pirate, etc. are not reporting such pervasive stutters, especially with the optimised wrapper versions, so it could indeed all point to a CPU bottleneck. Otherwise there would be many more users here complaining about this phenomenon, I think.
EDIT
The CPU upgrade dilemma is complicated even further by advice such as Tom's Hardware's, every time they update their CPU chart - http://www.tomshardware.com/reviews/gaming-cpu-review-overclock,3106-5.html
"I don’t recommend upgrading your CPU unless the potential replacement is at least three tiers higher. Otherwise, the upgrade is somewhat parallel and you may not notice a worthwhile difference in game performance"
On their list, my i7 2600K is at the topmost tier, and the presumably better-for-3D i5 3570K is two tiers [u]below[/u].
I imagine TH's view is the mainstream one on tech websites. In this context, your theory is even more sobering, if true - and reveals the current views as incorrect, at least for 3D gaming.
I'm not sure what to think of those tables from Toms. When I look at their CPU charts specifically, I note that the vast majority of those benchmarks don't have anything to do with gaming, and are heavily skewed to multi-threaded stuff. If that's true, then the chart is meaningless for gaming, because the entire problem is that games are not seriously multi-threaded.
I also do not agree with the idea that you should jump 3 tiers for a good upgrade; that is just silly. It totally depends upon the job in question. If you did video editing for a living, for example, you should jump for every chip that adds more threads. I personally jumped two tiers from an 860 to a 4670K, and it was a great bump in 3D game performance.
For your case, I'm not sure that an i5 would make any difference here on your current motherboard. Not enough delta there over your already strong 2600K for gaming.
If I were in your position, I'd keep the 2600K for a while longer, but try overclocking it. Correct me if I'm wrong, but you are running stock clocks, which is very likely the cause of the Unity games' stutter.
Here's a particularly good comparison of CPUs on WatchDogs:
[url]http://pclab.pl/zdjecia/artykuly/chaostheory/2014/05/watch_dogs/charts/wd_cpu_gf.png[/url]
Note that the 2600K when OCed is comparable to modern chips. Also note how far back the 2600K falls when not OCed.
When I went with an i5, the theory was that it would allow more headroom for overclocking, and I don't believe that panned out. An i7 is a better choice for other stuff, and I think the OC headroom on the Haswell chips is on the weak side in general.
Overclocking on the 2600K, though, is renowned to be particularly good.
So, if I were you, I'd buy a water cooler (closed-loop, like Kraken), and OC your chip. Sure it lowers the life span, but so what? It's not going to be viable another 3 years anyway, and it's arguably not viable now.
No guarantee, but if you are stock-clocking, that would be my best guess for your Ubi stutter.
[quote="2Cb"]
Currently I'm sporting two gtx 780ti's with a hexacore intel cpu. I have noticed, for example, when I play AC Unity on 1080p on ULTRA with 4x msaa it runs perfectly smooth (60+ fps) with absolutely no stutter.
Now when I turn on 3D (I know its 3D compatibility mode but lets put that aside for a minute) my fps is still solid around 60fps but I do get a little repeated stutter here and there. Now if you know me, you know my games must run perfectly at maximum graphical settings - that is what I game for - so I cannot stand stutter.[/quote]
Simply lower the graphics settings a little, to reduce VRAM usage, and then check for stutter. No need to hypothesize or conjecture or poll the forum. Just run a quick test at lower graphics settings (and lower VRAM usage). Is there any improvement, yes or no?
At one time I posted a link to Microsoft (but I can't seem to find it) where they detailed the expected relationship between core parking and power profiles. The cores default to a parked status. As I recall, Microsoft told game programmers that they should expect to load the first core heavily and use any remaining cores for overspill. This constant parking/unparking of the subsequent cores resulted in microstutters in some instances for some users.
If you haven't, I'd suggest unparking your cores.
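For anyone wanting to try this, core parking can be adjusted from an elevated Command Prompt via a hidden power-plan setting. A sketch only, assuming a stock Windows power plan; CPMINCORES is the powercfg alias for the "processor performance core parking min cores" setting, but check it against your own plan before applying anything:

```shell
:: Unhide the core-parking setting so it appears in Power Options
powercfg -attributes SUB_PROCESSOR CPMINCORES -ATTRIB_HIDE

:: Require 100% of cores to stay unparked while on AC power,
:: then re-apply the current scheme so the change takes effect
powercfg -setacvalueindex SCHEME_CURRENT SUB_PROCESSOR CPMINCORES 100
powercfg -setactive SCHEME_CURRENT
```

You can confirm the result in Resource Monitor: parked cores are labeled "Parked" on the CPU tab, and should stop appearing after the change.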
I found it. My memory failed me! (there are plenty of youtube videos relating to microstutter and core parking though)
The link is more about high-resolution timing and how to implement timing code in a way that avoids the problems associated with using RDTSC.
http://msdn.microsoft.com/en-us/library/windows/desktop/ee417693(v=vs.85).aspx
3. Compute all timing on a single thread. Computation of timing on multiple threads — for example, with each thread associated with a specific processor — greatly reduces performance of multi-core systems.
4. Set that single thread to remain on a single processor by using the Windows API SetThreadAffinityMask. Typically, this is the main game thread. While QueryPerformanceCounter and QueryPerformanceFrequency typically adjust for multiple processors, bugs in the BIOS or drivers may result in these routines returning different values as the thread moves from one processor to another. So, it's best to keep the thread on a single processor.
All other threads should operate without gathering their own timer data. We do not recommend using a worker thread to compute timing, as this will become a synchronization bottleneck. Instead, worker threads should read timestamps from the main thread, and because the worker threads only read timestamps, there is no need to use critical sections.
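The quoted advice can be sketched in Python, with time.perf_counter standing in for QueryPerformanceCounter (the SetThreadAffinityMask step is Windows-specific and omitted here): only the main loop queries the timer, and worker threads merely read the timestamp it publishes, with no locking needed.

```python
import threading
import time

latest_stamp = 0.0  # written by the main thread only


def worker(results, idx):
    # Workers never call the timer themselves; a plain read of the
    # published value needs no critical section.
    results[idx] = latest_stamp


def run_frame_loop(frames=3, workers=4):
    """Time several 'frames' from a single thread and fan the stamp out."""
    global latest_stamp
    collected = []
    for _ in range(frames):
        latest_stamp = time.perf_counter()  # the single timing source
        results = [None] * workers
        threads = [threading.Thread(target=worker, args=(results, i))
                   for i in range(workers)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        # Every worker saw the identical stamp for this frame.
        assert all(r == results[0] for r in results)
        collected.append(results[0])
    return collected


print(len(run_frame_loop()))  # prints 3
```

In a real engine the workers would read the stamp mid-frame rather than at join time, but the division of labor is the same: one timing thread, many readers.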
[quote="Partol"][quote="2Cb"]
Currently I'm sporting two gtx 780ti's with a hexacore intel cpu. I have noticed, for example, when I play AC Unity on 1080p on ULTRA with 4x msaa it runs perfectly smooth (60+ fps) with absolutely no stutter.
Now when I turn on 3D (I know its 3D compatibility mode but lets put that aside for a minute) my fps is still solid around 60fps but I do get a little repeated stutter here and there. Now if you know me, you know my games must run perfectly at maximum graphical settings - that is what I game for - so I cannot stand stutter.[/quote]
Simply lower the graphic settings a little, to reduce vram usage, and then check for stutter. No need to hypothesis or conjecture or poll the forum. Just run a quick test at lower graphics settings (and lower vram usage). Is there any improvement? yes or no?[/quote]
Your advice is appreciated, but if you read more carefully you would have understood that I am not interested in lowering settings. As an enthusiast, I only want to play on Ultra. I have stutter ONLY in 3D with 4x MSAA in Unity - no stutter in 2D - and am looking to upgrade as necessary to have no stutter in this game, or in any future game that consumes more than 3GB of VRAM. Lowering settings can have an impact on many different components.
ASUS VG278H - 3D Vision 2 - Driver 358.87 - Titan X SLI@1519Mhz - i7-4930K@4.65GHz - 16GB RAM - Win7x64 - Samsung SSD 850 PRO (256GB) and Samsung EVO 850 (1TB) - Full EK Custom Waterloop - Project Milkyway Galaxy (3D Mark Firestrike Hall of Famer)
Hi Bo3b,
Indeed I am running everything at stock, not particularly experienced with overclocking.
I will probably try some OC from my motherboard software at some point, if I ever dare install a Ubisoft game again. If the stutters disappear, I'll report back (although most of the civilised world will probably hear me scream anyway lol)
But if your theory is correct, then... are most people overclocking? Or maybe they're getting stutters and not reporting them here. Anyway, if it's only me who is plagued by this, all the better. There are enough funky issues with gaming performance lately. (For example, that AC: Unity thread that someone posted in a different thread about the upcoming patch 4. Console gamers have already tried out the patch, and for some users the fps is perfect now, while others have the same crappy frame rate issues. If this is possible on machines with standardised hardware, what chance do we PC players have, right? LOL)
Oh yeah, you should definitely OC. The 2600K is a superb chip for that. It's pretty much as easy as setting numbers for the chip in the UI. Your 2600K is unlocked, which means that it's easier than ever.
You can OC on air too; you just get better results with water. The strategy is to go slow, make single changes at a time, and watch the CPU temps. That's it. Run Prime95 as a stress test, and see what temps you get. Stay under 85°C when fully loaded with Prime95. You stop bumping the numbers up when you either cannot stay under 85°C, or you get crashes or dead threads in Prime95.
Give it a try, it's easy and will help your frame rates on any CPU bound game.
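The stepping strategy above amounts to a simple loop: raise the multiplier one notch at a time and keep the last step that stayed cool and stable. A toy sketch; read_package_temp and passes_stress_test are hypothetical stand-ins for watching a hardware monitor and running Prime95, and nothing here touches real hardware:

```python
TEMP_LIMIT_C = 85  # the "stay under 85" rule from the post above


def find_stable_multiplier(start, ceiling, read_package_temp, passes_stress_test):
    """Return the highest multiplier that stays under the temp limit and stable."""
    best = start
    for mult in range(start + 1, ceiling + 1):
        if read_package_temp(mult) > TEMP_LIMIT_C or not passes_stress_test(mult):
            break  # too hot, or Prime95 errored: keep the last good step
        best = mult
    return best


# Simulated rig: temps climb ~3C per notch from a 60C baseline,
# and the chip goes unstable past x46.
fake_temp = lambda m: 60 + 3 * (m - 34)
fake_stress = lambda m: m <= 46

print(find_stable_multiplier(34, 50, fake_temp, fake_stress))  # prints 42
```

In this simulation the temperature ceiling is hit before instability is, which matches the usual experience on air cooling.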
[quote="2Cb"][quote="Partol"][quote="2Cb"]
Currently I'm sporting two gtx 780ti's with a hexacore intel cpu. I have noticed, for example, when I play AC Unity on 1080p on ULTRA with 4x msaa it runs perfectly smooth (60+ fps) with absolutely no stutter.
Now when I turn on 3D (I know its 3D compatibility mode but lets put that aside for a minute) my fps is still solid around 60fps but I do get a little repeated stutter here and there. Now if you know me, you know my games must run perfectly at maximum graphical settings - that is what I game for - so I cannot stand stutter.[/quote]
Simply lower the graphic settings a little, to reduce vram usage, and then check for stutter. No need to hypothesis or conjecture or poll the forum. Just run a quick test at lower graphics settings (and lower vram usage). Is there any improvement? yes or no?[/quote]
Your advice is appreciated but if you would read more carefully you would have understood I am not interested in lowering settings. As an enthusiast I only want to play on ultra. I have ONLY stutter in 3D with 4 x msaa in unity - no stutter in 2D - and am looking to upgrading as necessary to have no stutter for this game or any game in the future that consumes more than 3GB VRAM. Lowering settings can have impact on many different components.[/quote]
Well, if you are getting stutter in Unity with 4x MSAA, turn that down a bit and see if it helps - only as a test, we are not telling you to turn it down permanently. If it does get rid of the stutter, you might be able to conclude that it was a VRAM issue, as MSAA takes up a lot of VRAM.
Like was stated too, OC your CPU. That 2600K can get as high as 4.5 stable, even higher if on water.
I don't have either game, but it could be a VRAM issue, especially since you like Ultra settings.
I did not see - do you run triple monitors?
I also do not agree with the idea that you should jump 3 tiers for a good upgrade, that is just silly. It totally depends upon the job in question. If you did video editing for a living for example, you should jump for every chip that adds more threads. I personally jumped two tiers from an 860 to 4670K, and it was a great bump in 3D game performance.
For your case, I'm not sure that an i5 would make any difference here on your current motherboard. Not enough delta there over your already strong 2600K for gaming.
If I were in your position, I'd keep the 2600K for awhile longer, but try overclocking it. Correct me if I'm wrong, but you are running stock clocks, which is very likely to be the problem for unity games stutter.
Here's a particularly good comparison of CPUs on WatchDogs:
http://pclab.pl/zdjecia/artykuly/chaostheory/2014/05/watch_dogs/charts/wd_cpu_gf.png
Note that the 2600K when OCed is comparable to modern chips. Also note how far back the 2600K falls when not OCed.
When I went with an i5, the theory was that it would allow more headroom for overclocking, and I don't believe that panned out. An i7 is a better choice for other tasks, and I think the OC headroom on the Haswell chips is on the weak side in general. Overclocking on the 2600K, though, is renowned to be particularly good.
So, if I were you, I'd buy a closed-loop water cooler (like a Kraken) and OC your chip. Sure, it lowers the life span, but so what? It's not going to be viable in another three years anyway, and it's arguably not viable now.
No guarantee, but if you are stock-clocking, that would be my best guess for your Ubi stutter.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
Simply lower the graphics settings a little to reduce VRAM usage, and then check for stutter. No need to hypothesize or conjecture or poll the forum. Just run a quick test at lower graphics settings (and lower VRAM usage). Is there any improvement, yes or no?
Thief 1/2/gold in 3D
https://forums.geforce.com/default/topic/523535/3d-vision/thief-1-2-and-system-shock-2-perfect-3d-with-unofficial-patch-1-19
http://photos.3dvisionlive.com/Partol/album/509eb580a3e067153c000020/
[Acer GD245HQ - 1920x1080 120Hz] [Nvidia 3D Vision]
[MSI H81M-P33 with Pentium G3258 @ 4.4GHz and Zalman CNPS5X][Transcend 2x2GB DDR3]
[Asus GTX 750 Ti @ 1350MHz] [Intel SSD 330 - 240GB]
[Creative Titanium HD + Beyerdynamic DT 880 (250ohm) headphones] [Windows 7 64bit]
If you haven't, I'd suggest unparking your cores.
The link is more about high-resolution timing and how to implement timing code in a way that avoids the problems associated with using RDTSC.
http://msdn.microsoft.com/en-us/library/windows/desktop/ee417693(v=vs.85).aspx
3. Compute all timing on a single thread. Computation of timing on multiple threads — for example, with each thread associated with a specific processor — greatly reduces performance of multi-core systems.
4. Set that single thread to remain on a single processor by using the Windows API SetThreadAffinityMask. Typically, this is the main game thread. While QueryPerformanceCounter and QueryPerformanceFrequency typically adjust for multiple processors, bugs in the BIOS or drivers may result in these routines returning different values as the thread moves from one processor to another. So, it's best to keep the thread on a single processor.
All other threads should operate without gathering their own timer data. We do not recommend using a worker thread to compute timing, as this will become a synchronization bottleneck. Instead, worker threads should read timestamps from the main thread, and because the worker threads only read timestamps, there is no need to use critical sections.
Your advice is appreciated, but if you read more carefully you would have understood that I am not interested in lowering settings. As an enthusiast I only want to play on Ultra. I ONLY have stutter in 3D with 4x MSAA in Unity - no stutter in 2D - and am looking to upgrade as necessary to have no stutter in this game, or any future game that consumes more than 3GB of VRAM. Lowering settings can have an impact on many different components.
ASUS VG278H - 3D Vision 2 - Driver 358.87 - Titan X SLI@1519Mhz - i7-4930K@4.65GHz - 16GB RAM - Win7x64 - Samsung SSD 850 PRO (256GB) and Samsung EVO 850 (1TB) - Full EK Custom Waterloop - Project Milkyway Galaxy (3D Mark Firestrike Hall of Famer)
G-Pat on Helixmod
Indeed, I am running everything at stock; I'm not particularly experienced with overclocking.
I will probably try some OC from my motherboard software at some point, if I ever dare install a Ubisoft game again. If the stutters disappear, I'll report back (although most of the civilised world will probably hear me scream anyway lol )
But if your theory is correct, then... are most people overclocking? Or maybe they're getting stutters and not reporting them here. Anyway, if it's only me who is plagued by this, all the better. There are enough funky issues with gaming performance lately. (For example, that AC: Unity thread that someone posted in a different thread about the upcoming patch 4. Console gamers have already tried out the patch, and for some users the fps is perfect now, while others have the same crappy frame rate issues. If this is possible on machines with standardised hardware, what chance do we PC players have, right? LOL)
You can OC on air too; you just get better results with water. The strategy is to go slow, one change at a time, and watch the CPU temps. That's it. Run Prime95 as a stress test and see what temps you get. Stay under 85C when fully loaded with Prime95. You stop bumping the numbers up when you either cannot stay under 85C, or you get crashes or dead threads in Prime95.
Give it a try, it's easy and will help your frame rates on any CPU bound game.
Well, if you are getting stutter in Unity with 4x MSAA, turn that down a bit and see if it helps - only as a test; we are not telling you to turn it down permanently. If it does get rid of the stutter, you might be able to conclude that it was a VRAM issue, as MSAA takes up a lot of VRAM.
Like was stated too, OC your CPU. That 2600K can get as high as 4.5GHz stable, even higher if on water.
I don't have either game, but it could be a VRAM issue, especially since you like Ultra settings.
I did not see, but do you run triple monitors?
Intel i5 7600K @ 4.8ghz / MSI Z270 SLI / Asus 1080GTX - 416.16 / Optoma HD142x Projector / 1 4'x10' Curved Screen PVC / TrackIR / HOTAS Cougar / Cougar MFD's / Track IR / NVidia 3D Vision / Win 10 64bit