How much does DDR4 timings matter for 3D gaming?
Basically, is it the same as 2D gaming, where there are diminishing returns on anything over 3200 (with low CAS)? Or can 3D gaming benefit more than usual from going big on your DDR4?
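
To put rough numbers on the "low CAS" part: first-word latency is just CAS latency divided by the memory clock (half the transfer rate). A quick sketch, using a few illustrative kits rather than specific recommendations:

[code]
# Rough first-word latency for a few illustrative DDR4 kits (not recommendations).
# Memory clock is half the transfer rate, so latency_ns = CL / (MT/s / 2) * 1000.
kits = {
    "DDR4-3000 CL16": (3000, 16),
    "DDR4-3200 CL14": (3200, 14),
    "DDR4-3600 CL16": (3600, 16),
    "DDR4-4266 CL19": (4266, 19),
}

for name, (mts, cl) in kits.items():
    latency_ns = cl / (mts / 2) * 1000
    print(f"{name}: ~{latency_ns:.2f} ns first-word latency")
[/code]

A good 3200 CL14 kit already lands around 8.75 ns, which is roughly where much faster (and pricier) kits sit, hence the diminishing-returns question.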

I am upgrading to an i9 9900K, as I have been CPU-bottlenecked in several games, and this is my first time upgrading the CPU/RAM/motherboard. Coming from a 4790K with 16GB of DDR3 1600, this PC is for nothing but gaming (90% 3D, on a 4K TV, but mostly forced to settle for 1440p/60). I know AMD CPUs usually benefit more from faster DDR4 than Intel does... but I was reading that the i9s might actually be more sensitive to faster RAM and see higher gains?

I am also open to motherboard suggestions. I was thinking ASUS ROG STRIX Z390-E GAMING, since I am coming from an Asus Z97 Sabertooth Mark 2, but I really have no preference. I am not a big overclocker, so the Hero would be overkill for me.

#1
Posted 10/26/2018 02:59 PM   
I guess no one really knows. I haven't actually felt the need to upgrade from DDR3 and my 4790K, and I accept barely anything less than 60 fps. Then again, I'm not going to go to 4K in the foreseeable future, since I game (almost) exclusively in 3D Vision.

What graphics card are you using btw?

i7-4790K CPU 4.8Ghz stable overclock.
16 GB RAM Corsair
EVGA 1080TI SLI
Samsung SSD 840Pro
ASUS Z97-WS
3D Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)
Obutto R3volution.
Windows 10 pro 64x (Windows 7 Dual boot)

#2
Posted 10/30/2018 12:05 AM   
I recently upgraded to an 8700K, and in every game I've tested, it seems my 1080 Ti bottlenecks before anything else. I don't think RAM timings are a major concern.

The 9900K seems like major overkill, and if you're not going to overclock it, there's no real benefit in choosing a K model anyway: the K signifies an unlocked multiplier, which only matters for overclocking.

#3
Posted 10/30/2018 01:24 PM   
I just upgraded from a 2600K with DDR3 1600MHz to an 8086K with DDR4 3200MHz, and I have to say it is a big improvement.
My games were quite stuttery even when the reported frame rate was good and stable, locked at 60. Now every game is buttery smooth.

On the DDR front, what I can confidently state is that nowadays dual-channel matters big-time. I saw this on my wife's laptop, which had serious frame drops below 60 FPS in BF1 and COD until I added the second DIMM. Regarding frequency, I saw a few YouTube videos where, in some games, there was a significant FPS difference from DRAM frequency alone. In the past this had zero impact, but I think some games are now getting close to the available bandwidth, and that is why there is a difference. I can imagine it is even worse in 3D than in 2D.
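
To illustrate the dual-channel point with rough numbers, a simple sketch of peak theoretical bandwidth (8 bytes per transfer per 64-bit channel; real-world figures will be lower):

[code]
# Peak theoretical DDR4 bandwidth = transfer rate (MT/s) * 8 bytes * number of channels.
def peak_bandwidth_gb_s(transfer_rate_mt_s, channels):
    return transfer_rate_mt_s * 1e6 * 8 * channels / 1e9

print(f"DDR4-2400 single channel: {peak_bandwidth_gb_s(2400, 1):.1f} GB/s")  # ~19.2 GB/s
print(f"DDR4-2400 dual channel:   {peak_bandwidth_gb_s(2400, 2):.1f} GB/s")  # ~38.4 GB/s
print(f"DDR4-3200 dual channel:   {peak_bandwidth_gb_s(3200, 2):.1f} GB/s")  # ~51.2 GB/s
[/code]

Adding the second DIMM doubles the theoretical ceiling, which fits what I saw on that laptop.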

Intel i7 8086K
Gigabyte GTX 1080Ti Aorus Extreme
DDR4 2x8gb 3200mhz Cl14
TV LG OLED65E6V
Windows 10 64bits

#4
Posted 10/30/2018 03:43 PM   
I will try to test this when I get my 9900K, as it was 505 USD compared to 450 USD for the 9700K.

If I have time I’ll also try it with my 7700k.

I have a 2133 dual-channel kit, a 3000 single stick, and a 4266 dual-channel kit.

In Final Fantasy XV it made a huge difference, but I've only tested 2133 versus 3000, and the 3000 was only a single stick, so it wasn't a good test and the timings couldn't be optimized. It resulted in roughly 1200 extra points at 1440p, but that wasn't a good test for the reasons stated above. That benchmark also kind of has problems.

I get CPU bottlenecks in FFXV in some locations, in The Witcher, and in Deus Ex: Mankind Divided in some Prague situations.

I'm ishiki, forum screwed up my name.

7700K @ 4.7GHz, 16GB DDR4 @ 3466MHz, 2080 Ti

#5
Posted 10/30/2018 07:53 PM   
I was getting bottlenecked in Kingdom Come: Deliverance, Wildlands (2D), Project CARS 2 (at night), Arma 3 at times, and Fallout 4 (because of a very CPU-intensive tree mod). Open-world games are getting much more CPU intensive, so I decided to go as big as possible, partly to future-proof and because a CPU bottleneck can hit 3D much worse than 2D. I put off finishing most of those games until I get the new build going, as the frame rates were so bad in 3D and the 1080 Ti was just bored waiting for something to do at any resolution (I currently have a 1080 Ti but will probably upgrade to a 2080 Ti, even though the pricing makes me nauseous and we can't even use the RTX features in 3D). Now that I am hooked on 3D, I can never get enough frames in AAA games or poorly optimized indies, so I will always be chasing raw power at higher resolutions...

#6
Posted 10/30/2018 10:01 PM   
Well, in 3D you are locked at 60 fps anyway. I wish Nvidia would revisit the tech and at least update it to current standards. But yeah, I run a Skyrim modded to the hilt at 1440p and it runs a smooth 60 fps until I turn on 3D, then the frame rate is almost cut in half (I can get it back up to 45-55 with an SLI hack). I assume a CPU/memory upgrade would alleviate this somewhat, but I don't have much of an issue in any of the games I play, including Kingdom Come, probably just due to running SLI. SLI is a pain in the ass in general, though, due to lack of support, so it's obviously hard to recommend.
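
The near-halving makes sense arithmetically, assuming 3D Vision roughly doubles the render work per displayed frame (one pass per eye) while the refresh stays at 60Hz; a tiny sketch of the frame-time budget:

[code]
# Rough frame-time budget, assuming 3D roughly doubles render work (one pass per eye).
refresh_hz = 60
budget_2d_ms = 1000 / refresh_hz    # ~16.7 ms to render one frame at 60 fps
budget_3d_ms = budget_2d_ms / 2     # ~8.3 ms per eye to still hold 60 fps
print(f"2D budget: {budget_2d_ms:.1f} ms per frame, 3D budget: {budget_3d_ms:.1f} ms per eye")
[/code]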

i7-4790K CPU 4.8Ghz stable overclock.
16 GB RAM Corsair
EVGA 1080TI SLI
Samsung SSD 840Pro
ASUS Z97-WS
3D Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)
Obutto R3volution.
Windows 10 pro 64x (Windows 7 Dual boot)

#7
Posted 10/30/2018 11:08 PM   
Yes, memory speed matters a huge deal, and always has. The results that show otherwise were GPU bottlenecked. You have to be careful where you get your info, as the majority of internet "reviewers" don't know what they are talking about; e.g. they test with bottlenecks elsewhere. Cringeworthy...

Here is a good video: https://www.youtube.com/watch?v=Er_Fuz54U0Y

Also, independent thread here:
https://www.overclock.net/forum/18051-memory/1487162-independent-study-does-speed-ram-directly-affect-fps-during-high-cpu-overhead-scenarios.html

All this is assuming you aren't running into a GPU bottleneck. If GPU is the bottleneck, then the fastest memory in the world at a trillion GHz won't make the blindest bit of difference...
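
A simple way to sanity-check which side is limiting you is to watch GPU utilization while a game runs; a rough sketch that polls nvidia-smi (assuming it is on your PATH), since sustained utilization well below ~95-99% usually points at a CPU or memory limit instead:

[code]
# Poll GPU utilization once a second via nvidia-smi while a game is running.
import subprocess
import time

for _ in range(10):
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    print(f"GPU utilization: {result.stdout.strip()}%")
    time.sleep(1)
[/code]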


Memory prices are due to drop significantly in 2019, so if you want to wait, that might be a good idea. (Actually, that is a half-truth: the US has stopped exporting to the Chinese company that was going to alleviate the problem, because it would compete with US companies, so prices might stay high for a while yet after all, until SK Hynix and Samsung get their new fabs on new processes up and running; and they are in no hurry to do that, with the memory market being as lucrative as it currently is for them.)

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#8
Posted 10/30/2018 11:40 PM   