Hardware Upgrade strictly for 3D Vision Gaming
clammy said:Ok guys I'm going

6700k
1080 GTX
Still haven't decided on which 1440p 3D Vision capable monitor



Good Luck with your upgrade and enjoy

Gigabyte Z370 Gaming 7
32GB RAM
i9-9900K
Gigabyte Aorus Extreme Gaming 2080 Ti (single)
Game Blaster Z
Windows 10 x64 build #17763.195
Define R6 Blackout case
Corsair H110i GTX
SanDisk 1TB (OS)
SanDisk 2TB SSD (Games)
Seagate EXOs 8 and 12 TB drives
Samsung UN46C7000 HD TV
Samsung UN55HU9000 UHD TV
Currently using ACER PASSIVE EDID override on 3D TVs
LG 55

#31
Posted 08/30/2016 02:27 PM   
Good luck to you mate. That will net you ~40% performance increase in non-GPU limited games :)

A word of caution though - newer games such as The Witcher 3 and Deus Ex: Mankind Divided, as well as all triple-A titles from now on, will likely be severely GPU limited in 3D Vision at 1440p.

Further, the vast majority of future games will likely have a DX12 path option, which means they will utilise the CPU far better.

This means that you will likely not see any CPU gain going forward compared to your 2500K @ 4.5GHz.

IMHO, you will find that spending your 6700K + motherboard + RAM money on a second 1080 instead will double your performance going forward.

To give you an example, I am playing through The Witcher 3 with HairWorks off and foliage distance down a notch, at a resolution very close to 1440p. I get a solid 60FPS everywhere except in towns, where it drops to ~54fps. Here, both my overclocked 1080s are maxed out @ >98% usage (GPU bottleneck). The CPU hovers at around 25%, however. Disabling 3D Vision pushes the FPS to 115fps, where the GPUs are working at ~60% but the CPU is now maxing out at 33% (CPU bottleneck).

This shows that with my 5-year-old 6-core Xeon @ 4.4GHz, with practically identical IPC to your 2500K @ 4.5GHz, at a very similar resolution to yours, and even with overclocked 1080s in SLI, I am GPU bottlenecked in the most demanding PC game out right now, and the CPU still has [(33-25)/25 x 100] = 32% spare headroom! (This means that with even more powerful GPUs, I can increase game performance a further 32% before the CPU will bottleneck.)
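The headroom arithmetic above can be sketched as a quick check. This is purely illustrative; the utilisation figures are the ones quoted in this post:

```python
def cpu_headroom(cpu_busy_pct: float, cpu_limit_pct: float) -> float:
    """Return the extra performance (as a fraction) available before the
    CPU becomes the bottleneck, given current CPU usage while GPU-bound
    and the usage observed once the CPU is the limiting factor."""
    return (cpu_limit_pct - cpu_busy_pct) / cpu_busy_pct

# Figures from the post: 25% CPU usage while GPU-bound in 3D,
# 33% CPU usage when the CPU becomes the bottleneck in 2D.
headroom = cpu_headroom(25, 33)
print(f"{headroom:.0%} spare CPU headroom")  # prints "32% spare CPU headroom"
```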

Summing up: with a 6700K you will indeed get a 40% performance increase in 3D GTA5 and all the old, crappy two-thread-limited games such as AC3/4, FO4, ARMA, etc. If you mostly play older games, then this is indeed a great choice.

Going forward, however, the 6700K will likely not give you any performance increase over a 2500K @ 4.5GHz, whereas spending the money on another 1080 will quite literally double your performance going forward.

Here is some further corroborating evidence from an independent site:
http://www.techspot.com/review/1235-deus-ex-mankind-divided-benchmarks/page5.html
Note how there is no real performance gain with a 6700K over the 2500K at stock, even at as low a resolution as 1080p in 2D. Also note that overclocking the 6700K has no benefit. You will be playing 1440p in 3D, which effectively doubles the 1440p rendering load, making for an extreme GPU bottleneck.
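A rough way to see why 1440p in 3D is so GPU-heavy is to compare raw pixel throughput. This is a simplified sketch (real cost also depends on shading and geometry, not just pixel count):

```python
def pixels_per_second(width: int, height: int, fps: int, eyes: int = 1) -> int:
    """Approximate pixel throughput; in stereo 3D each eye renders a full frame."""
    return width * height * fps * eyes

flat_1080p = pixels_per_second(1920, 1080, 60)
stereo_1440p = pixels_per_second(2560, 1440, 60, eyes=2)
ratio = stereo_1440p / flat_1080p
print(f"3D 1440p pushes ~{ratio:.1f}x the pixels of 2D 1080p")  # ~3.6x
```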

Please take what I have to say with a pinch of salt - there are no absolutes when predicting the future, but I believe that on the balance of probability, going forward, these observations will likely be correct.

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#32
Posted 08/31/2016 02:28 AM   
After my initial 2c on this, and reading through the thread, I am with RAGEdemon on this. I would say you would be better off considering going SLI than forking out for a motherboard, CPU, and memory.

You'll see a much larger performance benefit at this point.

That said, honestly, SLI can be a pain in the ass, but it's worth it when it works nicely.

i7-4790K CPU 4.8Ghz stable overclock.
16 GB RAM Corsair
EVGA 1080TI SLI
Samsung SSD 840Pro
ASUS Z97-WS
3D Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)
Obutto R3volution.
Windows 10 pro 64x (Windows 7 Dual boot)

#33
Posted 08/31/2016 03:54 AM   
clammy said:Ok guys I'm going

6700k
1080 GTX
Still haven't decided on which 1440p 3D Vision capable monitor

Not to give you a hard time, but that's a strange conclusion after you said that GTA5 was your prime metric, and two people on this thread (ishiki and zig11727) already have done the upgrade here and got no improvement.

Would be worth overclocking your current RAM first, as an experiment. Based on what I read, that matters far more here than CPU or GPU.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#34
Posted 08/31/2016 08:03 AM   
I would recommend getting a motherboard with 3 GPU slots.

That way, if you decide to SLI you can.

Also, for games using Nvidia's PhysX/Flex, you can pick up extra frames by having a dedicated PhysX GPU. How much of a gain depends on the GPUs being used, but it can really help in certain games. Plus, with the low heat generation and low power usage of these newer GPUs, it's not like it's a detriment to have one in your PC.

#35
Posted 08/31/2016 04:46 PM   
So was the issue here ever actually ascertained? https://forums.geforce.com/default/topic/832496/3d-vision/3d-vision-cpu-core-bottleneck/1/


I see bo3b now mentions memory bandwidth being an issue as well, is that separate from the 3-core limitation?

#36
Posted 08/31/2016 06:09 PM   
bo3b thinks that it's not due to the 3D Vision driver but more to do with the way 3D Vision in GTA5 has been implemented. He reckons that Automatic Stereo3D in other games shouldn't have this limit.

Someone forced GTA5's Stereo3D to off via a wrapper in the Witcher 3 thread to test this hypothesis, but the problem still remained. I am not knowledgeable about coding enough to ascertain if this was a valid experiment however :)


#37
Posted 08/31/2016 08:40 PM   
D-Man11 said:I would recommend getting a motherboard with 3 GPU slots.

That way, if you decide to SLI you can.

Also, for games using Nvidia's PhysX/Flex, you can pick up extra frames by having a dedicated PhysX GPU. How much of a gain depends on the GPUs being used, but it can really help in certain games. Plus, with the low heat generation and low power usage of these newer GPUs, it's not like it's a detriment to have one in your PC.


Here's a French buddy from the great Hardware.fr community who made a benchmark about this for us:

https://www.youtube.com/watch?v=hbMLhDJ43JU&feature=youtu.be

#38
Posted 08/31/2016 09:14 PM   
RAGEdemon said:bo3b thinks that it's not due to the 3D Vision driver but more to do with the way 3D Vision in GTA5 has been implemented. He reckons that Automatic Stereo3D in other games shouldn't have this limit.

Someone forced GTA5's Stereo3D to off via a wrapper in the Witcher 3 thread to test this hypothesis, but the problem still remained. I am not knowledgeable about coding enough to ascertain if this was a valid experiment however :)

Ah, thanks for the explanation. If that's the case, it wouldn't really be a problem worth throwing money at. It reminds me of the chase for performance in FSX, where people would have ridiculous CPU OCs and massive amounts of RAM and still get pitiful fps; then Prepar3D and subsequently the Steam Edition came out, and finally people were getting appropriate numbers. Since improvements in single-thread CPU performance have slowed down so much, it would take a long while for the problem to be addressed with brute force rather than by fixing the back end; unfortunately, the latter might mean never here. Too bad, GTA V's world needs a proper 3D/VR treatment.

#39
Posted 08/31/2016 09:30 PM   
aeliusg said:
RAGEdemon said:bo3b thinks that it's not due to the 3D Vision driver but more to do with the way 3D Vision in GTA5 has been implemented. He reckons that Automatic Stereo3D in other games shouldn't have this limit.

Someone forced GTA5's Stereo3D to off via a wrapper in the Witcher 3 thread to test this hypothesis, but the problem still remained. I am not knowledgeable about coding enough to ascertain if this was a valid experiment however :)
Ah, thanks for the explanation. If that's the case, it wouldn't really be a problem worth throwing money at. It reminds me of the chase for performance in FSX, where people would have ridiculous CPU OCs and massive amounts of RAM and still get pitiful fps; then Prepar3D and subsequently the Steam Edition came out, and finally people were getting appropriate numbers. Since improvements in single-thread CPU performance have slowed down so much, it would take a long while for the problem to be addressed with brute force rather than by fixing the back end; unfortunately, the latter might mean never here. Too bad, GTA V's world needs a proper 3D/VR treatment.

I think that experiment of forcing 3D via 3DMigoto was probably valid, but our conclusion was off. It didn't help the 3D performance, which suggested that it was a 3 core limit in the driver. Since then, that seems less likely because almost no other games have this problem.

That leads me to research that suggests it might be related to CPU L3 cache size, and/or memory bandwidth. If GTA5 were tightly tuned for a 6M L3 cache for example, and 3D blows that cache, this is the sort of effect you would see.
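The cache-size hypothesis could be probed with a crude micro-benchmark: time sequential sweeps over working sets on either side of the L3 size. This is only an illustrative sketch in Python (interpreter overhead masks much of the effect; a C version would show it far more clearly), and the 2 MB / 32 MB sizes are arbitrary examples on either side of a 6 MB L3, not anyone's measured figures:

```python
import time
import array

def per_element_time(n_bytes: int, passes: int = 2) -> float:
    """Time sequential passes over a buffer of n_bytes; once the working
    set exceeds the L3 cache, per-element time should rise as reads
    start hitting DRAM instead of cache."""
    buf = array.array("Q", range(n_bytes // 8))  # 8-byte elements
    t0 = time.perf_counter()
    total = 0
    for _ in range(passes):
        for v in buf:
            total += v  # touch every element so the sweep isn't optimised away
    elapsed = time.perf_counter() - t0
    return elapsed / (passes * len(buf))  # seconds per element

small = per_element_time(2 * 1024 * 1024)   # 2 MB: should stay cache-resident
large = per_element_time(32 * 1024 * 1024)  # 32 MB: spills past a 6 MB L3
print(f"per-element time changed by {large / small:.2f}x")
```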

GTA5 is nearly flawless in 3D already. You can't get 60 fps, but 50 fps is doable with SLI (about 35% scaling). It also has a much, much better feel than most games at 50 fps, primarily because of its smooth minimum frame rate. It doesn't jerk up and down all the time, it's smooth. If you are looking for better than this on GTA5, you will be disappointed, and I'm saying it's easily worth playing today.
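The ~35% SLI scaling figure quoted above implies roughly the following (illustrative arithmetic; the ~37 fps single-GPU baseline is a hypothetical value chosen to match the 50 fps mentioned):

```python
def sli_fps(single_gpu_fps: float, scaling: float = 0.35) -> float:
    """Estimated fps with a second identical GPU at the given SLI scaling factor."""
    return single_gpu_fps * (1 + scaling)

# Hypothetical ~37 fps single-GPU baseline at ~35% scaling.
print(f"~{sli_fps(37):.0f} fps in SLI")  # ~50 fps
```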


#40
Posted 08/31/2016 10:57 PM   
I was running a dedicated PhysX card in this build previously with a 780. Ultimately I removed it because of the lack of benefit I was seeing, along with the heat in this particular motherboard, which would also step one of my cards down to 8x with 3 cards.


#41
Posted 08/31/2016 11:17 PM   
bo3b said:
aeliusg said:
RAGEdemon said:bo3b thinks that it's not due to the 3D Vision driver but more to do with the way 3D Vision in GTA5 has been implemented. He reckons that Automatic Stereo3D in other games shouldn't have this limit.

Someone forced GTA5's Stereo3D to off via a wrapper in the Witcher 3 thread to test this hypothesis, but the problem still remained. I am not knowledgeable about coding enough to ascertain if this was a valid experiment however :)
Ah, thanks for the explanation. If that's the case, it wouldn't really be a problem worth throwing money at. It reminds me of the chase for performance in FSX, where people would have ridiculous CPU OCs and massive amounts of RAM and still get pitiful fps; then Prepar3D and subsequently the Steam Edition came out, and finally people were getting appropriate numbers. Since improvements in single-thread CPU performance have slowed down so much, it would take a long while for the problem to be addressed with brute force rather than by fixing the back end; unfortunately, the latter might mean never here. Too bad, GTA V's world needs a proper 3D/VR treatment.

I think that experiment of forcing 3D via 3DMigoto was probably valid, but our conclusion was off. It didn't help the 3D performance, which suggested that it was a 3 core limit in the driver. Since then, that seems less likely because almost no other games have this problem.

That leads me to research that suggests it might be related to CPU L3 cache size, and/or memory bandwidth. If GTA5 were tightly tuned for a 6M L3 cache for example, and 3D blows that cache, this is the sort of effect you would see.

GTA5 is nearly flawless in 3D already. You can't get 60 fps, but 50 fps is doable with SLI (about 35% scaling). It also has a much, much better feel than most games at 50 fps, primarily because of its smooth minimum frame rate. It doesn't jerk up and down all the time, it's smooth. If you are looking for better than this on GTA5, you will be disappointed, and I'm saying it's easily worth playing today.

Very interesting. Thanks for the insight, bo3b.

#42
Posted 09/01/2016 04:19 PM   
Yes, very. If you can tell me what to look for, bo3b, I can perhaps do some experiments, as I have 12MB of L3 cache. Interestingly, my original benchmarks were also with 12MB L3.


#43
Posted 09/01/2016 07:01 PM   
Since I just did an upgrade, I can provide my observations. In Rise of the Tomb Raider, in the Geothermal Valley, just in the first village, I was getting down to 30fps with my old computer: 3770K@4500, GTX 980. This is with official side-by-side 3D at 1080p resolution (with Helifax's version it was even a few fps less, down to 27 or so).

Then I upgraded just the video card, to a Titan X Pascal. Now I was getting around 40fps at the same location (and around 30 with Helifax's mod).

Then I decided to change the CPU/motherboard/RAM as well. And now with a 6700K@4600 and DDR4@3200 with the Titan, I'm getting a stable 60fps at the same spot (DX11 or DX12, official SBS 3D). However, very strangely, with Helifax's mod I'm still getting fps dips to 35fps at this place, so probably something wrong is going on with 3D Vision here.

So my conclusion: the upgrade from the 3770K to the 6700K was very much worth it for official 3D support in Tomb Raider. If it were 3D Vision Automatic, not so much.

Update: in The Witcher 3, moving from the 3770K to the 6700K also solved low-fps issues in large open locations, from around 40 to a stable 60fps.
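For what it's worth, the gains reported above work out as follows (simple arithmetic on the fps figures quoted in this post):

```python
def gain(before_fps: float, after_fps: float) -> float:
    """Fractional fps improvement between two measurements."""
    return (after_fps - before_fps) / before_fps

# Rise of the Tomb Raider, Geothermal Valley, official SBS 3D:
gpu_only = gain(30, 40)  # GTX 980 -> Titan X Pascal, same 3770K
cpu_too = gain(40, 60)   # then 3770K -> 6700K @ 4.6GHz with DDR4-3200
print(f"GPU upgrade: +{gpu_only:.0%}, CPU/RAM upgrade: +{cpu_too:.0%}")  # +33%, +50%
```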

#44
Posted 09/04/2016 12:02 PM   
bo3b said:
clammy said:Ok guys I'm going

6700k
1080 GTX
Still haven't decided on which 1440p 3D Vision capable monitor

Not to give you a hard time, but that's a strange conclusion after you said that GTA5 was your prime metric, and two people on this thread (ishiki and zig11727) already have done the upgrade here and got no improvement.

Would be worth overclocking your current RAM first, as an experiment. Based on what I read, that matters far more here than CPU or GPU.


GTA 5 in 3D isn't my prime metric anymore.
I'm buying the new hardware to now do the following.

Heavily mod GTA 5 in 2D with a ton of textures and ENB graphics, like this:
https://www.youtube.com/watch?v=e5TJ9Z_7fTk
Keep my DK2 after all and run it with the latest Oculus runtime to use that debug tool to increase the pixel density.
http://www.roadtovr.com/improve-oculus-rift-game-image-quality-using-this-tool-oculus-debug-tool/

Run Planetside 2 with high settings.

Probably a bunch of other titles to mess with and heavily mod.

Gaming Rig 1

i7 5820K 3.3ghz (Stock Clock)
GTX 1080 Founders Edition (Stock Clock)
16GB DDR4 2400 RAM
512 SAMSUNG 840 PRO

Gaming Rig 2
My new build

Asus Maximus X Hero Z370
MSI Gaming X 1080Ti (2100 mhz OC Watercooled)
8700k (4.7ghz OC Watercooled)
16gb DDR4 3000 Ram
500GB SAMSUNG 860 EVO SERIES SSD M.2

#45
Posted 09/04/2016 08:43 PM   