CPU upgrade: which one is the best cost/benefit for 3D Vision?
[quote="RAGEdemon"]i5 8600K for 3D gaming due to 3D vision bottleneck. IPC x Clock speed is King.
Make sure to also get fast memory (>3000MHz with decent CL), and OC as much as you can.[/quote]
Thanks for the reply. Let's see if there are any good promos this Black Friday here.
With my budget I could get another SSD or another 8GB of RAM for the price difference between the i5-8400 and the 8600K.
I don't think there will be much difference in VR, but yes, 3D Vision might require the OC'd processor :(
These are my options so far:
i5-8400
i5-8600k
MSI Z370 SLI PLUS
Corsair Vengeance LPX 8GB (2x4GB) 3200MHz DDR4 CL16 Black
Any recommendations? The 8600K will make me go over my budget :(
EVGA GTX 1070 FTW
Motherboard MSI Z370 SLI PLUS
Processor i5-8600K @ 4.2 | Cooler SilverStone AR02
Corsair Vengeance 8GB 3000MHz | Windows 10 Pro
SSD 240gb Kingston UV400 | 2x HDs 1TB RAID0 | 2x HD 2TB RAID1
TV LG Cinema 3D 49lb6200 | ACER EDID override | Oculus Rift CV1
Steam: http://steamcommunity.com/id/J0hnnieW4lker Screenshots: http://phereo.com/583b3a2f8884282d5d000007
You really want my advice?
I am going to say something and you will not like it.
Your free time is valuable - you work hard for it. Why settle for anything less than the relatively best you can have?
Forget the 8400. There isn't a big price difference between the 8600K and the 8700K, but there is a huge difference between HT and non-HT, which should give ~30% gains in the future. You have had the i7 860 for 10 years; you will likely keep the new CPU even longer. HT matters in games even nowadays, let alone in the future.
[img]http://core0.staticworld.net/images/article/2016/02/dx12_cpu_cinebench_r15_all_cores-100647717-orig.png[/img]
My advice to you:
-- Save up more money.
-- Buy 8700K + good OC motherboard + cooler, and OC as high as you can.
-- Buy 16GB of 3000+MHz memory with good CL (memory prices ought to drop next year). On my streamlined system, modern games + OS combined take up vastly more than 8GB; 8GB is not enough nowadays, and certainly not for the future.
Compared to what you are proposing to buy now:
-- You will get a significant improvement in performance going into the future.
-- Much better future-proofing: new-gen consoles over the next 10 years will have a huge number of cores, and most PC games are ports of console games designed for those core counts; you will need all the cores/virtual cores you can get.
-- For the next 10 years +, you will be glad you waited a few more months now.
Someone smarter than me once said something similar to this:
A late purchase is only late once, a bad purchase is bad forever.
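If you want to sanity-check the ">8GB" memory claim above on your own rig, here is a minimal sketch using the third-party psutil package (assuming you have it installed); run it with a game plus your usual background apps loaded:

```python
import psutil  # third-party: pip install psutil

# Snapshot of system-wide memory usage (games + OS + everything else combined).
vm = psutil.virtual_memory()
print(f"Total RAM : {vm.total / 2**30:5.1f} GiB")
print(f"In use    : {vm.used / 2**30:5.1f} GiB ({vm.percent:.0f}%)")
print(f"Available : {vm.available / 2**30:5.1f} GiB")
```

If "In use" regularly sits above ~7 GiB on an 8GB system, you are paging and 16GB is the sensible buy.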
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
The 8700K is way too expensive for me here. I live in Brazil, so it's not that easy to buy electronics here, unfortunately.
I used to live in Australia, and I'm sure if I were still there I would have done this upgrade by now, but here the reality is different.
The 8700K is way out of my league; the processor itself costs almost as much as the entire 8400/mobo/memory upgrade, and so far the extra cores are doing no good for gaming, so it's not really an option for me. I think the best thing to do now is get an i5 and upgrade if games start using the other cores in the future, which honestly I doubt, but the option for a later upgrade will still be there.
My decision is between the i5-8600K, which costs around $400 here now, and the i5-8400, which costs $260. The $150 difference is quite a bit of money here, and it would basically only be for playing 3D Vision games; otherwise the 8400 should be more than enough. The plan was to wait until next year to do the upgrade and get a better price on the 8600K, but now I'm having so many issues with the Oculus Rift that I cannot figure out. The sensors are not working properly and it is driving me insane. I kind of just want a fresh new computer with a decent VR-ready motherboard, so perhaps I won't have USB issues. I've got an Inateck USB 3.0 PCIe card but I'm having issues with it too, so I'm a bit lost, wondering if the Rift has a hardware issue. This is one reason I'm rushing to buy a new PC; I've just had enough. If the issues don't get fixed, then I'll know the Rift is the problem.
I spent hours trying to get it to work a few days ago when the issues started and suddenly got it working. I played for 3 days over the weekend, then decided to reboot the computer yesterday, as I had a feeling it wouldn't work again after the reboot, and indeed it stopped working... I am pissed :(
I wouldn't base a PC purchase decision on getting VR to work properly. Have you tried the rift with a friend's computer? The issue is probably with the Fresco powered USB3 card (I have the same card).
Google Oculus Tray Tool, and within it, enable the Fresco tweaks + Fresco power management tweaks. That has helped a lot of people.
[quote="RAGEdemon"]I wouldn't base a PC purchase decision on getting VR to work properly. Have you tried the rift with a friend's computer? The issue is probably with the Fresco powered USB3 card (I have the same card).
Google Oculus Tray Tool, and within it, enable the Fresco tweaks + Fresco power management tweaks. That has helped a lot of people.[/quote]
Yes, I wouldn't either, but this upgrade is due and I've been waiting for the Coffee Lake i5; this issue with the Rift, plus Black Friday this week, seems like a good time to go ahead with the upgrade.
I suspect it could be a hardware or firmware issue on my Rift, the sensors, or the Touch controllers. I have already spent so much time trying to figure it out and sending emails to Oculus support.
Everything was working fine for months here. I even had USB extension cables installed, and the sensors were running off the motherboard's USB 2.0 ports, because I'd had issues with the Inateck card and didn't want to mess around since they worked fine on the mobo's 2.0 ports. But suddenly the Touch controllers went offline and the sensors started showing "poor tracking" and "wireless sync timeout" errors. So I changed USB ports and after a while managed to get rid of the sensor errors, but the Touch controllers would not come back online, so I removed them and tried to pair them again. Then another problem appeared: they would not complete the pairing, getting stuck on the third step, "finalizing controller" (the first two steps, find and pair, succeed). After quite a few tries, and running the full setup again, it always got stuck. If I messed with the cables, the sensor errors would reappear; even without touching them, the error messages would sometimes appear with no pattern. I uninstalled the Oculus software and reinstalled it, and luckily it worked and the controllers paired again, but I had the same problems the next day. This time reinstalling didn't fix it and I couldn't get past the Touch pairing process. I went to a friend's house and the same thing happened: error messages on the sensors, and stuck during controller pairing. I got back from my friend's house, tried again, and it worked; the controllers paired and everything ran fine. I played From Other Suns over the weekend for 3 days without rebooting my PC, decided to reboot yesterday, and again the same issues. I'm just tired, really; I want to replace bloody everything just to make sure it's the Rift that is defective.
I tried again today, reinstalling everything. I don't get error messages with everything connected to the Inateck, but I cannot get the Touch controllers to pair, and once I try, error messages on the sensors appear again and there is nothing I can do to get rid of them; even leaving just a single sensor connected gives me error messages. So nothing really makes sense...
If you want to troubleshoot whether the Rift is defective, wouldn't it be better to swap it out and test it on another computer system?
Or you could borrow a Rift from your friend and test it on your system?
Replacing every major component in your PC to figure out if the Rift is defective is expensive.
[quote="J0hnnieW4ker"] I went to a friends house and the same thing happened, error messages on the sensors and stuck during the pairing of the contollers. [/quote]
This is the good test. You tried it on another computer, completely unrelated to yours. And it failed. That demonstrates that it is the Rift. Let Oculus Support know this, and be clear with them, don't add a lot of extra tests. This one test proves the Rift is faulty.
BTW, that Inateck board is bogus. Oculus recommended it for a long time, but it causes a lot of problems. Read the comments section: https://www.oculus.com/blog/oculus-roomscale-extra-equipment/
At every boot it resets power management, which sounds like it might be your problem.
[quote="RAGEdemon"]i5 8600K for 3D gaming due to 3D vision bottleneck. IPC x Clock speed is King.
Make sure to also get fast memory (>3000MHz with decent CL), and OC as much as you can.[/quote]
What do you think of the idea of using one generation back, i7-7700K?
The IPC does not seem to be different between Sky/Kaby/Coffee.
https://us.hardware.info/reviews/7602/22/intel-core-i7-8700k--i5-8600k--i5-8400-coffee-lake-review-affordable-six-cores-ipc-test-kaby-lake-vs-coffee-lake
I'm betting the differences people measure in 2D benchmarks are because of the larger cache on Coffee, not IPC. That still might make a measurable difference for 3D, but I'd really want to see it.
The reason I question this, is because it's much easier to get an OC with Kaby than Coffee. And even easier on Sky. I completely agree that IPC x Clock is what matters for our single threaded problems. If I can push the OC better, that seems like the only win to be had.
The theory then is also that fewer cores is easier to OC. There has been a lot of discussion that i5 is better because lowering the chip real estate that is active is better. So i5-6600K Sky in principle might be best.
This line of thought would discourage a move to Coffee, because 6 cores is more than we actually need, even in i5 trim, and makes OC weaker. Base frequency starts lower too. Base frequency matters because Windows has a uniquely awful process manager which keeps other cores hot all the time, so the automatic turbo tends to not be active at all. If turbo only works for fewer cores, it's not going to be used.
I took this approach with an older computer, going i5 instead of i7, and it did not work out. The OC was feeble anyway, and I think it also cost me some cache, which might very well be more important than we think.
I'm not building for 10 years, I think that is just a waste of money. There is no such thing as future-proofing, no one knows what the future will bring. Case in point- 5 years ago, "get as many cores as you can, it's all going to be multithreaded." Did not happen. Another example- bought SkyLake for the future, great chip, should last. Microsoft is putting a bullet into Win7 support next March. Did not see that coming. "FutureProofing" is simply rationalizing an expensive purchase, it's not a real thing. Especially for 3D Vision.
I need the best I can get for 3D right now, and it's not clear to me that 6 cores is that. The trouble here is that best for 3D is not necessarily the highest priced chip, it's so complicated I'm not sure where the sweet spot is.
From what I've seen in the OC wars, the best choice at the moment appears to be the i7-7700K (Kaby): best base clock, easier to OC (especially if you delid), best turbo if that works, only 1MB less cache.
i7-compare:
https://ark.intel.com/compare/88195,97129,126684
i5, then i7, all six:
https://ark.intel.com/compare/88191,97144,126685,88195,97129,126684
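The "IPC × Clock" rule being debated here can be put into a toy calculation. The clocks below are typical all-core OC figures from this thread, and the equal-IPC assumption follows the hardware.info review linked above; treat all numbers as illustrative, not measurements:

```python
# Toy single-thread estimate: score = relative IPC * sustained clock (GHz).
# With Sky/Kaby/Coffee IPC assumed equal (per the linked IPC test),
# the achievable overclock is the only term that moves the score.
chips = {
    "i5-6600K (Sky)":    {"ipc": 1.00, "oc_ghz": 4.8},
    "i7-7700K (Kaby)":   {"ipc": 1.00, "oc_ghz": 5.0},
    "i7-8700K (Coffee)": {"ipc": 1.00, "oc_ghz": 5.0},
}

for name, c in chips.items():
    score = c["ipc"] * c["oc_ghz"]
    print(f"{name:20s} relative single-thread score: {score:.2f}")
```

Which is exactly bo3b's point: if IPC is flat across generations, the chip that overclocks highest wins the 3D Vision bottleneck, regardless of core count.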
Hi bo3b, those are all great points! It's always exhilarating to meet people who have deep knowledge and whom you can learn from!
Your rationale is spot on all the way, and I can attest that I am very happy with my Kaby. I feel there are different perspectives, though, from people who use the computer for different things.
[quote="bo3b"][quote="RAGEdemon"]i5 8600K for 3D gaming due to 3D vision bottleneck. IPC x Clock speed is King.
Make sure to also get fast memory (>3000MHz with decent CL), and OC as much as you can.[/quote]
What do you think of the idea of using one generation back, i7-7700K?
The IPC does not seem to be different between Sky/Kaby/Coffee.
https://us.hardware.info/reviews/7602/22/intel-core-i7-8700k--i5-8600k--i5-8400-coffee-lake-review-affordable-six-cores-ipc-test-kaby-lake-vs-coffee-lake
I'm betting the differences people measure in 2D benchmarks are because of the larger cache on Coffee, not IPC. That still might make a measurable difference for 3D, but I'd really want to see it.[/quote]
Let's assume that the IPC is exactly the same.
[quote="bo3b"]
The reason I question this, is because it's much easier to get an OC with Kaby than Coffee. And even easier on Sky. I completely agree that IPC x Clock is what matters for our single threaded problems. If I can push the OC better, that seems like the only win to be had.[/quote]
I don't feel this is true: comparing them all at 4 cores, all OC'd, it's easier to get a higher OC on Kaby than on Sky, and easier still to get an even higher OC on Coffee, assuming you disable the 2 middle cores.
[quote="bo3b"]
The theory then is also that fewer cores is easier to OC. There has been a lot of discussion that i5 is better because lowering the chip real estate that is active is better. So i5-6600K Sky in principle might be best.[/quote]
Generally that is intuitively true. However, in the old days there was a trade-off between core count and higher OC (and also the interconnect used to link the cores). Back when games didn't make use of >4 cores, a 4-core i5 with a high OC would have been the best way to go; you are absolutely right.
It is different now:
a. Games now make good use of more than 4 logical cores.
b. What Coffee offers is something quite remarkable: as high an OC (5GHz+) on all 6 cores / 12 HT threads as an older-gen 4-core chip (Kaby), which just about managed to hit 5GHz on average. Sky, though, was mostly a ~4.8GHz affair.
c. One thing to remember is that the 8700K is special: while the other high-core-count chips today from AMD and Intel (Ryzen, Skylake-X) use a more scalable mesh-style interconnect that is great for high-thread-count applications but not good for ~4-thread applications, the 8700K is the first such CPU to use the low-latency ring interconnect found in Kaby. It might not perform as well in heavily threaded tasks as those other high-core-count CPUs, but unlike them it performs at least as well as a 4-core Kaby, clock for clock, on 4 cores.
[quote="bo3b"]
This line of thought would discourage a move to Coffee, because 6 cores is more than we actually need,
[/quote]
This is true only for 3D Vision at this present point in time, not on the whole. It is also up for investigation: Metaholic has ordered an 8700K system which he intends to clock to 5GHz, and we intend to compare the 7700K vs. the 8700K at 5GHz to see what kind of difference there is. We don't want to bias our results by going in with a strong opinion, but we are confident the 3D Vision bottleneck will likely show that 4 HT cores on the 7700K and 6 HT cores on the 8700K, all at 5GHz, are pretty equal in performance, not taking cache into account.
[quote="bo3b"]
...even in i5 trim, and makes OC weaker.[/quote]
As above, this is no longer necessarily true; with good cooling it can be quite the opposite when comparing generations.
[quote="bo3b"]
Base frequency starts lower too. Base frequency matters because Windows has a uniquely awful process manager which keeps other cores hot all the time, so the automatic turbo tends to not be active at all. If turbo only works for fewer cores, it's not going to be used.[/quote]
Automatic OC used to turbo only a single core or a small number of cores, so this might have been true back then for automatic OC scenarios.
With Coffee, Multi-Core Enhancement (MCE) is commonly enabled by motherboards out of the box, which automatically OCs all cores simultaneously. This was the main culprit that tended to make an 8700K look better than a 7700K at the same apparent frequency in workloads of 4 cores or fewer.
Nowadays, when we OC on Sky/Kaby/Coffee, we OC all cores and force the OC both in the BIOS and in Windows power management. This ensures that all cores run at the set frequency (say 5GHz) constantly, no matter what.
[quote="bo3b"]
I took this approach with an older computer, doing i5, instead of i7, and it did not work out. OC was feeble anyway, and I think it also cost me some cache, which might very well be more important that we think. [/quote]
The cache hypothesis is intriguing. Indeed, in the past Intel have shown lower-frequency CPUs outperforming same-generation higher-frequency CPUs thanks to more cache. Cache frequency doesn't seem to be as important, though: tests have shown that with a 5GHz core OC, varying the cache frequency from 4.2GHz to 5GHz made no tangible difference. Benchmarks from Metaholic/myself comparing a 4-core Kaby vs. a 4-core Coffee with 2 cores disabled (= more cache per remaining core) ought to make for interesting analysis.
[quote="bo3b"]
I'm not building for 10 years, I think that is just a waste of money. There is no such thing as future-proofing, no one knows what the future will bring. Case in point- 5 years ago, "get as many cores as you can, it's all going to be multithreaded." Did not happen. Another example- bought SkyLake for the future, great chip, should last. Microsoft is putting a bullet into Win7 support next March. Did not see that coming. "FutureProofing" is simply rationalizing an expensive purchase, it's not a real thing. Especially for 3D Vision.
[/quote]
I agree to an extent, but I would say there are trends that cannot be ignored: we have hit a 5GHz brick wall that will only get harder to push past until we start using something other than silicon. IPC × clock right now will likely be much the same IPC × clock 10 years from now, with maybe a 25% improvement; not much has changed over the past decade in these terms. The only way to raise performance is more cores. Indeed, this does not bode well for 3D Vision.
In this regard, future-proofing means getting a CPU with the best IPC × clock available right now while also getting as many cores as possible. It does not mean sacrificing IPC × clock for a higher core count.
[quote="bo3b"]
I need the best I can get for 3D right now, and it's not clear to me that 6 cores is that. The trouble here is that best for 3D is not necessarily the highest priced chip, it's so complicated I'm not sure where the sweet spot is.[/quote]
You've hit the nail on the head. Everything else being equal, the 7700K is the best right now for 3D gaming, though the 8700K is potentially better because it has more cache once 2 cores are disabled, and it should have increased OC potential.
That is without discounting the possibility that the 2 extra cores might actually help the 3D Vision bottleneck in games that make heavy use of >4 cores in 2D. Right now, a safe bet is that an 8700K @ 5GHz is at least as good for 3D Vision as a 7700K @ 5GHz, but it comes with 50% more cores and cache, and it actually works out cheaper per core:
7700K = £274 / 4 = £68.50 per core.
8700K = £394 / 6 = £65.67 per core.
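The per-core arithmetic is easy to double-check (prices as quoted above, physical cores from the spec sheets):

```python
# £ price and physical core count for each chip, as quoted in the post.
prices = {"7700K": (274, 4), "8700K": (394, 6)}

for chip, (gbp, cores) in prices.items():
    print(f"{chip}: £{gbp} / {cores} = £{gbp / cores:.2f} per core")
# 7700K: £274 / 4 = £68.50 per core
# 8700K: £394 / 6 = £65.67 per core
```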
My rationale, after a lot of research, is as follows:
1. Although 3D vision bottlenecked gaming is one of the main activities on my system, I do a lot of other things too: professional CAD/rendering, VR, CMode gaming, 2D gaming, encoding, etc. I believe that over the next 10 years, a power user would make decent use of the 2 extra cores.
2. Indeed the IPC is identical, but the all-core overclock seems on par with a Kaby 7700K. This is great in that, if one were to disable the middle 2 cores on the Coffee 8700K in the BIOS to compare apples to apples, you would get better performance due to:
a. a more mature process;
b. less concentrated heat, now that there is 2 cores' worth of separation and a much larger heat-spreading surface area;
c. more cache shared between fewer cores.
3. Silicon has a 5GHz barrier that it won't surpass by much any time soon. The only way forward, far more so than a decade ago, is more cores at a ~5GHz base. Already we see console-era games making great use of these cores: BF1, GTA5, Titanfall 2, Ashes of the Singularity; this, right here, is the future:
[img]https://images.techhive.com/images/article/2016/02/dx12_cpu_ashes_of_the_singularity_beta_2_average_cpu_frame_rate_high_quality_19x10-100647718-orig.png[/img]
4. The realities of future-proofing: it is natural to assume that more games will properly use DX12 and make far better use of more cores going forward. It is also a safe bet that next-gen consoles, being stuck at the 5GHz barrier, will use even more cores than the 8 they use currently. This means ports from them should make good use of as many cores as a PC CPU can spare, IPC × clock being as it currently stands, of course.
Consoles already use 8 cores (granted, individually weak ones), but the threading potential that Mantle/Vulkan and DX12 have opened up is very real and will absolutely be taken advantage of going forward. It would be unwise to invest in a CPU with fewer cores than that (i.e. an i5 wouldn't be a good decision for the future, speaking generally, not specifically about 3D Vision bottlenecked gaming).
5. The problem with the old thinking was that AMD's IPC and clocks were quite horrendous compared to Intel. They tried to substitute inferior engineering with higher core count because they had no choice. It was never their intention or plan to sacrifice IPC for a higher core count. The truth is that they couldn't get higher IPC due to their tiny R&D budget, so to stay alive and compete, they gave the next best marketable thing: copy-paste more cores.
The realities of research and development are that designing a higher IPC core was too expensive at the time - better just to copy-paste what they had come up with with their comparatively tiny R&D budget, and then let marketing tell people high core count was the future - Indeed it was, but not as a trade in for weak single core performance. AMD learned their lesson, one they knew all along - investing pretty much everything they had into Zen to be able to survive.
Also, indeed, the software to take advantage of high core count in gaming was never really there - only recently have we had mantle and DX12 break through some of the limits.
Indeed there is a good chance Intel will release 8 core CPUs with similar IPC and clocks to SkyL/Kaby/Coffee, and then 12 cores, and 16 cores within the next 10 years. Whether or not they will be able to compete with the kaby/coffee interlink which does an amazing job for lower threaded applications remains to be seen. Certainly, AMD is going to do exactly this... :)
Hi bo3b, those are all great points! It's always exhilarating to meet people who have deep knowledge and whom you can learn from!
Your rationale is spot on all the way, and I can attest that I am very happy with my kaby. I feel that there are different perspectives - from people who might use the computer for different things.
bo3b said:
RAGEdemon said:i5 8600K for 3D gaming due to 3D vision bottleneck. IPC x Clock speed is King.
Make sure to also get fast memory (>3000MHz with decent CL), and OC as much as you can.
What do you think of the idea of using one generation back, i7-7700K?
I'm betting the differences people measure in 2D benchmarks are because of the larger cache on Coffee, not IPC. That still might make a measurable difference for 3D, but I'd really want to see it.
Let's assume that the IPC is exactly the same.
bo3b said:
The reason I question this, is because it's much easier to get an OC with Kaby than Coffee. And even easier on Sky. I completely agree that IPC x Clock is what matters for our single threaded problems. If I can push the OC better, that seems like the only win to be had.
I don't feel this is true - comparing them all at 4 cores, all OC'd, it's easier to get a higher OC on Kaby, and easier still on Coffee, assuming you disable the 2 middle cores.
bo3b said:
The theory then is also that fewer cores is easier to OC. There has been a lot of discussion that i5 is better because lowering the chip real estate that is active is better. So i5-6600K Sky in principle might be best.
Generally that is intuitively true. However, in the old days there was a trade-off between number of cores and higher OC (and also the interlinking mesh used to connect the cores). Back when games didn't make use of >4 cores, a 4-core i5 with a high OC would have been the best way to go - you are absolutely right.
It is different now:
a. games make good use of more than 4 logical cores.
b. What Coffee offers is something quite remarkable - as high an OC (5GHz+) on all 6 cores / 12 HT threads as an older-gen 4-core Kaby, which just about managed to hit 5GHz on average. SkyLake, though, was mostly a ~4.8GHz affair.
c. One thing to remember is that the 8700K is special: while the other high-core-count chips from AMD and Intel (Ryzen, SkyLake-X) use a more scalable mesh/fabric interconnect that is great for high-thread-count applications but not so good for ~4-thread ones, the 8700K is the first high-core-count CPU to keep the interconnect found in Kaby, which is highly optimised for lower thread counts. It might not perform as well in heavily threaded tasks as those other CPUs, but clock for clock on 4 cores it performs at least as well as a 4-core Kaby, unlike them.
bo3b said:
This line of thought would discourage a move to Coffee, because 6 cores is more than we actually need,
This is true only for 3D Vision, at this present point in time, but not on the whole. It is also up for investigation - Metaholic has ordered an 8700K system which he intends to clock to 5GHz, and we intend to compare a 7700K vs an 8700K at 5GHz to see what kind of difference there is. We don't want to bias our results by going in with a strong opinion, but we are confident the 3D Vision bottleneck will likely show that 4 HT cores on the 7700K vs 6 HT cores on the 8700K, all @5GHz, are pretty equal in performance, not taking cache into account.
bo3b said:
...even in i5 trim, and makes OC weaker.
As above, this is no longer necessarily true - quite the opposite, in fact, when comparing generations with good cooling.
bo3b said:
Base frequency starts lower too. Base frequency matters because Windows has a uniquely awful process manager which keeps other cores hot all the time, so the automatic turbo tends to not be active at all. If turbo only works for fewer cores, it's not going to be used.
Automatic turbo used to OC only a single core or a small number of cores, so this might have been true back then for stock/auto scenarios.
With Coffee, most motherboards enable Multi-Core Enhancement (MCE) by default, which automatically OCs all cores simultaneously. This was the main culprit that tended to make an 8700K look faster than a 7700K at the same apparent frequency in workloads of 4 cores or fewer.
Nowadays, when we OC on SkyL/kaby/coffee, we OC all cores, and force the OC both in the BIOS and Windows power management. This ensures that all cores are at the set frequency (say 5GHz) constantly, no matter what.
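On the Windows power management side, here is a hedged sketch of pinning the processor frequency floor with powercfg (these are the built-in power-setting aliases; run from an elevated prompt, and verify against your own power scheme before relying on it):

```shell
:: Force minimum and maximum processor state to 100% on AC power,
:: so cores never clock down below the fixed all-core OC frequency.
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMIN 100
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMAX 100
:: Re-apply the modified scheme so the changes take effect.
powercfg /setactive SCHEME_CURRENT
```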
bo3b said:
I took this approach with an older computer, doing i5 instead of i7, and it did not work out. OC was feeble anyway, and I think it also cost me some cache, which might very well be more important than we think.
The cache hypothesis is intriguing. Indeed, in the past, lower-frequency Intel CPUs have outperformed same-generation higher-frequency ones thanks to more cache. Cache frequency doesn't seem to be as important, though - tests have shown that with a 5GHz core OC, varying the cache frequency from 4.2GHz to 5GHz made no tangible difference. Benchmarks from Metaholic/myself comparing a 4-core Kaby vs a 4-core Coffee with 2 cores disabled (= more cache per remaining core) ought to make for interesting analysis.
bo3b said:
I'm not building for 10 years, I think that is just a waste of money. There is no such thing as future-proofing, no one knows what the future will bring. Case in point- 5 years ago, "get as many cores as you can, it's all going to be multithreaded." Did not happen. Another example- bought SkyLake for the future, great chip, should last. Microsoft is putting a bullet into Win7 support next March. Did not see that coming. "FutureProofing" is simply rationalizing an expensive purchase, it's not a real thing. Especially for 3D Vision.
I agree to an extent, but I would say that there are trends that cannot be ignored - we have hit a 5GHz brick wall that will only become harder to push past until we start using something other than silicon. The IPC x clock we have right now will likely be roughly the same 10 years from now, give or take maybe a 25% improvement; not much has changed over the past decade in these terms. The only way to raise performance is more cores. Indeed, this does not bode well for 3D Vision.
In this regard, future-proofing means getting a CPU with the best IPC x clock right now, while also getting as many cores as you can. It does not mean sacrificing IPC x clock for a higher core count.
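To illustrate why IPC x clock still dominates once clocks hit a wall, here is a hedged sketch of Amdahl's law; the function name and the serial/parallel fractions are my own illustrative assumptions, not measurements of any game:

```python
def amdahl_speedup(cores: int, parallel_fraction: float) -> float:
    """Upper bound on speedup from `cores` cores when only
    `parallel_fraction` of the workload can run in parallel."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A heavily single-threaded workload (e.g. a driver-bound renderer)
# barely benefits from extra cores...
print(round(amdahl_speedup(6, 0.3), 2))   # 1.33x on 6 cores
# ...while a well-threaded DX12-style workload scales far better.
print(round(amdahl_speedup(6, 0.9), 2))   # 4.0x on 6 cores
```

The takeaway matches the argument above: extra cores only pay off to the extent that the software threads well, which is exactly what the DX12/console trend is changing.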
bo3b said:
I need the best I can get for 3D right now, and it's not clear to me that 6 cores is that. The trouble here is that best for 3D is not necessarily the highest priced chip, it's so complicated I'm not sure where the sweet spot is.
You've hit the nail on the head. Everything else being equal, the 7700K is the best right now for 3D gaming, though the 8700K will potentially be better because it has more cache once 2 cores are disabled, plus increased OC potential.
This is not discounting the potential that the 2 extra cores might actually make a difference to the 3D Vision bottleneck in games which make heavy use of >4 cores in 2D. Right now, a safe bet is that an 8700K @ 5GHz is at least as good for 3D Vision as a 7700K @ 5GHz, but comes with 50% more cores and cache, and actually works out cheaper than the 7700K per core:
7700k = £274/4= £68.50 per core.
8700k = £394/6 = £65.67 per core.
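The per-core arithmetic above as a quick sketch, using the prices quoted in this post:

```python
# (price in GBP, physical cores) for the two chips being compared
prices = {"7700K": (274, 4), "8700K": (394, 6)}

for chip, (price, cores) in prices.items():
    print(f"{chip}: £{price / cores:.2f} per core")
# 7700K: £68.50 per core
# 8700K: £65.67 per core
```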
My rationale, after a lot of research, is as follows:
1. Although 3D vision bottlenecked gaming is one of the main activities on my system, I do a lot of other things too: professional CAD/rendering, VR, CMode gaming, 2D gaming, encoding, etc. I believe that over the next 10 years, a power user would make decent use of the 2 extra cores.
2. Indeed the IPC is identical, but Coffee's all-core overclock seems on par with a Kaby 7700K's. This is great in that if one were to disable the middle 2 cores on the Coffee 8700K in the BIOS to compare apples-to-apples, you would have better performance due to:
a. more mature process
b. less concentrated heat, now that there are 2 cores of separation and a much larger heatsink surface area
c. more cache shared between fewer cores.
3. Silicon has a 5GHz barrier that it won't surpass much any time soon. The only way forward, far more so than a decade ago, is more cores @5GHz base. Already we see next-gen consoles make great use of these cores - BF1, GTA5, Titanfall 2, Ashes of the Singularity; this right here, is the future:
[img]https://images.techhive.com/images/article/2016/02/dx12_cpu_ashes_of_the_singularity_beta_2_average_cpu_frame_rate_high_quality_19x10-100647718-orig.png[/img]
4. The realities of future-proofing: it is natural to assume that more games will properly use DX12 and make far better use of more cores going into the future. It is also a safe bet that next-gen consoles, being stuck at the 5GHz barrier, will likely use even more cores than the 8 they currently use. This means that ports from them should make good use of as many cores as a PC CPU can spare, IPC x clock being as it currently stands, of course.
Already consoles use 8 cores (granted, they are individually weak) - but the threading potential that Mantle/Vulkan and DX12 have presented is very real and will absolutely be taken advantage of going forward. It would be unwise to invest in a CPU with fewer cores than these (i.e. an i5 wouldn't be a good decision for the future generally speaking, not specifically for 3D Vision bottlenecked gaming).
5. The problem with the old thinking was that AMD's IPC and clocks were quite horrendous compared to Intel. They tried to substitute inferior engineering with higher core count because they had no choice. It was never their intention or plan to sacrifice IPC for a higher core count. The truth is that they couldn't get higher IPC due to their tiny R&D budget, so to stay alive and compete, they gave the next best marketable thing: copy-paste more cores.
The realities of research and development are that designing a higher-IPC core was too expensive at the time - better to just copy-paste what they had come up with on their comparatively tiny R&D budget, and then let marketing tell people high core counts were the future. Indeed they were, but not as a trade-off for weak single-core performance. AMD learned their lesson, one they knew all along, investing pretty much everything they had into Zen to be able to survive.
Also, indeed, the software to take advantage of high core counts in gaming was never really there - only recently have Mantle and DX12 broken through some of the limits.
Indeed there is a good chance Intel will release 8-core CPUs with similar IPC and clocks to Sky/Kaby/Coffee, and then 12 cores, and 16 cores, within the next 10 years. Whether or not they will be able to compete with the Kaby/Coffee interconnect, which does an amazing job for lower-threaded applications, remains to be seen. Certainly, AMD is going to do exactly this... :)
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
My question would be: with my GTX 1070 and an i5 8400, will I get 50 to 60fps at 1080p in 3D for games like, for example, Battlefield 1 and Quantum Break? With my current PC I get 30fps in these games. I would be happy with 50 to 60fps. I am not buying a PC for 10 years, so I just want it to be able to run recent 3D Vision games above 45fps - let's say Hellblade, once it is fixed.
I will have to spend quite a bit of money on other things right now; I also just have to get the Rift replaced, which will be a pain from here and will cost money, and I need a UPS (no-break) too. So I don't think I can afford an 8600K right now, as I have just purchased another Rift through a friend who will bring it from the US. Can't live without VR now :P
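As a rough rule of thumb (an assumption for illustration, not a benchmark): 3D Vision renders every frame twice, so a crude ceiling estimate is half your 2D frame rate, with the per-frame time budget shrinking accordingly:

```python
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available per frame at a given target frame rate."""
    return 1000.0 / target_fps

def rough_3d_ceiling(fps_2d: float) -> float:
    """Crude estimate: 3D Vision draws each frame twice,
    so 3D fps tops out around half the 2D fps."""
    return fps_2d / 2.0

print(round(frame_budget_ms(60), 2))   # 16.67 ms per frame at 60 fps
print(rough_3d_ceiling(100))           # 50.0 fps in 3D from 100 fps in 2D
```

In practice the driver overhead is not a clean 2x, which is exactly why people here benchmark rather than extrapolate.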
EVGA GTX 1070 FTW
Motherboard MSI Z370 SLI PLUS
Processor i5-8600K @ 4.2 | Cooler SilverStone AR02
Corsair Vengeance 8GB 3000Mhz | Windows 10 Pro
SSD 240gb Kingston UV400 | 2x HDs 1TB RAID0 | 2x HD 2TB RAID1
TV LG Cinema 3D 49lb6200 | ACER EDID override | Oculus Rift CV1
Steam: http://steamcommunity.com/id/J0hnnieW4lker Screenshots: http://phereo.com/583b3a2f8884282d5d000007
[quote="bo3b"][quote="J0hnnieW4ker"] I went to a friends house and the same thing happened, error messages on the sensors and stuck during the pairing of the contollers. [/quote]
This is the good test. You tried it on another computer, completely unrelated to yours. And it failed. That demonstrates that it is the Rift. Let Oculus Support know this, and be clear with them, don't add a lot of extra tests. This one test proves the Rift is faulty.
BTW, that Inatek board is bogus. Oculus recommended it for a long time, but it causes a lot of problems. Read the comments section: https://www.oculus.com/blog/oculus-roomscale-extra-equipment/
At every boot it resets power management, which sounds like it might be your problem.[/quote]
Unfortunately, I don't have any friends here that have a Rift. I could probably count on my fingers the number of people that have a Rift here in South America. Oculus support believes the issue is my Touch controllers, but I am not so sure about that. Anyway, I purchased another one through a friend, and hopefully I can find the defective component by swapping them.
Oculus support won't send a replacement to my country, so I still need to figure out how I would get it replaced and avoid tax fees when the replacement returns - if I use a different address in the US to receive it and then have it sent to me, I might have to pay the import taxes, which are huuuuuge here.
Nobody knows if recent games run at 50 to 60fps at 1080p in 3D with a 1070 and an i5 8400?
Anybody running recent 3D Vision games with a processor at a 4GHz clock speed? Do we need extreme overclocking like 5GHz to run 3D Vision games properly?
[quote="J0hnnieW4ker"]Nobody knows if recent games run at 50 to 60fps 1080p in 3D with a 1070 and a i5 8400?
Anybody running recent 3D Vision games with a processor with 4ghz clock speed? Do we need extreme overclocking like 5GHz to run 3D Vision games properly?[/quote]
Depends on the game.....Witcher 3 and GTAV probably not.
Older games like Metro Last Light, Shadow Warrior ran great on my old GTX 780/i5 2500k combo.
Asus Maximus X Hero Z370
MSI Gaming X 1080Ti (2100 mhz OC Watercooled)
8700k (4.7ghz OC Watercooled)
16gb DDR4 3000 Ram
500GB SAMSUNG 860 EVO SERIES SSD M.2
[quote="lou4612"][quote="J0hnnieW4ker"]Nobody knows if recent games run at 50 to 60fps 1080p in 3D with a 1070 and a i5 8400?
Anybody running recent 3D Vision games with a processor with 4ghz clock speed? Do we need extreme overclocking like 5GHz to run 3D Vision games properly?[/quote]
Depends on the game.....Witcher 3 and GTAV probably not.
Older games like Metro Last Light, Shadow Warrior ran great on my old GTX 780/i5 2500k combo.
[/quote]
Yeah, I'm more interested to know about new games like the list below, and whether the bottleneck would be the processor in those cases. I think an i5 8400 will not bottleneck these games even with the 3D Vision CPU issues; the bottleneck here would be the 1070, I think.
Mass Effect Andromeda
Battlefield 1
Middle Earth: Shadow of Mordor
Quantum Break
Ryse - Son of Rome
The Witcher 3
The Evil Within 2
The Evil Within
Anyway, ended up buying an i5 8600K :D
Well, I can't provide numbers at the moment, but I have had great 3D experiences with Middle Earth and The Witcher 3 on my old overclocked 6-core, which I expect the i5 8400 would easily beat. I can provide detailed info on 4 of those games if wanted, but that's my anecdotal experience.
Win7/10 64-bit, Xeon x5650 @ 4.5GHz (Corsair H105), 18GB 1666MHz DDR3 RAM, EVGA GTX 1080 Ti Hybrid SC2, ASUS P6X58D-E MB, Sound Blaster Z, Crucial MX300 SSD (OS), WD Black HDD (Games), EVGA G2 850W PS, Benq 2150ST Projector, ASUS VG278H + ROG SWIFT PG278Q
Thanks for the reply. Let's see if there will be any good promos this Black Friday here.
With my budget I could get another SSD or another 8GB of RAM for the price difference between the i5 8400 and the 8600K.
I don't think there will be much difference in VR, but yes, 3D Vision might require the OC of the processor :(
These are my options so far:
i5-8400
i5-8600k
MSI Z370 SLI PLUS
Corsair Vengeance LPX 8GB (2x4GB) 3200MHz DDR4 CL16 Black
Any recommendations? The 8600K will make me go over my budget :(
I am going to say something and you will not like it.
Your free time is valuable - you work hard for it. Why settle for anything less than the relatively best you can have?
Forget the 8400. There isn't a big price difference between the 8600K and the 8700K. There is a huge difference between HT and non-HT though, which can give ~30% gains in the future. You have had the i7 860 for 10 years; you will likely keep the new CPU even longer. HT does matter in games even nowadays, let alone in the future.
My advice to you:
-- Save up more money.
-- Buy 8700K + good OC motherboard + cooler, and OC as high as you can.
-- Buy 16GB of 3000+MHz memory with good CL (memory prices ought to drop next year). On my streamlined system, modern games + OS combined take up vastly more than 8GB - 8GB is not enough nowadays, and certainly not for the future.
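On the memory point, here is a hedged way to sanity-check combined usage yourself; it assumes a Linux-style /proc/meminfo (on Windows you would just read the numbers off Task Manager instead):

```python
def read_meminfo(path: str = "/proc/meminfo") -> dict:
    """Parse /proc/meminfo into a dict of values in kB."""
    info = {}
    with open(path) as f:
        for line in f:
            key, rest = line.split(":", 1)
            info[key] = int(rest.strip().split()[0])  # first field is the kB value
    return info

mem = read_meminfo()
used_gb = (mem["MemTotal"] - mem["MemAvailable"]) / 1024 / 1024
total_gb = mem["MemTotal"] / 1024 / 1024
print(f"In use right now: {used_gb:.1f} GB of {total_gb:.1f} GB")
```

Run it (or check Task Manager) with a modern game loaded; if the in-use figure is already near 8GB, the 16GB advice above speaks for itself.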
Compared to what you are proposing to buy now:
-- You will get a significant improvement in performance going into the future.
-- Much better future proofing - new gen consoles over the next 10 years will have a huge amount of cores, most PC games are ports of those console games designed for those core counts; you will need all the cores/virtual cores you can get.
-- For the next 10 years +, you will be glad you waited a few more months now.
Someone smarter than me once said something similar to this:
A late purchase is only late once, a bad purchase is bad forever.
I used to live in Australia, and I am sure if I were still there I would have done this upgrade by now, but here the reality is different.
The 8700K is way out of my league - the processor alone costs almost as much as the entire 8400/mobo/memory upgrade, and the extra cores so far do no good for gaming, so it's not really an option for me. I think the best thing to do now is get an i5 and upgrade if games start using the other cores in the future, which honestly I doubt, but the option will still be there for a later upgrade.
My decision here is between the i5 8600K, which costs around $400 here now, and the i5 8400, which costs $260. The $150 difference is quite a bit of money here, and it would basically only go towards playing 3D Vision games; otherwise, the 8400 should be more than enough. The plan was to wait until next year and get a better price on the 8600K, but now I am having so many issues with the Oculus Rift that I cannot figure out. The sensors are not working properly and it is driving me insane; I kind of just want a fresh new computer with a decent VR-ready motherboard, so perhaps I won't have USB issues. I've got an Inateck USB 3.0 PCIe card but am having issues with it too, so I am a bit lost, wondering whether the Rift has a hardware issue. This is one reason I am rushing to buy a new PC - I've just had enough. If the issues don't get fixed, then I'll know the Rift is the problem.
I spent hours trying to get it to work a few days ago when the issues started and suddenly got it working, played for 3 days over the weekend, then decided to reboot the computer yesterday. I had a feeling it was not going to work again after the reboot, and indeed it stopped working... I am pissed :(
Google Oculus Tray Tool, and within it, enable the Fresco tweaks + Fresco power management tweaks. That has helped a lot of people.
Yes, I wouldn't normally either, but this upgrade is due and I've been waiting for the Coffee Lake i5; this issue with the Rift and Black Friday this week make it seem like a good time to go ahead with the upgrade.
I suspect it could be a hardware or firmware issue on my Rift, sensors, or Touch controllers. I have spent so much time already trying to figure it out and sending emails to Oculus support.
Everything was working fine for months here. I even had USB extension cables installed, and the sensors were running off the motherboard's USB 2.0 ports, because I'd had issues with the Inateck card and did not want to mess around since they worked fine on the mobo's 2.0 ports. But suddenly the Touch controllers went offline and the sensors started showing "poor tracking - wireless sync timed out" errors. So I changed the USB ports, and after a while managed to get rid of the sensor errors, but the Touch controllers would not come back online, so I removed them and tried to pair them again. Then another problem appeared: they would not complete the pairing, getting stuck at the third step, "finalizing controller" (the first two steps, find and pair, succeed). After trying quite a few times and running the full setup again, it always got stuck. If I messed with the cables, the sensor errors would appear again - or even without touching them at all, the error messages would sometimes appear with no pattern. I uninstalled the Oculus software and reinstalled it, and luckily it worked and paired the controllers again, but I had the same problems the next day. This time reinstalling would not fix it, and I could not get past the Touch pairing process. I went to a friend's house and the same thing happened: error messages on the sensors, and stuck during the pairing of the controllers. I got back from my friend's house, tried again, and it worked - got the controllers paired and everything working fine, played From Other Suns over the weekend for 3 days without rebooting my PC, decided to reboot yesterday, and again the same issues. Just tired, really; I want to replace bloody everything just to make sure it is the Rift that is defective.
I tried again today to reinstall everything, and I don't get error messages with everything connected to the Inateck, but I cannot get the Touch to pair, and once I try, error messages on the sensors appear again and there is nothing I can do to get rid of them - even leaving just a single sensor connected gives error messages. So nothing really makes sense...
Or you could borrow a Rift from your friend and test it on your system?
Replacing every major component in your PC to figure out if the Rift is defective is expensive.
8700K 5.0Ghz OC (Silicon Lottery Edition)
Noctua NH-15 cooler
Asus Maximus X Hero
16 GB Corsair Vengeance LPX RAM DDR4 3000
1TB Samsung PM961 OEM M.2 NVMe
MSI Gaming X Trio 1080Ti SLI
Corsair 1000RMi PSU
Cougar Conquer Case
Triple Screens Acer Predator 3D Vision XB272
3D Vision 2 Glasses
Win 10 Pro x64
This is the good test. You tried it on another computer, completely unrelated to yours. And it failed. That demonstrates that it is the Rift. Let Oculus Support know this, and be clear with them, don't add a lot of extra tests. This one test proves the Rift is faulty.
BTW, that Inatek board is bogus. Oculus recommended it for a long time, but it causes a lot of problems. Read the comments section: https://www.oculus.com/blog/oculus-roomscale-extra-equipment/
At every boot it resets power management, which sounds like it might be your problem.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
What do you think of the idea of using one generation back, i7-7700K?
The IPC does not seem to be different between Sky/Kaby/Coffee.
https://us.hardware.info/reviews/7602/22/intel-core-i7-8700k--i5-8600k--i5-8400-coffee-lake-review-affordable-six-cores-ipc-test-kaby-lake-vs-coffee-lake
I'm betting the differences people measure in 2D benchmarks are because of the larger cache on Coffee, not IPC. That still might make a measurable difference for 3D, but I'd really want to see it.
The reason I question this is that it's much easier to get an OC on Kaby than Coffee, and even easier on Sky. I completely agree that IPC x Clock is what matters for our single-threaded problems. If I can push the OC higher, that seems like the only win to be had.
The theory then is also that fewer cores are easier to OC. There has been a lot of discussion that i5 is better because having less active chip real estate helps. So an i5-6600K Sky might, in principle, be best.
This line of thought would discourage a move to Coffee, because 6 cores is more than we actually need, even in i5 trim, and makes the OC weaker. Base frequency starts lower too. Base frequency matters because Windows has a uniquely awful process manager which keeps other cores hot all the time, so the automatic turbo tends not to be active at all. If turbo only kicks in with fewer cores loaded, it's not going to be used.
I took this approach with an older computer, choosing an i5 instead of an i7, and it did not work out. The OC was feeble anyway, and I think it also cost me some cache, which might very well be more important than we think.
I'm not building for 10 years; I think that is just a waste of money. There is no such thing as future-proofing - no one knows what the future will bring. Case in point: 5 years ago it was "get as many cores as you can, it's all going to be multithreaded." Did not happen. Another example: I bought SkyLake for the future - great chip, should last - and now Microsoft is putting a bullet into Win7 support next March. Did not see that coming. "Future-proofing" is simply rationalizing an expensive purchase; it's not a real thing. Especially for 3D Vision.
I need the best I can get for 3D right now, and it's not clear to me that 6 cores is it. The trouble is that the best chip for 3D is not necessarily the highest-priced one; it's complicated enough that I'm not sure where the sweet spot is.
From what I've seen in the OC wars, the best choice at the moment appears to be the i7-7700K (Kaby): best base clock, easier to OC (especially if you de-lid), best turbo if that works, 1MB less cache.
i7-compare:
https://ark.intel.com/compare/88195,97129,126684
i5, then i7, all six:
https://ark.intel.com/compare/88191,97144,126685,88195,97129,126684
Your rationale is spot on all the way, and I can attest that I am very happy with my Kaby. That said, people who use their computers for different things will have different perspectives.
Let's assume that the IPC is exactly the same.
I don't feel this is true - comparing them all at 4 cores, all OC'd, it's easier to get a higher OC on Kaby, and easier still on Coffee, assuming you disable the 2 middle cores.
Generally that is intuitively true. However, in the old days there was a trade-off between core count and achievable OC (and also in the interconnect used to link the cores). Back when games didn't make use of more than 4 cores, a 4-core i5 with a high OC would have been the best way to go - you are absolutely right.
It is different now:
a. games make good use of more than 4 logical cores.
b. What Coffee offers is quite remarkable: as high an OC (5GHz+) across all 6 cores / 12 threads as an older-gen 4-core Kaby, which on average just about managed to hit 5GHz. SkyLake, though, was mostly a ~4.8GHz affair.
c. One thing to remember is that the 8700K is special. While the other high-core-count chips today from AMD and Intel (Ryzen, SkyLake-X) use a more scalable interconnect that is great for high thread counts but not for ~4-thread workloads, the 8700K is the first high-core-count CPU to keep the low-latency interconnect optimised for lower thread counts - exactly the one found in Kaby. It might not match those other CPUs in heavily threaded tasks, but, unlike them, it performs at least as well as a 4-core Kaby, clock for clock, on 4 cores.
This is true only for 3D Vision at this point in time, not on the whole. It is also up for investigation - Metaholic has ordered an 8700K system which he intends to clock to 5GHz, and we intend to compare the 7700K vs the 8700K at 5GHz to see what difference there is. We don't want to bias the results by going in with a strong opinion, but we expect the 3D Vision bottleneck to show that 4C/8T on the 7700K and 6C/12T on the 8700K, all at 5GHz, perform roughly equally, cache not taken into account.
As above, this is no longer necessarily true - quite the opposite, in fact, when comparing generations with good cooling.
This might have been true back when automatic OC boosted only a single core, or a small number of cores, as turbo. With Coffee, Intel has introduced Multi-Core Enhancement (MCE), which automatically OCs all cores simultaneously. MCE was the main reason an 8700K tended to look faster than a 7700K at the same apparent frequency in workloads of 4 cores or fewer.
Nowadays, when we OC on SkyL/Kaby/Coffee, we OC all cores and force the OC in both the BIOS and Windows power management. This ensures that all cores sit at the set frequency (say 5GHz) constantly, no matter what.
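For the Windows power-management half of this (the BIOS half is board-specific), one way is to pin the processor's minimum and maximum state to 100% from an elevated command prompt. A sketch, assuming a standard Windows 10 install with the stock powercfg aliases:

```shell
:: Pin Windows processor power management to 100% so cores never downclock.
:: Run from an elevated (Administrator) command prompt.
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMIN 100
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMAX 100
powercfg /setactive SCHEME_CURRENT
```

Equivalent to setting "Minimum/Maximum processor state" to 100% in the advanced power-plan options; it just makes the intent explicit and scriptable.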
The cache hypothesis is intriguing. Indeed, in the past Intel has shown lower-frequency CPUs outperforming same-generation higher-frequency CPUs thanks to more cache. Cache frequency doesn't seem to be as important, though - tests have shown that with a 5GHz core OC, varying the cache frequency from 4.2GHz to 5GHz made no tangible difference. Benchmarks from Metaholic and myself comparing a 4-core Kaby vs a 4-core Coffee with 2 cores disabled (= more cache per remaining core) ought to make for interesting analysis.
I agree to an extent, but there are trends that cannot be ignored - we have hit a 5GHz brick wall that will only get harder to push past until we start using something other than silicon. The IPC x clock we have right now will likely be much the same 10 years from now, give or take maybe a 25% improvement; not much has changed over the past decade in these terms. The only way to add performance is more cores. Indeed, this does not bode well for 3D Vision.
In this regard, future-proofing means getting a CPU with the best IPC x clock available right now, while also getting as many cores as possible. It does not mean sacrificing IPC x clock for a higher core count.
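To make the "IPC x clock" yardstick concrete, here is a toy model (the IPC numbers are illustrative placeholders, not measurements): with identical IPC, single-thread performance scales purely with clock, and the ~25% IPC gain mooted for the next decade moves the needle by exactly that factor.

```python
def single_thread_perf(ipc, clock_ghz):
    """Relative single-thread throughput: perf is proportional to IPC x clock."""
    return ipc * clock_ghz

# Hypothetical chips with identical IPC, both at a 5GHz all-core OC:
kaby = single_thread_perf(ipc=1.00, clock_ghz=5.0)
coffee = single_thread_perf(ipc=1.00, clock_ghz=5.0)
print(coffee / kaby)    # 1.0 -- same IPC, same clock, same single-thread speed

# A hypothetical 25% IPC uplift a decade from now, still stuck at 5GHz:
future = single_thread_perf(ipc=1.25, clock_ghz=5.0)
print(future / kaby)    # 1.25 -- the whole gain has to come from IPC
```

The point of the sketch: once the clock is pinned at the silicon wall, core count is the only remaining axis, which is exactly the argument above.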
You've hit the nail on the head. Everything else being equal, the 7700K is the best right now for 3D gaming, though the 8700K will potentially be better: more cache once 2 cores are disabled, and increased OC potential.
This is not discounting the possibility that the 2 extra cores might actually help with the 3D Vision bottleneck in games that make heavy use of >4 cores in 2D. Right now, a safe bet is that an 8700K @ 5GHz is at least as good for 3D Vision as a 7700K @ 5GHz, but comes with 50% more cores and cache, and actually works out cheaper per core than the 7700K:
7700K = £274 / 4 = £68.50 per core.
8700K = £394 / 6 = £65.67 per core.
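For what it's worth, the per-core arithmetic at the quoted prices works out like this (a throwaway sketch):

```python
# Price per core at the UK prices quoted above.
chips = {"7700K": (274.0, 4), "8700K": (394.0, 6)}
for name, (price_gbp, cores) in chips.items():
    print(f"{name}: {price_gbp / cores:.2f} GBP per core")
# 7700K: 68.50 GBP per core
# 8700K: 65.67 GBP per core
```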
My rationale, after a lot of research, is as follows:
1. Although 3D vision bottlenecked gaming is one of the main activities on my system, I do a lot of other things too: professional CAD/rendering, VR, CMode gaming, 2D gaming, encoding, etc. I believe that over the next 10 years, a power user would make decent use of the 2 extra cores.
2. Indeed the IPC is identical, but the all-core overclock seems on par with a Kaby 7700K. Better still, if one were to disable the middle 2 cores on the Coffee 8700K in the BIOS for an apples-to-apples comparison, you would expect better performance due to:
a. a more mature process;
b. less concentrated heat, with 2 cores' worth of separation and a much larger heatsink surface area;
c. more cache shared between fewer cores.
3. Silicon has a 5GHz barrier that it won't surpass by much any time soon. The only way forward, far more so than a decade ago, is more cores at a ~5GHz base. Already we see console-generation titles make great use of these cores - BF1, GTA5, Titanfall 2, Ashes of the Singularity. This, right here, is the future.
4. The realities of future-proofing: it is natural to assume that more games will properly use DX12 and make far better use of more cores going forward. It is also a safe bet that next-gen consoles, stuck at the 5GHz barrier, will use even more cores than the 8 they have now. That means ports from them should make good use of as many cores as a PC CPU can spare, IPC x clock being what it currently is, of course.
Already consoles use 8 cores (granted, individually weak ones) - and the threading potential that Mantle/Vulkan and DX12 have opened up is very real and will absolutely be taken advantage of going forward. It would be unwise to invest in a CPU with fewer cores than that (i.e. an i5 wouldn't be a good decision for the future generally speaking, 3D Vision bottlenecked gaming aside).
5. The problem with the old thinking was that AMD's IPC and clocks were quite horrendous compared to Intel's. They substituted higher core counts for inferior engineering because they had no choice; it was never their plan to sacrifice IPC for core count. The truth is that they couldn't reach higher IPC on their tiny R&D budget, so to stay alive and compete they offered the next best marketable thing: copy-pasted cores.
The reality of R&D is that designing a higher-IPC core was too expensive at the time - better to copy-paste what they already had and let marketing tell people that high core counts were the future. Indeed they were, but not as a trade for weak single-core performance. AMD learned the lesson they knew all along, investing pretty much everything they had into Zen in order to survive.
Also, the software to take advantage of high core counts in gaming was never really there - only recently have Mantle and DX12 broken through some of the limits.
Indeed, there is a good chance Intel will release 8-core CPUs with similar IPC and clocks to SkyL/Kaby/Coffee, then 12 cores, then 16 cores within the next 10 years. Whether those will be able to match the Kaby/Coffee interconnect, which does an amazing job for lower-threaded applications, remains to be seen. Certainly, AMD is going to do exactly this... :)
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
I will have to spend quite a bit of money on other things right now, and I also have to get the Rift replaced, which will be a pain from here and will cost money; I need a UPS ("nobreak") too. So I don't think I can afford an 8600K right now, as I have just purchased another Rift through a friend who will bring it from the US. Can't live without VR now :P
Unfortunately, I don't have any friends here with a Rift. I could probably count on my fingers the number of people who own one here in South America. Oculus Support believes the issue is my Touch controllers, but I'm not so sure. Anyway, I purchased another one through a friend, and hopefully I can find the defective component by swapping them.
Oculus Support won't ship a replacement to my country, so I still need to figure out how to get it replaced while avoiding import taxes on the way back: if I use a US address to receive it and then have it forwarded to me, I might have to pay the import fees, which are huuuuuge here.
Is anybody running recent 3D Vision games on a processor at a ~4GHz clock speed? Do we need an extreme overclock like 5GHz to run 3D Vision games properly?
Depends on the game... Witcher 3 and GTAV, probably not.
Older games like Metro Last Light and Shadow Warrior ran great on my old GTX 780 / i5-2500K combo.
Gaming Rig 1
i7 5820K 3.3ghz (Stock Clock)
GTX 1080 Founders Edition (Stock Clock)
16GB DDR4 2400 RAM
512 SAMSUNG 840 PRO
Gaming Rig 2
My new build
Asus Maximus X Hero Z370
MSI Gaming X 1080Ti (2100 mhz OC Watercooled)
8700k (4.7ghz OC Watercooled)
16gb DDR4 3000 Ram
500GB SAMSUNG 860 EVO SERIES SSD M.2
Yeah, I'm more interested in newer games like the list below, and whether the processor would be the bottleneck in those cases. I think an i5-8400 won't bottleneck these games even with 3D Vision's CPU overhead; the bottleneck here would be the 1070, I think.
Mass Effect Andromeda
Battlefield 1
Middle Earth: Shadow of Mordor
Quantum Break
Ryse - Son of Rome
The Witcher 3
The Evil Within 2
The Evil Within
Anyway, I ended up buying an i5-8600K :D
Win7/10 64-bit, Xeon x5650 @ 4.5GHz (Corsair H105), 18GB 1666MHz DDR3 RAM, EVGA GTX 1080 Ti Hybrid SC2, ASUS P6X58D-E MB, Sound Blaster Z, Crucial MX300 SSD (OS), WD Black HDD (Games), EVGA G2 850W PS, Benq 2150ST Projector, ASUS VG278H + ROG SWIFT PG278Q