How important is CPU performance for 3D Vision?
[quote="mistersvin21"][quote="Metal-O-Holic"]seems that 8700k is 1151 board, i though all these new are 2xxx [/quote] it doesnt rly matter since they are 3xx chipset boards compatible only. this is funny, isn't it ?[/quote] Thats true it kind of is. Though im not laughing :D Hopefully these get available any time soon.
mistersvin21 said:
Metal-O-Holic said: Seems that the 8700K is a socket 1151 part; I thought all these new ones were 2xxx.

It doesn't really matter, since they're only compatible with 3xx chipset boards. This is funny, isn't it?


That's true, it kind of is. Though I'm not laughing :D
Hopefully these become available soon.

CoreX9 Custom watercooling (Volkswagen Polo radiator)
I7-8700k@4.7
TitanX pascal with shitty stock cooler
Win7/10
Video: Passive 3D full HD, 3D@60Hz/channel, Denon X1200W / HC5 x 2, Geobox 501 -> eeColor boxes -> polarizers/Omega filters, custom made silver screen
Occupation: Entrepreneur. Painting/surfacing/construction
Interests/skills:
3D gaming, 3D movies, 3D printing, drums, bass and guitar.
Suomi - FINLAND - perkele

Posted 09/27/2017 04:14 AM   
There's a review for the 8700K already: https://videocardz.com/72915/first-review-of-intel-core-i7-8700k-leaks-out


The Witcher 3 (a game that likes using lots of cores) sees an 18.86% improvement compared to the 7700K, with both at 4.5GHz. However, people say this CPU runs hotter than the 7700K (which is already hot to begin with).

I don't know if the other games had a GPU bottleneck or didn't use more than 4 cores.
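
For anyone wanting to sanity-check that figure, here's a minimal sketch of the arithmetic with made-up framerates (the review's exact FPS numbers aren't reproduced here):

[code]
# Hypothetical FPS values for illustration only -- not taken from the review.
fps_7700k = 100.0    # baseline, both CPUs at 4.5GHz
fps_8700k = 118.86   # example value chosen to match the quoted percentage

improvement = (fps_8700k - fps_7700k) / fps_7700k * 100
print(f"{improvement:.2f}% faster")  # -> 18.86% faster
[/code]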

CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: MSI GeForce RTX 2080Ti Gaming X Trio
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com

Posted 09/27/2017 12:52 PM   
[quote="masterotaku"]However, people say this CPU runs hotter than the 7700K (which is already hot to begin with)..[/quote] That's bad, I was hoping that Intel may correct the heating issue, but ofcourse no. I was planning to buy an 8700k 4-5 months later, sigh...
masterotaku said: However, people say this CPU runs hotter than the 7700K (which is already hot to begin with).


That's bad. I was hoping Intel might fix the heat issue, but of course not. I was planning to buy an 8700K in 4-5 months, sigh...

Asus Deluxe Gen3, Core i7 2700k@4.5Ghz, GTX 1080Ti, 16 GB RAM, Win 7 64bit
Samsung Pro 250 GB SSD, 4 TB WD Black (games)
Benq XL2720Z

Posted 09/28/2017 08:36 AM   
I wouldn't worry about the heat at all. A good custom loop or AIO such as the Kraken x62 will keep those temps below 60 degrees C even at 5GHz. Just make sure to delid and use a good thermal paste such as liquid metal.

There is huge headroom: the 7700K doesn't begin to throttle until 100 degrees C, or degrade until 1.52V. Why worry when both temps and voltage are well below this? It's all within well-established operating limits. The 8700K shouldn't be much different.

Any degradation, if it's even possible within spec, won't eat into the useful life of the CPU, i.e. before you next upgrade; at which point a replacement CPU for a degraded one will be far cheaper.

IM humble O, there are more important things in life to worry about :)


Also, I should point out here again: please make sure to get fast memory. Memory speed actually matters a great deal nowadays!

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

Posted 09/28/2017 06:27 PM   
[quote="RAGEdemon"] There is huge headroom: The 7700K doesn't begin to throttle until 100 degrees C, or degrade until 1.52V. why worry when both temps and voltage are well below this? It's all within well established operating limits. 8700k shouldn't be much different. [/quote] The problem is when you want to do stability benchmarks. To do it properly, you need the CPU to not throttle due to temperature, but at the same time you know that any real game won't make it run near that hot. But if you can't test it, you don't know if your OC is stable. My Noctua NH-D14 is barely enough for OCCT at 4.9GHz (high 90s degrees C in the heaviest benchmark. Undelidded CPU, to be fair) and I've heard enough horror stories about liquid cooling to fear trying for now.
RAGEdemon said:
There is huge headroom: the 7700K doesn't begin to throttle until 100 degrees C, or degrade until 1.52V. Why worry when both temps and voltage are well below this? It's all within well-established operating limits. The 8700K shouldn't be much different.


The problem is when you want to run stability benchmarks. To do it properly, you need the CPU not to throttle due to temperature, even though you know no real game will make it run anywhere near that hot. But if you can't test it, you don't know whether your OC is stable. My Noctua NH-D14 is barely enough for OCCT at 4.9GHz (high 90s degrees C in the heaviest test, on an undelidded CPU, to be fair), and I've heard enough horror stories about liquid cooling to be afraid of trying it for now.

CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: MSI GeForce RTX 2080Ti Gaming X Trio
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com

Posted 09/28/2017 07:51 PM   
[quote="masterotaku"][quote="RAGEdemon"] There is huge headroom: The 7700K doesn't begin to throttle until 100 degrees C, or degrade until 1.52V. why worry when both temps and voltage are well below this? It's all within well established operating limits. 8700k shouldn't be much different. [/quote] To do it properly, you need the CPU to not throttle due to temperature, but at the same time you know that any real game won't make it run near that hot. But if you can't test it, you don't know if your OC is stable. [/quote] when the whole shit doesnt tip over you know you have a steady OC, why bother anything else ?
masterotaku said:
RAGEdemon said:
There is huge headroom: the 7700K doesn't begin to throttle until 100 degrees C, or degrade until 1.52V. Why worry when both temps and voltage are well below this? It's all within well-established operating limits. The 8700K shouldn't be much different.

To do it properly, you need the CPU not to throttle due to temperature, even though you know no real game will make it run anywhere near that hot. But if you can't test it, you don't know whether your OC is stable.


When the whole thing doesn't tip over, you know you have a stable OC. Why bother with anything else?

CoreX9 Custom watercooling (Volkswagen Polo radiator)
I7-8700k@4.7
TitanX pascal with shitty stock cooler
Win7/10
Video: Passive 3D full HD, 3D@60Hz/channel, Denon X1200W / HC5 x 2, Geobox 501 -> eeColor boxes -> polarizers/Omega filters, custom made silver screen
Occupation: Entrepreneur. Painting/surfacing/construction
Interests/skills:
3D gaming, 3D movies, 3D printing, drums, bass and guitar.
Suomi - FINLAND - perkele

Posted 09/28/2017 08:19 PM   
For what it's worth, masterotaku, the best AIOs such as the Kraken X62 use glycol (antifreeze) and distilled water. AFAIK, even if it leaks onto the electronics, it shouldn't do any damage, due to its high resistance and the CPU thermal throttle kicking in. Whether the high-resistance properties last a long time is another question no one has yet reliably answered :)

The NH-D14 is pretty much the best thing on air, IIRC. I really think not delidding is letting you down. I keep going on about it because I like seeing people get the best performance; the delid wasn't nearly as difficult as I thought it would be.

What was difficult, IMO, was spreading the liquid metal and getting it to stick to both surfaces. It isn't dangerous if you take sensible precautions, such as taping over the exposed CPU resistors, just very time consuming.

It's up to you whether you want to or not, mate. The extra OC and lower temps might not even be noticeable.

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

Posted 09/29/2017 12:39 AM   
Is the i5 8400 enough for 4K 3D Vision gaming?

Posted 10/10/2017 08:29 PM   
UP

Posted 04/24/2018 08:17 PM   
[quote="nv1d1a5ux"]UP[/quote] Why? What's your question? If you're asking if anything has changed, then no, nothing has changed. If you want to understand the issues, then read this thread and the answers I gave you in your other thread.
nv1d1a5ux said:UP


Why? What's your question? If you're asking if anything has changed, then no, nothing has changed.
If you want to understand the issues, then read this thread and the answers I gave you in your other thread.

Gigabyte RTX2080TI Gaming OC, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64, https://www.3dmark.com/fs/9529310

Posted 04/24/2018 08:47 PM   
Is this CPU issue still going on?

I'm wondering because in all my racing games (F1 2017, Assetto Corsa, Project Cars, rFactor 1, and Automobilista, formerly known as Game Stock Car / Game Stock Car Extreme), my 3D Vision performance is way below my 2D FPS.

When I double my 3D Vision framerates, they're probably only around 70% of my 2D framerates.

The only game with normal 3D Vision framerates is rFactor 2, where the doubled 3D framerate is right around 100% of the 2D framerate (and it has just about perfect SLI scaling in 3D Vision).

I have a 1080 Ti with an 8700K.

I'm using high levels of DSR in all games. Do you think the DSR is causing this issue? Again, 3D framerates relative to 2D are perfect in rFactor 2; it's just all the other games that seem to take a massive performance hit when 3D Vision is enabled.
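
To make that "70%" figure concrete, here's a minimal sketch of the efficiency calculation I mean, with made-up framerates (not my actual measurements):

[code]
# Hypothetical framerates for illustration only.
fps_2d = 120.0   # framerate with 3D Vision disabled
fps_3d = 42.0    # framerate counter reading with 3D Vision enabled

# Doubling the 3D framerate and comparing it to 2D shows how much of the
# 2D performance survives; 100% would mean 3D costs exactly twice the work.
efficiency = (fps_3d * 2) / fps_2d * 100
print(f"3D Vision efficiency: {efficiency:.0f}% of 2D")  # -> 70% of 2D
[/code]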

Posted 04/25/2018 05:59 AM   
DSR is not likely the cause. An easy way to tell is to load up anything that shows your GPU usage. If it's below 90% in 3D Vision, your bottleneck is NOT the GPU, i.e. it's your CPU/memory.
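
As one hypothetical way of logging that number (any GPU usage overlay such as MSI Afterburner, or Task Manager's GPU graph, works just as well), a minimal sketch assuming the nvidia-ml-py (pynvml) package is installed:

[code]
# Poll GPU utilisation once per second via NVML while the game is running.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

try:
    for _ in range(30):                      # sample for ~30 seconds
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu
        print(f"GPU usage: {util}%")         # well under ~90% in 3D Vision => CPU/memory bound
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
[/code]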

F1 2017, Assetto Corsa and Project Cars are all known to be heavily CPU-bound games, which is likely the issue.


Best solution at this stage:
OC your CPU to 5GHz and memory to ~3400MHz+ to achieve the best case scenario.

Alternatively use Compatibility Mode, which does not suffer from the CPU bottleneck.

You might also wish to play Project Cars in VR, as VR stereo rendering doesn't have this bottleneck either; it seems specific to the 3D Vision driver.

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

Posted 04/25/2018 06:29 AM   
[quote="rustyk21"][quote="nv1d1a5ux"]UP[/quote] Why? What's your question? If you're asking if anything has changed, then no, nothing has changed. If you want to understand the issues, then read this thread and the answers I gave you in your other thread.[/quote] This topic needs to be active.
rustyk21 said:
nv1d1a5ux said:UP


Why? What's your question? If you're asking if anything has changed, then no, nothing has changed.
If you want to understand the issues, then read this thread and the answers I gave you in your other thread.


This topic needs to be active.

Posted 04/25/2018 02:41 PM   
[quote="RAGEdemon"]DSR is not likely the cause. An easy way to tell is to load up anything which tells your GPU usage. If it's <90% usage in 3D Vision, your bottleneck is NOT the GPU, i.e. it's your CPU/Memory. F1 2017, Assetto Corsa, Project Cars are all known heavily CPU bound games, likely hence the issue. Best solution at this stage: OC your CPU to 5GHz and memory to ~3400MHz+ to achieve the best case scenario. Alternatively use Compatibility Mode, which does not suffer from the CPU bottleneck. You might also wish to play Project Cars in VR, as VR stereorisation does not have said bottleneck either - it seems specific to the 3D Vision driver. [/quote] Do ya think NV 3D Vision would benefit from quad channel memory?
RAGEdemon said: DSR is not likely the cause. An easy way to tell is to load up anything that shows your GPU usage. If it's below 90% in 3D Vision, your bottleneck is NOT the GPU, i.e. it's your CPU/memory.

F1 2017, Assetto Corsa and Project Cars are all known to be heavily CPU-bound games, which is likely the issue.


Best solution at this stage:
OC your CPU to 5GHz and memory to ~3400MHz+ to achieve the best case scenario.

Alternatively use Compatibility Mode, which does not suffer from the CPU bottleneck.

You might also wish to play Project Cars in VR, as VR stereo rendering doesn't have this bottleneck either; it seems specific to the 3D Vision driver.


Do you think NV 3D Vision would benefit from quad-channel memory?

Posted 04/25/2018 02:47 PM   
[quote="nv1d1a5ux"][quote="rustyk21"][quote="nv1d1a5ux"]UP[/quote] Why? What's your question? If you're asking if anything has changed, then no, nothing has changed. If you want to understand the issues, then read this thread and the answers I gave you in your other thread.[/quote] This topic needs to be active.[/quote] Try asking questions then, rather than just bumping posts. You'll get more help that way.
nv1d1a5ux said:
rustyk21 said:
nv1d1a5ux said:UP


Why? What's your question? If you're asking if anything has changed, then no, nothing has changed.
If you want to understand the issues, then read this thread and the answers I gave you in your other thread.


This topic needs to be active.


Try asking questions then, rather than just bumping posts. You'll get more help that way.

Gigabyte RTX2080TI Gaming OC, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64, https://www.3dmark.com/fs/9529310

Posted 04/25/2018 03:38 PM   