3D Vision CPU Bottleneck: Gathering Information thread
Duerf said:Nvidia should think about implementing an "extra" piece of hardware on the graphics cards (something similar to what they did with PhysX) to make 3D Vision work in the same league as 2D (in terms of framerates). Otherwise 3D is always falling behind (far behind) and always creating framerate issues, especially since it is not even compatible with G-Sync tech.

Nvidia, forget audio features and that kind of sh*t that nobody uses, and spend some effort on helping 3D rendering. Make the framerate exactly the same as 2D. The graphics card only needs another little brain to think about that extra mirrored picture.


Why, so they can charge us again for what should have been working correctly in the first place?
No they need to fix the core issues with current products.
A full 3D Vision setup isn't anywhere near cheap.

GPU+Monitor+Emitter

Gaming Rig 1

i7 5820K 3.3ghz (Stock Clock)
GTX 1080 Founders Edition (Stock Clock)
16GB DDR4 2400 RAM
512GB SAMSUNG 840 PRO

Gaming Rig 2
My new build

Asus Maximus X Hero Z370
MSI Gaming X 1080Ti (2100 mhz OC Watercooled)
8700k (4.7ghz OC Watercooled)
16gb DDR4 3000 Ram
500GB SAMSUNG 860 EVO SERIES SSD M.2

Posted 08/23/2017 05:58 PM   
I don't think Nvidia is going to charge less if a utopian "new 3D Vision chip" is not implemented. Nvidia is always going to charge you as much as they can anyway. The production cost of your graphics card is nothing compared to what you pay for it, and if Nvidia increased the price because of the inclusion of a "3D Vision chip" that would be another silly excuse; they should not do that, because most people don't play using 3D Vision. It is just a way to make their product better, and maybe a way to build a good 3D Vision community once and for all.

My only desire is to balance fps between 3D and 2D. I don't just want a powerful graphics card, I want a graphics card capable of running every recent game, and games are designed to be played within 2D fps limits, so you are always going to have fps issues if you play a demanding game in 3D.

Most of the people that do not accept 3D after testing it reject it because of the low framerates and choppy gameplay (I think... that is what happens to me with a lot of games if I cannot reach a minimum number of frames per second).

Creating a driver several years ago with a 50% performance loss and doing nothing to improve performance (neither via software nor hardware) is something I do not understand, if Nvidia really wants to offer something good in their absurdly expensive product. Do it or don't do it, but if you do it, do it well, and do not make a mess of it.

- Windows 7 64-bit (SSD OCZ-Vertex2 128GB)
- "ASUS P6X58D-E" motherboard
- "MSI GTX 660 TI"
- "Intel Xeon X5670" @4000MHz CPU (20.0[12-25]x200MHz)
- RAM 16 Gb DDR3 1600
- "Dell S2716DG" monitor (2560x1440 @144Hz)
- "Corsair Carbide 600C" case
- Labrador dog (cinnamon edition)

Posted 08/23/2017 08:24 PM   
DJ-RK said:
Adam J said:Has anyone been able to find any sort of workaround for GTA V in 3D Vision yet? I have been trying to get this game working for the last few months on 2 different systems and can't ever get to an FPS that is even playable most of the time. I have read through a lot of this thread, but not all of it; is there actually a fix out there, or coming, that will allow 3D Vision to not be using only 3 cores?

Current system I am testing

6700K @ 4.6 GHz
16GB - 3000 DDR4
EVGA FTW GTX 1080
ASUS VG278HE (3D Vision 144Hz)

Any update or workaround would be appreciated. I have been wanting to play GTA V in 3D for over a year now, but every time I come back to it the game just runs like balls with 3D Vision enabled. I have another GTX 1080 I have tried in SLI, an X99 6C/12T CPU, and SLI 980 Tis... doesn't matter what I try, hardware or OS. I have tried Win 7, 8.1, 10... they all run GTA V like balls in 3D. 2D... no problems at all.


Dude, I guarantee with specs like that GTAV is very "playable". Just try it and see; don't look at the actual FPS numbers and base your opinion on that alone. You'll find that the game feels a lot smoother than most others that dip down that low. I played GTAV on two 780s in SLI (and later on a 980 Ti, which was roughly the same in performance) with mostly max settings (but low AA and some optimization here and there), and I would get between 30-40 FPS. If I did max out the settings with the highest AA (or even a little supersampling) I would only drop down to 25-30 FPS, which still felt very fluid to me and looked super crisp, and therefore was worth it. Due to the CPU bug in this game, it makes sense to crank up the graphics as much as possible to try to shift the bottleneck to the GPU rather than the CPU.


I think our standards are different; I can't play it with V-Sync at under 60 FPS because of the stutter. 25-30 FPS is NOT smooth, and honestly neither is 40 FPS, which is what I get with this stupid 3-core bottleneck. In 2D my rig runs this game fine at like 120 FPS avg., but in 3D it's 40 FPS. Lame.

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 08/26/2017 01:47 AM   
I actually lost fps with the latest driver.

Gaming Rig 1

i7 5820K 3.3ghz (Stock Clock)
GTX 1080 Founders Edition (Stock Clock)
16GB DDR4 2400 RAM
512GB SAMSUNG 840 PRO

Gaming Rig 2
My new build

Asus Maximus X Hero Z370
MSI Gaming X 1080Ti (2100 mhz OC Watercooled)
8700k (4.7ghz OC Watercooled)
16gb DDR4 3000 Ram
500GB SAMSUNG 860 EVO SERIES SSD M.2

Posted 08/26/2017 01:49 AM   
xXxStarManxXx said:
I think our standards are different, I can't play it with V-Sync at under 60 FPS because of the stutter. 25-30 FPS is NOT smooth, and honestly neither is 40 FPS, which is what I get with this stupid 3 core bottleneck. 2D my rig runs this game fine with like 120 FPS avg. but 3D it's 40 FPS, lame.


Are you bothered by low refresh rate strobing? If not, with a monitor that has ULMB like yours you can create a custom resolution of, for example, 80Hz (40Hz per eye), with reversed eyes: a resolution with only your desired refresh rate. That way it would be smooth with vsync (1 strobe per frame if you get 40fps). With the PG278QR I could get 3D+ULMB to work correctly with the glasses between 64Hz and 125Hz. The drawbacks are that ULMB has more ghosting than Lightboost, and that you need to disable the red warning by modifying two Nvidia DLLs.

How to do it? Here's the guide: http://forums.blurbusters.com/viewtopic.php?f=4&t=3501

Basically you need to preserve the total pixel clock when going down or up in Hz, so the monitor accepts it. Be warned, 32Hz per eye will melt your eyes :p.
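For anyone trying this, the "preserve the total pixel clock" rule above is just arithmetic: pixel clock = horizontal total × vertical total × refresh rate, so when you lower the refresh rate you pad the vertical total to compensate. A rough sketch (the timing figures here are made-up placeholders, not the PG278QR's actual EDID values):

```python
# Sketch of the pixel-clock math behind custom 3D refresh rates.
# H_TOTAL / V_TOTAL below are illustrative, not real monitor timings.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = horizontal total x vertical total x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

H_TOTAL = 2720   # active + blanking, horizontal (hypothetical)
V_TOTAL = 1525   # active + blanking, vertical (hypothetical)
BASE_HZ = 120    # the mode the monitor already accepts

base_clock = pixel_clock_mhz(H_TOTAL, V_TOTAL, BASE_HZ)

def v_total_for(target_hz):
    """Vertical total that keeps the pixel clock unchanged at target_hz."""
    return round(V_TOTAL * BASE_HZ / target_hz)

for hz in (80, 100, 120):
    vt = v_total_for(hz)
    print(f"{hz} Hz ({hz // 2} Hz per eye): v_total = {vt}, "
          f"clock = {pixel_clock_mhz(H_TOTAL, vt, hz):.1f} MHz")
```

The idea is that the monitor keeps accepting the signal because the clock it sees never changes; only the blanking interval grows as the refresh rate drops.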

CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: Gainward Phoenix 1080 GLH
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com

Posted 08/26/2017 07:52 AM   
masterotaku said:
xXxStarManxXx said:
I think our standards are different, I can't play it with V-Sync at under 60 FPS because of the stutter. 25-30 FPS is NOT smooth, and honestly neither is 40 FPS, which is what I get with this stupid 3 core bottleneck. 2D my rig runs this game fine with like 120 FPS avg. but 3D it's 40 FPS, lame.


Are you bothered by low refresh rate strobing? If not, with a monitor that has ULMB like yours you can create a custom resolution of for example 80Hz (40Hz per eye), with reversed eyes. A resolution with only your desired refresh rate. That way it would be smooth with vsync (1 strobe per fps if you get 40fps). With the PG278QR I could get 3D+ULMB to work correctly with the glasses between 64Hz and 125Hz. The drawbacks are that ULMB has more ghosting than Lightboost, and that you need to disable the red warning by modifying two Nvidia dlls.

How to do it? Here's the guide: http://forums.blurbusters.com/viewtopic.php?f=4&t=3501


Basically you need to preserve the total pixel clock when going down or up in Hz, so the monitor accepts it. Be warned, 32Hz per eye will melt your eyes :p.


Wait, what does this accomplish exactly? Is it brighter at lower FPS? I don't think I'm experiencing low refresh strobing, since if I can't get a game to run at 57-60 FPS I simply don't play it in 3D Vision. With a 1080 Ti only a few compromises are required (like no HairWorks in TW3, for example) unless the game in question exhibits the dreaded 3-core bug, such as GTA 5.

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 08/26/2017 05:02 PM   
xXxStarManxXx said:
Wait what does this accomplish exactly? Is it brighter at lower FPS?


What is annoying about not getting 60fps in 3D? Dropping fps below the vsync limit. That's the reason why I played Dark Souls 3 at 100Hz (50fps per eye), so it was smooth at least almost all the time.

With custom resolutions you can have vsync perfection (1 fps per Hz per 1 strobe), that's what I'm talking about. 32fps at 32Hz is as good as 60fps at 60Hz, except for heavily increased flickering (like a CRT running at 32Hz) and that it IS running at lower fps than what you are used to. But it should be better than having 60Hz per eye and not reaching those fps.
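The "1 fps per Hz" point can be modeled crudely: with vsync, a frame is displayed for a whole number of refresh intervals, so a frame that misses its deadline is held twice as long, which is the stutter being described. A toy illustration (the render times are invented, not measured):

```python
# Crude vsync frame-pacing model: why a steady lower refresh rate can
# feel smoother than a higher one you can't sustain.
import math

def displayed_time_ms(per_eye_hz, render_ms):
    """With vsync, a frame is shown for a whole number of refresh
    intervals, so missing one deadline doubles its on-screen time."""
    interval = 1000.0 / per_eye_hz
    return max(1, math.ceil(render_ms / interval)) * interval

# 60 Hz per eye, but frames take 18 ms to render: every frame misses
# its deadline and is held for two intervals (~33.3 ms).
print(displayed_time_ms(60, 18.0))

# 50 Hz per eye with the same 18 ms frames: every deadline is met,
# giving a perfectly even 20 ms per frame.
print(displayed_time_ms(50, 18.0))
```

In this model the 50Hz-per-eye mode paces every frame identically, while the nominally faster 60Hz-per-eye mode effectively runs at 30fps with worse latency, which matches the Dark Souls 3 experience described above.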

CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: Gainward Phoenix 1080 GLH
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com

Posted 08/26/2017 05:40 PM   
masterotaku said:
xXxStarManxXx said:
Wait what does this accomplish exactly? Is it brighter at lower FPS?


What is annoying about not getting 60fps in 3D? Dropping fps below the vsync limit. That's the reason why I played Dark Souls 3 at 100Hz (50fps per eye), so it was smooth at least almost all the time.

With custom resolutions you can have vsync perfection (1 fps per Hz per 1 strobe), that's what I'm talking about. 32fps at 32Hz is as good as 60fps at 60Hz, except for heavily increased flickering (like a CRT running at 32Hz) and that it IS running at lower fps than what you are used to. But it should be better than having 60Hz per eye and not reaching those fps.


Oh ok, I think I might understand what you're on about. In an attempt to get The Witcher 3 running more smoothly at 2560x1440 with a single overclocked 1080 Ti, in conjunction with making a few sacrifices, I noticed that even when the FPS was like 53-55 there was still ridiculous stutter. Then with some exploring I found an option to set the refresh rate in 3D to 100Hz, which I tried, and at 50 FPS there was now zero stutter! BUT there was an issue with pulsating light, especially visible on the horizon and on any bright object in game; another observation was that the game was only slightly darker. I ultimately couldn't stand the flickering, so I reverted to 120 Hz and made a few more sacrifices: turning off SweetFX SMAA when scenes are very demanding and opting for SSAO instead of HBAO+. I still get stuttering in really demanding areas where the frames dip under 60 FPS, but it's not often enough to worry too much about.

Is this what you're talking about?

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 08/26/2017 07:04 PM   
xXxStarManxXx said:
Is this what youre talking about?


Yes, exactly.

If you can't handle 50Hz per eye, then I don't recommend using anything lower :p. I'm pretty tolerant to flickering (I prefer it to frame drops), but other people aren't.

CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: Gainward Phoenix 1080 GLH
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com

Posted 08/26/2017 07:12 PM   
masterotaku said:
xXxStarManxXx said:
Is this what youre talking about?


Yes, exactly.

If you can't handle 50Hz per eye, then I don't recommend using anything lower :p. I'm pretty tolerant to flickering (I prefer it to frame drops), but other people aren't.


Yeah, although I loved the complete absence of stutter at 50 FPS, a few observations compelled me to put up with occasional stutter under 60 FPS instead: the flickering, the slightly darker picture, AND the fact that I could tell the difference in fluidity between 50 and 60 FPS, even though it's only 10 FPS. If there were no flickering I would probably choose the 100 Hz option, to be honest. Once you notice it there is no unnoticing it; to me it's immersion-destroying.

I do wish we had more than the 100 and 120 Hz options. I have a 144 Hz monitor and think it's time for a 140 Hz option. Not sure if it's a limitation of the glasses or what, but yeah, I don't understand why this isn't an option on the part of NGreedia.

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 08/26/2017 07:35 PM   
xXxStarManxXx said:If there was no flickering I would probably choose the 100 Hz option to be honest though.


I tried 3D without ULMB or Lightboost, and it was unwatchable: massive ghosting across the whole screen except for the lowest fifth. Strobing is necessary with active 3D.

xXxStarManxXx said:Not sure if it's a limitation of the glasses or what but yeah, I don't understand why this isn't an option on the part of NGreedia.


It isn't a limitation of the glasses. I saw them work even at 165Hz (82.5Hz per eye), and I don't know their real limit. But generally, the higher the refresh rate, the harder it is to get a clean image (without ghosting, crosstalk, etc.). With my hacky method it's possible to have 125Hz (62.5Hz per eye), but 240Hz monitors officially have ULMB at 144Hz too, so 3D should be doable at 144Hz. Image quality will be inferior to 120Hz, though. Using ULMB instead of Lightboost is already a downgrade, so increasing the refresh rate that much must be even worse.

CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: Gainward Phoenix 1080 GLH
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com

Posted 08/26/2017 07:58 PM   
At last I've managed to get the results I wanted. I benchmarked both drivers a while ago then had a system restore issue, then I noticed that I had remnants of a 3d fix in there which had ruined all the results anyway.

3d on:
Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 6.649405, 78.836327, 44.013969
Pass 1, 37.178383, 149.694641, 60.905315
Pass 2, 39.453922, 153.866882, 54.436916
Pass 3, 41.022785, 140.253922, 64.300728
Pass 4, 13.881242, 146.320221, 73.324211

3d off:
Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 14.325850, 116.907585, 98.080704
Pass 1, 48.235455, 273.519623, 137.334793
Pass 2, 6.482828, 201.185593, 132.315552
Pass 3, 78.620964, 195.938416, 148.378952
Pass 4, 4.531065, 338.206665, 150.227356


These are with driver 384.94 and I used CTRL-T to toggle 3d, so no other settings were changed.
System is as per sig, running in 3d surround at 5760x1080

Here's a zip with GPUz logs, the benchmarks and an Afterburner log. The afterburner log is very interesting, you can clearly see higher CPU usage on the 2d run.


http://www94.zippyshare.com/v/hcqZJ8Ts/file.html
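Averaging the per-pass averages posted above (a crude summary that ignores the min/max spread) puts a number on the loss:

```python
# Quick arithmetic on the benchmark averages posted in this thread:
# what fraction of the 2D frame rate survives with 3D enabled?
from statistics import mean

avg_3d = [44.013969, 60.905315, 54.436916, 64.300728, 73.324211]
avg_2d = [98.080704, 137.334793, 132.315552, 148.378952, 150.227356]

ratio = mean(avg_3d) / mean(avg_2d)
print(f"3D runs at {ratio:.1%} of the 2D frame rate")  # roughly 45%
```

That is noticeably worse than the 50% you would expect from simply rendering every frame twice, which is the CPU bottleneck this thread is documenting.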

GTX 1070 SLI, i7-6700K ~4.4GHz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D Vision kit, Xpand X104 glasses, Corsair HX1000i, Win 10 Pro 64/Win 7 64
https://www.3dmark.com/fs/9529310

Posted 08/27/2017 12:11 PM   
rustyk21 said:At last I've managed to get the results I wanted. I benchmarked both drivers a while ago then had a system restore issue, then I noticed that I had remnants of a 3d fix in there which had ruined all the results anyway.

3d on:
Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 6.649405, 78.836327, 44.013969
Pass 1, 37.178383, 149.694641, 60.905315
Pass 2, 39.453922, 153.866882, 54.436916
Pass 3, 41.022785, 140.253922, 64.300728
Pass 4, 13.881242, 146.320221, 73.324211

3d off:
Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 14.325850, 116.907585, 98.080704
Pass 1, 48.235455, 273.519623, 137.334793
Pass 2, 6.482828, 201.185593, 132.315552
Pass 3, 78.620964, 195.938416, 148.378952
Pass 4, 4.531065, 338.206665, 150.227356


These are with driver 384.94 and I used CTRL-T to toggle 3d, so no other settings were changed.
System is as per sig, running in 3d surround at 5760x1080

Here's a zip with GPUz logs, the benchmarks and an Afterburner log. The afterburner log is very interesting, you can clearly see higher CPU usage on the 2d run.



http://www94.zippyshare.com/v/hcqZJ8Ts/file.html



Are you saying that there was an improvement in terms of CPU bottleneck with this driver?

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 08/27/2017 05:45 PM   
No, unfortunately not, you can see from my results that the fps in 3d is less than half what it is in 2d.
Also, if you look at the afterburner log, you'll see that the CPU utilisation is higher in 2d than 3d.
If you're having the same problem, please post your results on this thread. Nvidia have acknowledged this problem and still support 3d vision, albeit to a limited extent.
It's up to us to provide the evidence, they're clearly not going to spend any lab time troubleshooting it for us. It is what it is.

GTX 1070 SLI, i7-6700K ~4.4GHz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D Vision kit, Xpand X104 glasses, Corsair HX1000i, Win 10 Pro 64/Win 7 64
https://www.3dmark.com/fs/9529310

Posted 08/27/2017 08:10 PM   
Thank you Russel!

I shall forward the results, together with your summary and the logs, to our man Ray.

Best,
-- Shahzad

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

Posted 08/28/2017 05:34 AM   