Nvidia is dropping support for 3D Vision and older Kepler mobile GPUs
https://www.pcgamer.com/nvidia-is-ending-driver-support-for-3d-vision-and-older-laptop-gpus


Today is a sad day for 3D Vision enthusiasts, once the reality of no Cyberpunk 2077 in 3D Vision (unless future display drivers can be modified somehow) kicks in.

Let your voice be heard at the link above.

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

#1
Posted 03/11/2019 08:37 PM   
Done as CJ. https://www.pcgamer.com/nvidia-is-ending-driver-support-for-3d-vision-and-older-laptop-gpus/#comment-jump

Overclocked Intel® Core™i5-4690k Quad Core
32 GB RAM
8GB GEFORCE GTX 1080
3D Vision 2
Windows 10 64 Bit
NVidia driver 419.17
SAMSUNG - UE55H8000 Smart 3D 55" Curved
Philips G-Sync 272G
Oculus Rift with Touch controllers

#2
Posted 03/11/2019 09:43 PM   
costiq said:Done as CJ.

https://www.pcgamer.com/nvidia-is-ending-driver-support-for-3d-vision-and-older-laptop-gpus/#comment-jump


Here's mine, which isn't posted because PC Gamer is blocking my comments after I criticized them for taking money from Nvidia in return for glowing reviews of the RTX cards. Specifically, Jarred Walton gave a glowing review to the RTX "2070", a $900 60-class card (TU-106, no SLI) that is merely as fast as the outgoing 80 card, whereas historically the new 70 card has been as fast as the outgoing 80 Ti card. In fact, I'd assert that the entire lineup has been renamed one tier higher; i.e., the RTX "2080 Ti" is in fact a $1,200 80-class card, considering it's only some 25-30% faster than the 1080 Ti, a performance improvement historically seen between the new 80 card and the old 80 Ti card.

"Damn!!!!!!

I guess this means I'm not going to be able to play Cyberpunk 2077 in stereoscopic 3D like I did the world of the Witcher. That's going to be a huge loss." - Quirky


I know right?! I too am still going through TW3 in 3D Vision. This is how good 3D Vision can be, for those who've never tried it: because I have both monitors on arms in front of me and can compare them side by side, TW3 looks better in 2560x1440 3D Vision @ 60 FPS (with some settings turned down, Hairworks off, etc.) than it does at over 90 FPS on the curved 34" 3440x1440 120 Hz IPS AW3418DW, which also has much better colors and contrast.

It's that good, even with the potential ghosting (this really only happens if you pause the game for five minutes with static menu items on screen) and crosstalk (particularly noticeable if the game's environment is brightly lit, e.g. TW3 at noon).

I'm sad to see it go, but to be fair, I only have maybe 3-4 titles that still currently have my attention in 3D Vision:

The Witcher 3 (still going through Blood and Wine, I completed Hearts of Stone a few months ago)

Batman: Arkham Knight (the 3D Vision is REALLY good in this game, and I enjoy it even with all of its shortcomings; I personally thought the Batmobile, the source of most of the game's criticism, was well done and a nice addition)

Rise of the Tomb Raider

There are others that were really well done that I wouldn't mind going through again, especially considering that the first time through I was on an i7 4930k @ 4.5 GHz with 780 Ti SLI or a single 980 Ti @ 1500 MHz. My new configuration is roughly 50% faster in both GPU and CPU compute: a binned and delidded i7 8700k @ 5.0 GHz (23k vs 16k Firestrike CPU score, 50% more FPS in CPU-bottlenecked games) and a 1080 Ti @ 2025 MHz with an undervolt (both components are under a combined 1.1 m² of radiator surface area, with a full monoblock on the CPU and a full block on the GPU, all parts EK). So if there was either a CPU bottleneck or an issue with 3GB of VRAM or SLI, that issue is gone. For example, out of curiosity I reinstalled Shadow of Mordor and travelled to a CPU-troublesome spot on the map, with a lot of NPCs and verticality going on, that used to bring my 4930k down to around 40 FPS; now it's 60 FPS there, even with some of the texture settings completely maxed (Ultra Textures were problematic with the 980 Ti in 2015, as memory usage can get as high as 8, even 10 GB).
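As a sanity check on those numbers, the CPU uplift can be worked out directly from the Firestrike physics scores quoted above (a quick sketch; the ~23k and ~16k figures are the ones from this post):

```python
def percent_faster(new_score: float, old_score: float) -> float:
    """Relative improvement of new_score over old_score, in percent."""
    return (new_score / old_score - 1.0) * 100.0

# Firestrike CPU (physics) scores quoted above: ~23k (8700k) vs ~16k (4930k)
print(f"{percent_faster(23_000, 16_000):.1f}% faster")  # 43.8% faster
```

So the physics-score delta is closer to 44% than a clean 50%, though FPS in CPU-bound games can scale differently.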

Games I may revisit in 3D Vision:

Alien: Isolation

Middle Earth: Shadow of Mordor

Max Payne 3 and possibly Metro and Metro LL Redux

Mass Effect 3 (since this was the last decent ME and looks good in 3D Vision, hell I may try ME 2)

But most newer titles just play, look and perform better at 3440x1440 120 Hz G-Sync, like any FPS, any racing game, and any 2D side-scroller (Ori and the Blind Forest: Definitive Edition looks way better at 3440x1440 120 Hz G-Sync than it does at 2560x1440 60 Hz V-Sync 3D Vision). XCOM 2 (WOTC) also looks and feels better on the AW3418DW than it does on the PG278Q.

So only a few genres still look good in 3D Vision:

FPS
3rd PS / Adventure

So I feel that my enthusiasm for 3D Vision is dying alongside mainstream support. (Nixxes did continue to offer official support all the way up to Rise of the Tomb Raider in 2016.)

I have probably 15 games currently being rotated through, and only 2 of them are currently played in 3D Vision:

The Witcher 3
Batman: Arkham Knight

Everything else just looks, feels and runs better on the AW3418DW.

One final comment and the epitaph for 3D Vision:

I would rather play Doom: Eternal on the AW3418DW than in 3D Vision on the PG278Q.

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

#3
Posted 03/12/2019 04:13 AM   
Why do people assume we need new drivers for Cyberpunk?
A new driver may be mandatory, or it may not. Changing GPU drivers often is pretty useless and most of the time just fucks things up with 3D.

CoreX9 custom watercooling (Volkswagen Polo radiator)
i7-8700k @ 4.7
Titan X Pascal with shitty stock cooler
Win7/10
Video: Passive 3D, Full HD 3D @ 60 Hz/channel, Denon X1200W, HC5 x 2, GeoBox 501 -> eeColor boxes -> polarizers/Omega filters, custom-made silver screen
Occupation: Entrepreneur. Painting/surfacing/construction
Interests/skills:
3D gaming, 3D movies, 3D printing, drums, bass and guitar.
Suomi - FINLAND - perkele

#4
Posted 03/12/2019 04:50 AM   
Metal-O-Holic said:Why people assume we need new drivers for the cyberpunk ?
Yes it may be mandatory and it may not. Changing gpu drivers often is pretty useless and most of the times just fucks thing up with 3d


Someone should do an experiment where they roll back to something like the 340 3D Vision driver while using the 418 regular driver, and see if it works.

I don't think that's a great assumption. I think Cyberpunk will very likely be possible. I know some games do require new drivers (Fallout 4, but that's also Bethesda).
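One helper for that experiment: since, per this thread, the 418 branch is the last one that ships the 3D Vision components, you can tell at a glance whether a given driver still has them from its branch number. A minimal sketch (the function name and the 418 cutoff are assumptions based on this thread, not an official API):

```python
def has_3d_vision(driver_version: str, last_branch: int = 418) -> bool:
    """Return True if this Nvidia driver release branch still includes 3D Vision.

    Driver versions look like '418.91' or '430.64'; the number before the
    dot is the release branch. Release 418 is taken here as the last branch
    with the 3D Vision components, as discussed in this thread.
    """
    branch = int(driver_version.split(".", 1)[0])
    return branch <= last_branch

print(has_3d_vision("418.91"))  # True
print(has_3d_vision("430.64"))  # False
```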

I'm ishiki, forum screwed up my name.

9900K @5.0 GHZ, 16GBDDR4@4233MHZ, 2080 Ti

#5
Posted 03/12/2019 04:59 AM   
Metal-O-Holic said:Why people assume we need new drivers for the cyberpunk ?
Yes it may be mandatory and it may not. Changing gpu drivers often is pretty useless and most of the times just fucks thing up with 3d


Sometimes new games don't work at all without a brand-new driver. For example, I tried the Resident Evil 2 remake and it was nothing but a black screen where you could only barely see anything on 411.70, which I'm still on (I'm not going through the whole rigmarole of DDU'ing the driver for a 30-minute demo).

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

#6
Posted 03/12/2019 03:13 PM   
I was thinking about this further today and came to the realization that at some point in the not-too-distant future I may need to choose between 3D Vision on the PG278Q and 21:9 3440x1440 120 Hz G-Sync content on the AW3418DW. For example, say Doom: Eternal requires a driver newer than 418 to work correctly, yet after updating to that driver there is no longer a 3D Vision section of the Nvidia Control Panel to speak of.

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

#7
Posted 03/12/2019 10:21 PM   
xXxStarManxXx said:Here's mine, which isn't posted because PC Gamer is blocking my comments because I criticized them for taking money from Nvidia in return for a glowing review of RTX cards

Just register another account on disqus.com :)

#8
Posted 03/12/2019 10:45 PM   
This is so sad. I just got into 3D Vision last weekend. Grabbed a 1080 Ti specifically to game in 3D. Does this mean good ol' Helix Mod will not be available anymore for future games?

#9
Posted 03/12/2019 11:13 PM   
john105 said:
xXxStarManxXx said:Here's mine, which isn't posted because PC Gamer is blocking my comments because I criticized them for taking money from Nvidia in return for a glowing review of RTX cards

Just register another account on disqus.com :)


I did; they must have an advanced algorithm that is going off of IP or something, because comments created with a different email address were also "detected as spam".

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

#10
Posted 03/13/2019 02:29 AM   
Muojo said:This is so sad. I just got into 3D vision last weekend. Grabbed a 1080 Ti specifically to game in 3D.Does this mean good ol Helix Mod will not be available anymore for future games?


See my second-to-last comment above; it may come down to choosing between getting a new game to work correctly (if at all) while forgoing 3D Vision, or sticking with 3D Vision on the 418 driver.

I mean, you could dual-boot another instance of an OS with a different display driver, but that's kind of pushing it in terms of convenience.

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

#11
Posted 03/13/2019 02:31 AM   
xXxStarManxXx said:
john105 said:
xXxStarManxXx said:Here's mine, which isn't posted because PC Gamer is blocking my comments because I criticized them for taking money from Nvidia in return for a glowing review of RTX cards

Just register another account on disqus.com :)


I did, they must have an advanced algorithm that is going off of IP or something because comments created with a different email address were also "detected as spam".


I think it means that they filter some "offensive words". It could even be something like the "damn" in your post.

#12
Posted 03/13/2019 02:44 AM   