Future of 3D Vision Support (Official announcement from NVIDIA)
andysonofbob said:
xXxStarManxXx said:I picked up an AW3418DW on sale about a year ago and honestly, about 75% of my games look better at fluid G-Sync 100-120 Hz vs 3D Vision at 2560x1440.


I can agree with you that games are more fluid at 120 Hz (as every single 3D gamer can attest; all of us have 120 Hz-capable monitors), but I personally find 60 Hz plenty fluid. Frankly, being able to hit Ctrl+T whenever I want to experience 120 Hz confirms this every time. :P

Regarding games looking 'better': as in better because of higher definition? I think 3D provides a significant boost in visual information, which I think leads to increased 'immersion'. And I reckon immersion blows away resolution... likewise, every time.


xXxStarManxXx said:I even kept my PG278Q on a monitor arm and still have games that admittedly do look better in 3D Vision (The Witcher 3, Batman: Arkham Knight) when they don't exhibit sub 50 FPS non G-Sync stutter...


S'cuse my ignorance with this:

I have used G-Sync monitors many, MANY times and have no idea what the difference is! Is this stutter something that only happens with SLI? I have only used SLI once, and I did notice stuttering then. Now that I game on single cards I have never seen it, regardless of FPS.

Thanks


In all of the titles I play in 3D Vision, if the FPS dips under 55 (even there, there is some stutter), the stutter is extremely noticeable, yes, exactly like poor SLI / CrossFire support.

Right now the two games I'm still playing in 3D Vision both do this: The Witcher 3 and Batman: Arkham Knight. I've tried literally everything: disabling in-game V-Sync in favor of Nvidia's V-Sync in the NVCP, triple buffering on and off (all turning it on does for me is increase input lag), maximum pre-rendered frames at 1, 2, and 3; it doesn't matter. The Witcher 3 exhibits atrocious stutter even at 50 FPS. It has forced me to sacrifice quite a bit of visual fidelity; I've turned down a lot of settings, and have gone as far as running a custom resolution about 10% below 2560x1440 to address it, which has mostly eliminated the stutter.
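For what it's worth, the custom-resolution trick is easy to work out yourself. Here is a minimal sketch (the helper name `reduced_resolution` is mine, and I'm assuming "10% less" means each dimension scaled by 0.9, snapped to 8-pixel multiples, which custom-mode tools generally like):

```python
def reduced_resolution(width, height, percent, step=8):
    """Scale each dimension down by `percent` percent, snapping the
    result to `step`-pixel multiples for a clean custom mode."""
    scale = 1.0 - percent / 100.0
    w = int(round(width * scale / step)) * step
    h = int(round(height * scale / step)) * step
    return w, h

# 10% below 2560x1440 comes out to 2304x1296 (same 16:9 aspect ratio).
print(reduced_resolution(2560, 1440, 10))  # -> (2304, 1296)
```

You can then enter the result as a custom resolution in the NVCP; the GPU renders ~19% fewer pixels, which is usually enough headroom to stay above the 55 FPS stutter threshold described above.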

This isn't a problem on a variable refresh rate panel; it's V-Sync only, and it seems particularly bad with 3D Vision for some reason.

Add the double talk, flat 16:9, and panel shortcomings (see previous post), and it's no wonder I prefer my AW3418DW over my PG278Q for all games, all activities, and all movies (no black bars on top and bottom, even on Netflix!), with the exception of the two aforementioned games, which I play maybe once every few weeks.

I don't know what it is, but there is nothing like Batman: Arkham Knight in 3D Vision. I liked the game to begin with, but it's really good in 3D Vision (the fact that it's set at night means the double talk occurs much less than in other games, such as The Witcher 3 in any scene in broad daylight).

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 03/17/2019 11:41 PM   
I'm sorry for just copy/pasting from the other topic, but I want to share here too the story of my love affair with the stereoscopic 3D Vision and why I am so grateful to the fantastic guys at Helix Mod.


bpooch420 said:... And to supplement this discussion, I surmise that not many got into this initially because of price and game compatibility, and as time went by, newer forms of VR came around and it was kind of put on the back burner. I've always been interested to see what it was actually like to, say, play Half-Life 1 in 3D Vision.


bpooch420, if you had the chance to see how any game looks in real 3D, I'm pretty sure you would find it really difficult to go back to playing in flat 2D.

Personally, I bought my first gaming 3D device sometime around 2000.
I remember the glasses looking very bulky; I believe these ones: http://www.dansdata.com/images/vr100/vr100320.jpg

It didn't matter.

It was a revelation!

For the first time, I was able to understand how lucky we are to have two eyes. With the help of the new 3D technology, I was inside the virtual worlds created by the game developers. I was no longer just looking at some moving flat images.

As games evolved and became nicer and nicer, stereoscopic 3D made more and more sense. Volumetric fog, rain, snow, lights, shadows, transparency: all of it became worth introducing.

I bought my first PlayStation 3 only because I was keen to play Avatar in 3D...

As for VR, I preordered my Oculus Rift when the commercial version first became available. Yes, it was cool in the beginning. But the inferior resolution, the bulkiness of the wired headset, and the lack of full experiences made me use it less and less.

In its current state, VR can't yet beat the 3D Vision experience.

The enjoyment of playing games like The Witcher 2 and 3, Mad Max, Fallout, Skyrim, Tomb Raider, etc., on a big-screen 3D TV (together with friends, or a partner/wife) cannot currently be surpassed.

For this, I'm grateful to Nvidia for making it possible, but I'm even more grateful to the guys at Helix Mod http://helixmod.blogspot.com/2013/10/game-list-automatically-updated.html who, for many years, have passionately used their own time and energy to keep 3D alive. All this for free!

Overclocked Intel® Core™i5-4690k Quad Core
32 Gb RAM
8GB GEFORCE GTX 1080
3D Vision 2
Windows 10 64 Bit
NVidia driver 419.17
SAMSUNG - UE55H8000 Smart 3D 55" Curved
Philips G-Sync 272G
Oculus Rift with Touch controllers

Posted 03/18/2019 12:34 AM   
My love story with 3D gaming started out with ELSA glasses and 3dfx around 1994-95; my first 3D game was Descent, and since then I have never played a game in 2D again. All my GPUs since Nvidia bought 3dfx have been Nvidia GPUs, from the 3dfx Voodoos to the Nvidia TNT, and I have kept up with most generations since. Currently I have three generations in the house: two GeForce GTX 275s in SLI, a GTX 1070 in a backup system, and an RTX 2080 in my daily-use system. And all this because of 3D Vision gaming. 25 years, and the love is enduring and true. Even though I currently have both the Rift and Odyssey+ VR headsets, 3D Vision gaming is still the main attraction, because it's still unsurpassed by VR.

Nvidia has been instrumental in all this, and I imagine it must be proud of 3D Vision as a product. If Nvidia lets 3D Vision die, it would be a great, tragic waste, and an unwanted and unnecessary stab to the heart of many of its most loyal customers. I don't think Nvidia really wants that to happen, and I truly hope it will do the right thing.

Xeon X5675 hex cores @4.4 GHz, GTX 1070, win10 pro
i7 7700k 5GHz, RTX 2080, win10 pro
Benq 2720Z, w1070, Oculus Rift cv1, Samsung Odyssey+

Posted 03/18/2019 02:32 AM   
costiq said:I'm sorry for just copy/pasting from the other topic, but I want to share here too the story of my love affair with the stereoscopic 3D Vision and why I am so grateful to the fantastic guys at Helix Mod.


bpooch420 said:... And to supplement this discussion, I surmise that not many got into this initially because of price and game compatibility, and as time went by, newer forms of VR came around and it was kind of put on the back burner. I've always been interested to see what it was actually like to, say, play Half-Life 1 in 3D Vision.


bpooch420, if you had the chance to see how any game looks in real 3D, I'm pretty sure you would find it really difficult to go back to playing in flat 2D.

Personally, I bought my first gaming 3D device sometime around 2000.
I remember the glasses looking very bulky; I believe these ones: http://www.dansdata.com/images/vr100/vr100320.jpg

It didn't matter.

It was a revelation!

For the first time, I was able to understand how lucky we are to have two eyes. With the help of the new 3D technology, I was inside the virtual worlds created by the game developers. I was no longer just looking at some moving flat images.

As games evolved and became nicer and nicer, stereoscopic 3D made more and more sense. Volumetric fog, rain, snow, lights, shadows, transparency: all of it became worth introducing.

I bought my first PlayStation 3 only because I was keen to play Avatar in 3D...

As for VR, I preordered my Oculus Rift when the commercial version first became available. Yes, it was cool in the beginning. But the inferior resolution, the bulkiness of the wired headset, and the lack of full experiences made me use it less and less.

In its current state, VR can't yet beat the 3D Vision experience.

The enjoyment of playing games like The Witcher 2 and 3, Mad Max, Fallout, Skyrim, Tomb Raider, etc., on a big-screen 3D TV (together with friends, or a partner/wife) cannot currently be surpassed.

For this, I'm grateful to Nvidia for making it possible, but I'm even more grateful to the guys at Helix Mod http://helixmod.blogspot.com/2013/10/game-list-automatically-updated.html who, for many years, have passionately used their own time and energy to keep 3D alive. All this for free!


Great post. I agree with most of the games you listed, except Fallout, if you're referring to FO4 that is, as the CPU bottleneck induced by the "3 Core Bug" renders the game nigh unplayable, and that's with an 8700k @ 5.0 GHz. It's just too CPU-intensive for 3D Vision; the same goes for GTA 5 and a few other titles, unfortunately.

I'm looking forward to picking up an HTC Vive Pro if / when it goes on sale. I tried the Oculus Rift demo at a local Best Buy, and my first observation was the screen-door effect. Other than that, from those who have used both, most say VR is actually better than 3D Vision, if I remember correctly.

I guess the future for me is VR as a replacement for 3D Vision, and curved 21:9 for all 2D content. Not a bad future at all, really. Once we figure out how to get G-Sync monitors working on AMD GPUs, I can finally say goodbye to Nvidia for good. They have lost all of my respect and admiration with the Turing GPU rename scam and now the official abandonment of 3D Vision.

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 03/18/2019 03:07 AM   
distantreader said:My love story with 3D gaming started out with ELSA glasses and 3dfx around 1994-95; my first 3D game was Descent, and since then I have never played a game in 2D again. All my GPUs since Nvidia bought 3dfx have been Nvidia GPUs, from the 3dfx Voodoos to the Nvidia TNT, and I have kept up with most generations since. Currently I have three generations in the house: two GeForce GTX 275s in SLI, a GTX 1070 in a backup system, and an RTX 2080 in my daily-use system. And all this because of 3D Vision gaming. 25 years, and the love is enduring and true. Even though I currently have both the Rift and Odyssey+ VR headsets, 3D Vision gaming is still the main attraction, because it's still unsurpassed by VR.

Nvidia has been instrumental in all this, and I imagine it must be proud of 3D Vision as a product. If Nvidia lets 3D Vision die, it would be a great, tragic waste, and an unwanted and unnecessary stab to the heart of many of its most loyal customers. I don't think Nvidia really wants that to happen, and I truly hope it will do the right thing.


My love for 3D Vision started just by happenstance when I acquired my PG278Q in early 2014. I didn't get the monitor for 3D Vision but for G-Sync; still, because I was aware of the feature, I figured I would try it out of curiosity, so I purchased a used emitter and glasses from eBay, and I was ABSOLUTELY blown away by it, left wondering where it had been all my life. I hung in there, and I have to say that, as with you, all of my GPU upgrades, from a 780 Ti to 780 Ti SLI to a single 980 Ti and now a single 1080 Ti, were made primarily because of 3D Vision. Hell, I even built my new computer, upgrading from an i7 4930k @ 5.0 GHz under a 140mm AIO (NZXT X41) to a binned and delidded 8700k running @ 5.0 GHz under 1 kW of radiator surface area (shared with the 1080 Ti), mostly because of the "3 Core Bug". It was present in Geothermal Valley in Rise of the Tomb Raider, in GTA 5, at the "King's Crowning Dinner Party" in Skellige in TW3 (40 FPS there, all CPU-bottlenecked; now it's 60 FPS, although I'm no longer in that area, lol, and there are still plenty of other CPU-intensive areas of the game), in various areas of XCOM 2 (I've switched to 21:9 for that one, as I got tired of fighting with the unofficial fix; having to change the convergence every time the camera changes gets old real quick, and it's one of those games that looks better at 21:9 than in 16:9 3D anyway), and in many other games.

In the end I'm happy I upgraded, as I still see CPU bottlenecks in a lot of 2D content (XCOM 2 is bad no matter what), including Zelda: BOTW, etc., and it's nice having the 1080 Ti drive the 3440x1440 panel. But yeah, I'm right there with you in terms of having spent so much money to experience 3D Vision at its best, especially with newer, more demanding titles.

But Nvidia absolutely is ending support for it. And this brings up another point I feel the need to reiterate: we absolutely should NOT reward them by running out and volunteering for the idiot-of-the-week, price-gouge-me-now-please $1,300 "2080 Ti" purchase simply to extend the life of this as far as we can (you won't be able to upgrade to a GPU beyond Turing and continue to use 3D Vision).

I have to ask the devout 3D Vision community, where are your principles?

How many of us have threatened Nvidia along the lines of "If you discontinue 3D Vision support, or do not improve it, then I will sell my emitter, glasses, and panel, and my next GPU will be from AMD!"?

I have, and I've seen the equivalent a lot here.

Ok, well Nvidia has announced that they are discontinuing official support for 3D Vision.

Time to make a principled stand.

They don't care about you! I mean, they must have weighed the possibility of alienating a segment of their loyal consumer base in the decision to discontinue 3D Vision support, and you know what? It pains me to say this, but I'm sure they came to the conclusion that the majority of us are too stupid to care or to make a principled stand.

Engineer 1: "If we drop 3D Vision support from the drivers, we can focus on other features and cut development costs down a bit."

Huang: "What about the consumer base who is still using the feature? Won't we alienate them?"

Statistician 1: "Only 2% of those with Nvidia hardware are using the feature according to telemetry analysis"

Engineer 1: "That is correct, it's a small minority, and we can recoup some of the R&D expenditure now that RT and DLSS have been a spectacular failure; the public isn't buying it."

Tom Peterson: "Yes, that is correct; RT and DLSS have been roundly rejected by the consumer base, and most review outlets that we don't own, ahem, sponsor / authorize state that the features are not worth the performance cost. We need to trim the fat, and ditching 3D Vision seems like a good place to start, especially now that we can also expect to lose revenue from selling G-Sync modules."

Huang: "Well, screw 'em, they are so braindead and stupid. I mean, according to most sources they fell for the fact that we renamed the entire Turing lineup one GPU tier higher, doubling the prices. If they fell for that, then they will probably continue to buy our products even if we drop 3D Vision. We've got them by the balls, full speed ahead!"

Tom Peterson: "Great plan, boss; you can get that second hardened bunker in New Zealand for the apocalypse with the savings, or the second Bugatti Chiron you wanted. The choice is yours, great leader!"


Hearing prominent members of the 3D Vision community announce that they are going to run out and buy a $1,300 "80"-class card to reward Nvidia is the most cringe-worthy example of sado-masochism I've seen in a good while.

It's like the crack addict stooping to a new low, "I will suck yo _____"

Time to man up and stand by your principles. NGreedia just threw you by the wayside, and they gambled that you wouldn't flinch and would continue to be a loyal customer.

Why?

To experience 3D Vision for another 3 years with a card only some 25% faster than a card at half its price?

The insanity.
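To make the "25% faster at double the price" point concrete, here is a rough performance-per-euro comparison. The performance indices and prices below are illustrative assumptions for the sake of the arithmetic, not measured benchmarks:

```python
# Rough perf-per-euro comparison for the "25% faster at double the price" argument.
# The relative-performance indices and prices are illustrative assumptions,
# not measured benchmarks or actual street prices.

def perf_per_euro(relative_perf: float, price: float) -> float:
    """Return performance points delivered per euro spent."""
    return relative_perf / price

# Assumed numbers: the newer card is ~25% faster at roughly double the price.
older_card = perf_per_euro(relative_perf=100, price=650)   # e.g. a used 1080 Ti
newer_card = perf_per_euro(relative_perf=125, price=1300)  # e.g. a 2080 Ti

print(f"Older card: {older_card:.3f} perf/EUR")
print(f"Newer card: {newer_card:.3f} perf/EUR")
# Under these assumptions the older card delivers 60% more performance per euro.
```

Under any numbers in this ballpark, a card that is 25% faster at twice the price delivers markedly worse value per euro, which is the crux of the complaint above.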

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 03/18/2019 03:42 AM   
I agree about the 2080Ti being too expensive, principles and all that. It's why I didn't buy it yet (also because of the horror stories about dead on arrival or dead after two weeks GPUs). It's like rewarding them for killing 3D Vision if I buy it instead of my original plan of waiting for the next generation.

However, I'm really feeling the need of better performance for 1440p, and a 2080Ti would give me what I need most of the time. I wish it didn't cost more than 900€. I'm hoping for a price drop someday...

CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: MSI GeForce RTX 2080Ti Gaming X Trio
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com

Posted 03/18/2019 07:29 AM   
Hello, I planned a while ago to buy a PG278QR + 3D Vision glasses this spring, but now, with this very bad news, maybe 3D Vision monitors will see a massive price drop soon and it's better to wait?

Posted 03/18/2019 10:01 AM   
@migcar

I've been purchasing electronics for close to fifty years, and I can tell you that once inventory of 3D Vision products dries up, you'll pay triple the retail price.

Gigabyte Z370 Gaming 7 32GB Ram i9-9900K GigaByte Aorus Extreme Gaming 2080TI (single) Game Blaster Z Windows 10 X64 build #17763.195 Define R6 Blackout Case Corsair H110i GTX Sandisk 1TB (OS) SanDisk 2TB SSD (Games) Seagate EXOs 8 and 12 TB drives Samsung UN46c7000 HD TV Samsung UN55HU9000 UHD TV. Currently using ACER PASSIVE EDID override on 3D TVs LG 55

Posted 03/18/2019 10:10 AM   
zig11727 said: @migcar

I've been purchasing electronics for close to fifty years, and I can tell you that once inventory of 3D Vision products dries up, you'll pay triple the retail price.



ok TY, this was my alternative fear :lol:

Posted 03/18/2019 10:19 AM   
masterotaku said: I agree about the 2080Ti being too expensive, principles and all that. It's why I didn't buy it yet (also because of the horror stories about dead on arrival or dead after two weeks GPUs). It's like rewarding them for killing 3D Vision if I buy it instead of my original plan of waiting for the next generation.

However, I'm really feeling the need of better performance for 1440p, and a 2080Ti would give me what I need most of the time. I wish it didn't cost more than 900€. I'm hoping for a price drop someday...


Totally this. This was the first gen I decided to skip for a while.

So annoying.
The conspiracy theorist in me.....

i7-4790K CPU 4.8Ghz stable overclock.
16 GB RAM Corsair
ASUS Turbo 2080TI
Samsung SSD 840Pro
ASUS Z97-WS3D
Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)
Obutto R3volution.
Windows 10 pro 64x (Windows 7 Dual boot)

Posted 03/18/2019 12:30 PM   
Nvidia makes me very sad with this. The day that I am no longer able to play in 3D is the day I will switch to Radeon GPUs. Hopefully you experts will find a way to keep this going!
3D is the only way I will play games, apart from 2D or non-graphics-oriented ones.

At that point I would probably switch to a console for the improved reliability.

I do help by donating after some of the big fixes.
I would certainly offer to help financially support those who could keep 3D Vision alive after Nvidia gives up.

NZXT Noctis 450. Asus ROG Formula VIII, 6700k, NZXT Kracken x61. Avexir Core DDR4 (Red) 16g. Windows 10. Samsung Evo 1T & 2T SSD. Asus Strix 2080 ti. EVGA 1300w Modular Gold PSU.
Asus ROG Swift PG278Q Monitor: 1440p 3D Vision

Posted 03/18/2019 03:26 PM   
I'm not happy with the price of the RTX 2080 Ti (which I own); this is the only point where I agree with xXxStarManxXx. Then I read on Guru3D a sneak peek of a new GPU:

https://www.guru3d.com/news-story/nvidia-might-give-7nm-gpu-teaser-on-gtc.html

Nvidia: nothing like keeping your customer base happy.

Gigabyte Z370 Gaming 7 32GB Ram i9-9900K GigaByte Aorus Extreme Gaming 2080TI (single) Game Blaster Z Windows 10 X64 build #17763.195 Define R6 Blackout Case Corsair H110i GTX Sandisk 1TB (OS) SanDisk 2TB SSD (Games) Seagate EXOs 8 and 12 TB drives Samsung UN46c7000 HD TV Samsung UN55HU9000 UHD TV. Currently using ACER PASSIVE EDID override on 3D TVs LG 55

Posted 03/18/2019 04:09 PM   
I would be surprised if nVidia would reveal anything pre-launch - it doesn't help them.

People would just stop buying current cards, as bad as their sales might already be, to wait for the new ones.

Last time, we didn't even know whether their new card was going to be called a "1180" or a "2080" until the actual launch event...

If you don't want to give nVidia your money, you can buy second hand: barely used 2080 Tis are going for 800 euros / 750 GBP with almost full warranty.

On the other end of the spectrum, for those who are made of money and want something ~10% faster than a 2080Ti, there is this:

https://www.nvidia.com/en-gb/titan/titan-rtx/

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

Posted 03/18/2019 05:23 PM   
Well the GTC livestream is in a little over 2 hours from now.

According to Nvidia's blog:

"Huang, speaking at the San Jose State University Events Center at 2 pm Pacific, will highlight the company’s latest innovations in AI, autonomous vehicles and robotics."


https://blogs.nvidia.com/blog/2019/03/15/how-to-watch-nvidia-ceo-jensen-huangs-keynote-monday-at-gtc/


(livestream link on page)

Posted 03/18/2019 06:41 PM   
So what would be my best bet in a couple of years to keep playing my collection of DX9/10 3D games? I'm on a GTX 1060 6GB now, on Win 10.

Should I set up a dual boot with Win 7, keep the GTX 1060 on that boot with the 418 drivers, and then run the latest drivers on my Win 10 boot?

In the future can I have 2 Nvidia cards of 2 different gens installed in my system and do the following?

1. Get a dual link DVI KVM switcher
2. For Win 7 boot run off the 1060 with 418 release drivers and disable the newer gen card in Device Manager?
3. For Win 10 run off the new gen card with current drivers and disable the 1060 card in Device Manager?

Or maybe I should just use my existing components the next time I upgrade CPU/MB/GPU and build a "legacy" PC for 3dvision gaming on the GTX 1060?
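Whichever option you pick, the plan hinges on keeping each boot pinned to the right driver branch. A minimal sketch of that version check (Python; it assumes, per the announcement this thread is about, that release 418 is the last driver branch with 3D Vision support):

```python
# Minimal sketch: decide whether a given Nvidia driver version string still
# includes 3D Vision, assuming (per the announcement discussed in this thread)
# that release 418 is the last driver branch shipping the feature.
LAST_3D_VISION_BRANCH = 418  # assumption based on the end-of-support announcement

def supports_3d_vision(driver_version: str) -> bool:
    """Return True if the driver's major release branch still includes 3D Vision.

    Version strings look like "418.91" or "430.64"; the part before the dot
    is the release branch.
    """
    major = int(driver_version.split(".")[0])
    return major <= LAST_3D_VISION_BRANCH

print(supports_3d_vision("418.91"))  # True  -> pin this one to the Win 7 boot
print(supports_3d_vision("430.64"))  # False -> fine for the Win 10 boot
```

The same cutoff applies to the "legacy PC" option: the dedicated 3D Vision machine simply never updates past the 418 branch.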

AMD FX-8350 4GHz
Gigabyte 990FXA-UD3 Rev 4.0
G-Skill PC3-10700- 16GB
Gigabyte Windforce GTX 1060 OC 6GB - 417.01
Creative Soundblaster Z
ViewSonic VX2268WM Black 22" 1680x1050 5ms 120Hz 3Dvision
Windows 10 x64 1709

Posted 03/18/2019 11:44 PM   