NVidia supporting older cards, yes or no?
Let's take this discussion over here, to avoid trashing the other threads. We can also talk about whether this affects 3D specifically.

I'd like to solicit some other opinions outside of the three or four very loud voices we already have. Ignoring the internet and what you read, what has been your actual experience in terms of support?


I'll start by saying my experience has been pretty good. Based on my experience with my current- and last-gen cards, it seems to me that people saying there is a slowdown have some sort of agenda.

I'm still running cards that are ancient and weak by today's standards. In the last six months I replaced my SLI 580s with SLI 760s, and I bought a used GTX 690 as well. So I actually only have experience with the older stuff.

I haven't seen any sign of slowdowns in games, and I benchmark everything. I benchmarked GTA5 on an older driver and compared it against recent drivers, and performance is dramatically better on current drivers.
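For anyone who wants to check this themselves instead of trusting either side: capture per-frame timings on each driver (FRAPS and similar tools can dump them) and compare. Here's a minimal C++ sketch of just the comparison step - the log file names and the one-number-per-line format are my own assumptions, not anything a specific tool produces:

[code]
// compare_fps.cpp - compares two frame-time logs (old vs. new driver).
// Assumes each log holds one frame time in milliseconds per line.
#include <algorithm>
#include <cstdio>
#include <fstream>
#include <functional>
#include <string>
#include <vector>

static std::vector<double> loadFrameTimes(const std::string& path) {
    std::vector<double> ms;
    std::ifstream in(path);
    for (std::string line; std::getline(in, line); )
        if (!line.empty()) ms.push_back(std::stod(line));
    return ms;
}

static double avgFps(const std::vector<double>& ms) {
    double total = 0.0;
    for (double t : ms) total += t;
    return ms.empty() ? 0.0 : 1000.0 * ms.size() / total;
}

// Average FPS over the slowest 1% of frames ("1% low").
static double onePercentLow(std::vector<double> ms) {
    if (ms.empty()) return 0.0;
    std::sort(ms.begin(), ms.end(), std::greater<double>());
    size_t n = std::max<size_t>(1, ms.size() / 100);
    double total = 0.0;
    for (size_t i = 0; i < n; ++i) total += ms[i];
    return 1000.0 * n / total;
}

int main() {
    // Hypothetical file names - point these at your own captures.
    auto oldRun = loadFrameTimes("gta5_old_driver.csv");
    auto newRun = loadFrameTimes("gta5_new_driver.csv");
    std::printf("old driver: %6.1f avg FPS, %6.1f 1%% low\n",
                avgFps(oldRun), onePercentLow(oldRun));
    std::printf("new driver: %6.1f avg FPS, %6.1f 1%% low\n",
                avgFps(newRun), onePercentLow(newRun));
    std::printf("average delta: %+.1f%%\n",
                100.0 * (avgFps(newRun) / avgFps(oldRun) - 1.0));
    return 0;
}
[/code]

Run the same saved benchmark on each driver at identical settings, and the delta line settles the argument for your own machine.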

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#1
Posted 05/31/2015 08:18 PM   
ibnlanizi said: very good points. Yes, apparently this is normal practice.

So let me ask you, as a consumer and a loyal customer of Nvidia throughout many years: is this something you think is acceptable? Are you willing to upgrade every 1-2 years just because your previous card, which you paid top dollar for, is no longer supported?

I think it's a BS practice that only contributes to them selling more cards, when in actuality older cards could stick around for much longer.

This makes me question whether I want to keep buying Nvidia cards, because in today's world we vote and voice our opinion with our $$$.

@Helixfax:
This is why I opened my big trap in the other thread: this idea that NVidia deliberately sabotages older cards is simply false. It's been repeated enough on the internet that people now think it's common business practice, and it just isn't true.

There usually comes a day when a given product is no longer going to be supported, but for video cards that day comes very late. Look at the most recent driver release notes: they removed features for GTX 2xx cards. That amounts to turning off those cards, making them no longer usable for anything new, although they will still run their current stuff just fine. And how old are those cards?

You absolutely do not have to upgrade every year or even every two years. It depends upon what you want to do. If you just want to run the games on your current setup, there is nothing here that is going to require a yearly upgrade cycle.


Maybe this all came up because of the new consoles. PCs used to be dramatically superior to consoles, which is where ALL games are targeted first. That gap has narrowed to a sliver right now, and so the newest games make our old hardware seem too slow.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#2
Posted 05/31/2015 08:26 PM   
I have :

- GTX 295 (first ever dual-GPU card ^_^). Still kicks ass even today! Runs rock solid (it's also water-cooled) and still produces good FPS (in 2D @ 1680x1050). Drivers are 340.65, I believe :)) Mostly used by my folks in the living room - gaming included :)

- 555M in a laptop. While the laptop is not 3D Vision ready, I can plug it into one of my monitors. I am using the 347.88 drivers and so far all the games are working (even DA:I on absolute lowest works correctly in 3D Vision, lewl). No problems.

- Old "trusty" GTX590 on my dad's PC (also equipped with 3D Vision 2 + Asus monitor). Using 314.22 drivers (best performance) + DRS from current 350.12 drivers. Still kicks ass (That SLI really makes a difference). He usually plays "Shooting" games and love FarCry 4 (big thank you all for the fix ^_^). Works even with 2xDSR factor for around 30Fps in 3D (Yahoo!!). Really pleased with the card.

- 880M in another laptop. Using 350.12 + DRS from 353.00, and 353.00 for DSR in other/older games (that 4K looks epic ^_^). No problems. However, both 3Dmigoto and Flugan's wrapper stop working correctly with Dragon Age: Inquisition on any driver after 350.12 for me (the fix either doesn't start or disables itself soon after).

- 2x 780 Ti in my desktop. Here I am jumping around quite a lot, since Nvidia BROKE Surround support (yay, another BUG) from 347.88 onwards. Using DRS from the 352.86 driver for obvious reasons (like playing GTAV; Witcher 3 requires at least 350.12 to work in SLI on a single monitor; Surround SLI is broken on all current drivers).

So overall I am still using the Fermi and Kepler architectures, but the drivers were, and are, a mess. Once I find a good driver I keep it in mind and use it as long as I can. If I need a special driver for a game to work (e.g. Witcher 3), I install that driver before playing the game :)) and revert to my "trusty" driver afterwards...
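If you juggle drivers like this, it helps to keep a log of which driver was active for each session. nvidia-smi (it ships with the driver) has a documented driver_version query; here's a tiny Windows sketch that records it - the log file name is my own invention:

[code]
// log_driver.cpp - records the currently installed NVIDIA driver version.
// Windows-specific (_popen); nvidia-smi ships with the NVIDIA driver.
#include <cstdio>

int main() {
    FILE* p = _popen("nvidia-smi --query-gpu=driver_version --format=csv,noheader", "r");
    if (!p) return 1;
    char buf[64] = {0};
    if (std::fgets(buf, sizeof(buf), p))
        std::printf("active driver: %s", buf);
    _pclose(p);
    // Append to a session log so you can match games to "trusty" drivers.
    if (FILE* log = std::fopen("driver_sessions.log", "a")) {
        std::fprintf(log, "%s", buf);
        std::fclose(log);
    }
    return 0;
}
[/code]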

It's a FREAKING SHAME that Microsoft decided NOT TO SUPPORT hardware profiles anymore (the good XP way), so we are forced to uninstall/reinstall a lot.

Frankly, the problem I am currently seeing is not related to the actual GPUs/hardware but to the drivers... We get more and more DX11 games where the 3D Vision driver doesn't work properly: Lords of the Fallen, Dragon Age: Inquisition, Witcher 3, GTAV (that damn disabling, anyone?), and the list probably continues...

I am also a bit "terrified" of the moment a new API hits the playground, as 3D Vision currently only works with DX9/DX10/DX11. (But I guess that's in the not-so-near future.)

So overall I am really pleased with NVIDIA GPUs and especially the tech! (I am still looking at buying a 4K monitor + G-SYNC, but currently I can only find TN panels.) The real problem is that their software has been getting less and less good (it's below average IMO - Shadow of Mordor on the latest drivers, anyone? Nope... broken). Things like this are not hardware-specific: it doesn't matter if you have a GTX 410 or a TITAN X, you use the same shitty software :(

Edit:
@bo3b:

Yup, I noticed the removal of the GTX 200 series from the release notes as well :)) While I still "have" a GTX 295 that works perfectly fine to this day, basically nothing new has been developed for that GPU in a long, long time. Btw, the GTX 200 series is DX9 and DX10 ONLY. No wonder they are dropping support - you can't play DX11 games on it anyway. With the new consoles, it seems a logical and NORMAL move!

Honestly, I moved from my GTX 590 to the 2x 780 Ti not because the 590 was obsolete! No no no... that CARD STILL KICKS ass even in 3D Vision on one monitor... In 3D Surround, however, there simply wasn't enough power in it! And I skipped the 600 series and the "700" series until the 780 Ti got released (basically the last card of the Kepler generation, where the hardware is refined and so on). Will I buy a 900 series? Definitely not! There is honestly no need. Will I buy a 1000 series (or whatever the name will be)? Maybe? Remains to be seen. I usually buy new cards when the "current" one can't produce good FPS (30+) on higher, or at least medium, quality settings :)) (No card today, SLI or not, can sustain 3D Surround at 60 FPS in modern games - not even the Titan X. So I have a medium taste in FPS, to put it like that ;)) )


Note:
DSR = Dynamic Super Resolution.
DRS = Nvidia profiles + settings stored as .bin (binary files) that come with each driver release.
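For the curious: those DRS profiles can be read programmatically through NVIDIA's public NVAPI. A minimal sketch, assuming you have the NVAPI SDK (nvapi.h plus its import library); the game profile name below is just an example and may not match your driver's store:

[code]
// drs_peek.cpp - reads the DRS (driver settings) store via NVAPI.
// Assumes the NVAPI SDK; error handling mostly trimmed.
#include <cstdio>
#include <nvapi.h>

int main() {
    if (NvAPI_Initialize() != NVAPI_OK) {
        std::printf("NVAPI not available.\n");
        return 1;
    }
    NvDRSSessionHandle session = 0;
    NvAPI_DRS_CreateSession(&session);
    NvAPI_DRS_LoadSettings(session);  // load the driver's settings store

    NvU32 numProfiles = 0;
    if (NvAPI_DRS_GetNumProfiles(session, &numProfiles) == NVAPI_OK)
        std::printf("settings store contains %u profiles\n", (unsigned)numProfiles);

    // Look up one game profile by an example name.
    NvAPI_UnicodeString name = {0};
    const wchar_t* game = L"Grand Theft Auto V";  // example; check your driver
    for (int i = 0; game[i] && i < NVAPI_UNICODE_STRING_MAX - 1; ++i)
        name[i] = (NvU16)game[i];

    NvDRSProfileHandle profile = 0;
    if (NvAPI_DRS_FindProfileByName(session, name, &profile) == NVAPI_OK)
        std::printf("found a profile for that game\n");
    else
        std::printf("no profile with that exact name\n");

    NvAPI_DRS_DestroySession(session);
    NvAPI_Unload();
    return 0;
}
[/code]

Tools like Nvidia Inspector are essentially this API with a UI on top.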

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

#3
Posted 05/31/2015 08:34 PM   
Now? The thread created minutes after we agreed to end the discussion?
:D

I disagree with bo3b here:
"NVidia deliberately sabotages older cards is simply false."


You can technically say that it isn't deliberate sabotage, since it's about something Nvidia doesn't do rather than something it does. But if what it should do is reasonably considered "required, needed, obvious", and doing it would cost Nvidia $100, then technically it's a "budget/time cut", but practically it's deliberate sabotage.
I might be wrong here, but I don't think giving the Kepler series the updates takes more than one hour of work by one single person.
Just as I don't believe that Nvidia needs to wait to release the fix, as they are doing now. They know that the longer they hide the truth about the 7xx series being faster than the most hyped game of the year (W3) shows, the more money they earn.
I believe that this is not a fake:
http://www.gamedev.net/topic/666419-what-are-your-opinions-on-dx12vulkanmantle/#entry5215019

and if it's not, then the flaws in game code, once noticed and corrected by Nvidia, could also easily be corrected in the Kepler family drivers, just as happens in most cases.

I'd really like to see some site doing a comparison. Let's say a year from now: take the first reviews of the GTX 9xx series, with 5-10 older games tested, and repeat the tests on both the older and the newer drivers, to be 100% sure the test is fair. Let's see whether the older games show the same differences they did (760 ~ 960 and 780 Ti ~ 970-980), or whether there were any real Maxwell driver optimizations.
Based on the similar moves I've seen from Nvidia within the same architecture, I'm pretty certain I can already predict the results of such tests.
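To make the proposed test concrete, here's a trivial sketch of the comparison itself - every number in it is a placeholder to be filled in from real launch-day and one-year-later reviews, NOT an actual result:

[code]
// kepler_vs_maxwell.cpp - has the 760-to-960 gap changed between driver
// releases? All FPS values below are placeholders, not real benchmark data.
#include <cstdio>

struct Result { const char* game; double fps760, fps960; };

static void report(const char* label, const Result* r, int n) {
    std::printf("%s\n", label);
    for (int i = 0; i < n; ++i)
        std::printf("  %-20s 960/760 ratio: %.2fx\n",
                    r[i].game, r[i].fps960 / r[i].fps760);
}

int main() {
    // Fill these in from launch reviews vs. retests a year later.
    Result launch[] = { {"Game A", 60.0, 66.0}, {"Game B", 45.0, 50.0} };
    Result later[]  = { {"Game A", 58.0, 68.0}, {"Game B", 44.0, 52.0} };
    report("launch drivers:", launch, 2);
    report("one year later:", later, 2);
    return 0;
}
[/code]

If the ratios drift upward on the same games with no game patch in between, that's driver-side; if they hold steady, the slowdown story doesn't survive contact with data.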

The other topic is the state of drivers nowadays. If what that "Promit" guy wrote about AAA devs making the most basic, stupid mistakes is true, then I really hope DX12 brings the chance to change all that. Nvidia and AMD should say: "No, f..k it. You made your game run at 20fps on a $1000 card*, so you deal with it - see if your clients like that, or sit down and improve your code."

(* just an example, not related to any real situation)

If "Promit" said the truth, then we really need more discipline across the devs worldwide, so there's no need to update driver every few month. It should really be enough to release one driver version per year, with exceptions to new architectures of course since early bugs always happen.
The unfair compatition between AMD and Nvidia would end, consumers would benefit, and then AMD and Nvidia would benefit from this.
At the end, even devs would benefit from this, after initial backlash about forcing them to work harder, of course ;) (or rather pursuading their publishers to give them more time for the required work, since let's be honest, most of code f.. ups are not because the coder was lazy, but because their boss (the publishers) says "20fps and no native keyboard&mouse support is good enough, 10 crashes per 5 hours of gameplay is perfectably acceptable".

It's not normal that we have had an architecture for 2 years, a new game comes out, and it needs new drivers because it runs 20-50% slower without them.

I think this is what caused AMD to lose market share.
Quoting John Carmack's Twitter: "Vulkan, save us!" Or DX12. The chance for change is here; let's hope the industry takes it.

#4
Posted 05/31/2015 09:06 PM   
[quote=""]Now? The thread created minutes after we agreed to end the discussion? :D [/quote] Well;)) Is a bit different discussion here than on the other thread:)) [quote=""] I think this is what caused AMD to loose the market share. Quoting John Carmack's twitter: "Vulcan, save us!". Or DX12. The chance for changes is here, let's hope industry takes it. [/quote] TOTALLY AGREE!! I really am in favor of OGL (Vulcan will technically be OpenGL 5) more than DX12 since how many "optimized" driver did you require for OpenGL games anyway? (NONE)... But not likely to happen as long as DX is created by a corporation and OpenGL is well OPEN ;)) EDIT: Big big big thanks for the LINK! EVERYONE SHOULD READ THAT!!! More over it explains very well why this is more likely to be a bug in the Kepler section of the driver rather than a "feature on purpose" :) Awesome read!
Quote: Now? The thread created minutes after we agreed to end the discussion?
:D


Well ;)) It's a bit of a different discussion here than on the other thread :))

Quote:
I think this is what caused AMD to lose market share.
Quoting John Carmack's Twitter: "Vulkan, save us!" Or DX12. The chance for change is here; let's hope the industry takes it.


TOTALLY AGREE!! I really favor OGL (Vulkan will technically be OpenGL 5) over DX12 - after all, how many "optimized" drivers did you ever need for OpenGL games? (NONE)... But that's not likely to win out as long as DX is made by a corporation and OpenGL is, well, OPEN ;))


EDIT: Big, big, big thanks for the LINK! EVERYONE SHOULD READ THAT!!!
Moreover, it explains very well why this is more likely a bug in the Kepler section of the driver rather than a "feature on purpose" :)

Awesome read!

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

#5
Posted 05/31/2015 09:52 PM   
So, to get the "full benefits" of DX12, you need a 9xx series card? :D

#6
Posted 06/01/2015 03:57 AM   
I saw an image showing that the 9xx v1 series supports Tier 1, 9xx v2 supports Tier 2, and AMD supports Tier 3 in DX12.

I've no idea what that means but you can see the image at https://i.imgur.com/o4wOYxm.png?1


There are a lot of yeses in the AMD column.
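As far as I can tell from that chart, those tiers are the D3D12 resource binding tiers, and once Windows 10 and its SDK are out you can ask your own card which tier it reports. A minimal sketch using the documented CheckFeatureSupport call - error handling trimmed:

[code]
// d3d12_tiers.cpp - queries the D3D12 resource binding / tiled resources
// tiers on the default adapter. Needs the Windows 10 SDK; links d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    ID3D12Device* device = nullptr;
    // Create a device on the default adapter at the minimum feature level.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12-capable device/driver found.\n");
        return 1;
    }
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &opts, sizeof(opts)))) {
        std::printf("resource binding tier: %d\n", (int)opts.ResourceBindingTier);
        std::printf("tiled resources tier:  %d\n", (int)opts.TiledResourcesTier);
    }
    device->Release();
    return 0;
}
[/code]

The enum values map directly onto the tier numbers in charts like that one (1, 2, 3).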

To add to the Nvidia conspiracy theory, I've noticed a lot more screen tearing since G-sync was announced. Surely they did this because they want us to buy a new monitor also.

#7
Posted 06/01/2015 04:14 AM   
I haven't noticed any slowdowns. I have, however, been experiencing more crashes and other issues related to the new drivers, and I'm also pissed about all these features they come out with and promote, then don't actually support or fix. Surround is still a mess.

I think the Nvidia driver team has lost the plot, and they need to go back to separate drivers for each generation.

i7-4790K CPU 4.8Ghz stable overclock.
16 GB RAM Corsair
EVGA 1080TI SLI
Samsung SSD 840Pro
ASUS Z97-WS
3D Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)
Obutto R3volution.
Windows 10 pro 64x (Windows 7 Dual boot)

#8
Posted 06/01/2015 06:08 AM   
Well, I'm coming in late to this discussion, so I don't really know what's going on, but to answer the original question posed - my actual experience in terms of support? I think I would probably stick with Nvidia even if 3D Vision were completely killed off. I always used AMD (actually it was still ATI when I was using the 'red' products), but years ago I did a build with a couple of 450s in SLI and have stuck with NVidia ever since. I have never been happier with a card than with my GTX 690, and so far I haven't felt the need to upgrade.

We will see once Arkham Knight comes out... That'll be the gut check. If it runs like crap, I will start feeling the urge to upgrade. But if it works in 3D Vision, or at the very least if someone can fix it to work in 3D Vision, there's a ZERO percent chance that my upgrade would be to an AMD card.

|CPU: i7-2700k @ 4.5Ghz
|Cooler: Zalman 9900 Max
|MB: MSI Military Class II Z68 GD-80
|RAM: Corsair Vengence 16GB DDR3
|SSDs: Seagate 600 240GB; Crucial M4 128GB
|HDDs: Seagate Barracuda 1TB; Seagate Barracuda 500GB
|PS: OCZ ZX Series 1250watt
|Case: Antec 1200 V3
|Monitors: Asus 3D VG278HE; Asus 3D VG236H; Samsung 3D 51" Plasma;
|GPU:MSI 1080GTX "Duke"
|OS: Windows 10 Pro X64

#9
Posted 06/01/2015 12:52 PM   