The next generation of consoles is coming soon, and it will heavily impact next year's PC games. Those consoles will have significantly more video RAM than most current video cards. For that reason, I'd suggest getting a card with at least 3GB of video memory, preferably more. That probably means a Titan or one of the GTX 7xx cards. Or wait for Nvidia's GTX 8xx series, which is likely to be significantly stronger than anything available today, as it will use a brand-new architecture IIRC.
I wouldn't bother going AMD unless you're on a budget. They've cleaned up their act a bit in recent times, but they still tend to make weaker cards (they currently don't have anything that even comes close to Nvidia's flagship cards), and tend to have more deficiencies than Nvidia (microstuttering in CrossfireX, belated driver updates, problems with PhysX, missing a few bells and whistles such as adaptive vsync or TXAA-type antialiasing etc.).
Having said that, the next consoles will be running AMD hardware, which means almost all games will be designed primarily with AMD hardware in mind. In theory, that could mean PC games will run better on AMD graphics cards. In practice, probably not, though expect some games to need a week or two before Nvidia gets optimised drivers out.
I think Nvidia is still the better option. The CPU side is a similar story: the consoles will also have AMD CPUs, but most people would agree that Intel CPUs are still the better option for PC gamers, since AMD's current CPUs are famously weak.
But yeah, the bottom line is: get something strong, with plenty of VRAM. Otherwise, it'll likely start feeling old this time next year, as games are about to jump up in their tech requirements.
It's my belief that what really matters is the software, not the hardware.
I think a better take on the question is "What games do you want to play?"
If you have something specific in mind, there's a chance that AMD with TriDef does a better job.
If you don't have a specific game in mind, and just want a wider variety, it's pretty clear that 3D Vision is the way to go. Here's a decent comparative review:
[url]http://techreport.com/review/22350/pc-gaming-in-3d-stereo-3d-vision-2-vs-hd3d[/url]
[quote="Volnaiskra"]The next generation of consoles is coming soon, which will heavily impact next years' PC games. Those consoles will have significantly more video RAM than most current video cards. For that reason, I'd suggest definitely getting a card with at least 3GB of video memory - preferably more. That probably means a Titan, or one of the GTX 7xx cards. Or wait till Nvidia's GTX 8xx series, which is likely going to be significantly stronger than anything available today, as it will use a brand new architecture IIRC.
[/quote]
I don't think that's how it's going to be. For more video memory to be useful, the card has to actually be fast enough to use it; take a look at the benchmarks for 2GB vs 4GB 6xx cards and there's zero difference, and with console hardware that's 100% the case. Plus, the consoles are limited to 1080p, and console games are unlikely to use AA at all (it isn't really needed at typical console viewing distances), so more video memory isn't going to be needed any time soon, especially given the 1080p limitation of 3D. There's also the fact that all these new console games are going to produce a swarm of DX11 titles, in which, as we know, MSAA is utterly useless. Supersampling will be the primary AA for the foreseeable future, and it doesn't have anything like the VRAM hit of MSAA.
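To put some rough numbers on the render-target side of this, here's a back-of-the-envelope sketch (my own illustration, not from any benchmark): it assumes 32-bit colour and 32-bit depth buffers, with MSAA storing one colour and one depth sample per pixel per sample plus a resolved copy. Real engines add more targets, so treat it as a lower bound.
[code]
# Rough render-target memory at a given resolution (illustrative sketch).
# Assumes 4-byte colour and 4-byte depth; MSAA multiplies both by the
# sample count and needs an extra non-MSAA resolve target.

def render_target_mib(width, height, msaa_samples=1, bytes_per_pixel=4):
    pixels = width * height
    colour = pixels * bytes_per_pixel * msaa_samples
    depth = pixels * bytes_per_pixel * msaa_samples
    resolve = pixels * bytes_per_pixel if msaa_samples > 1 else 0
    return (colour + depth + resolve) / (1024 ** 2)

for samples in (1, 2, 4, 8):
    print(f"1080p, {samples}x MSAA: ~{render_target_mib(1920, 1080, samples):.0f} MiB")
[/code]
Even at 8x MSAA the raw 1080p targets come to well under 200 MiB; deferred renderers add several more G-buffer targets on top, but either way the framebuffers themselves are not what fills multiple gigabytes of VRAM.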
[quote="Cookybiscuit"][quote="Volnaiskra"]The next generation of consoles is coming soon, which will heavily impact next years' PC games. Those consoles will have significantly more video RAM than most current video cards. For that reason, I'd suggest definitely getting a card with at least 3GB of video memory - preferably more. That probably means a Titan, or one of the GTX 7xx cards. Or wait till Nvidia's GTX 8xx series, which is likely going to be significantly stronger than anything available today, as it will use a brand new architecture IIRC.
[/quote]
I don't think thats how its going to be. For more video memory to be useful, the card has to actually be fast enough to use it, take a look at the benchmarks for 2GB vs 4GB 6xx cards, theres 0 difference. With console hardware, this is 100% true. Plus, with the fact that they are limited to 1080p, and console games will unlikely be using AA at all (not needed due to typical console sitting distance), more video memory isn't going to be needed any time soon, especially with the 1080p limitation of 3D. Plus, theres also the fact that all these new console games is going to make for a swarm of DX11 titles, which, as we know, MSAA is utterly useless in. Super sampling will be the primary AA for the foreseeable future, and it doesn't have anything like the VRAM hit of MSAA.[/quote]VRAM impacts more than just resolution or AA. I wasn't even thinking about either of those things.
For example, more VRAM allows higher resolution textures. Most games today have crappy low-res textures, because that's all that the consoles can handle.
Sometimes, PC mods let us use higher-resolution textures (e.g. the various HD texture mods for Skyrim, or the official high-res texture pack for Crysis 2). Typically, whether or not you can happily run these mods depends on your amount of VRAM, not on the actual horsepower of your card. They don't usually slow down your FPS much, unless you run out of VRAM, at which point it becomes a bottleneck and the FPS plummets. (That's where you'd see big differences between those 2GB and 4GB 6xx cards.)
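For a rough sense of why texture resolution eats VRAM so quickly, here's a back-of-the-envelope sketch (my own illustration; the 4/3 mip-chain factor and the bytes-per-texel figures for uncompressed RGBA8 vs BC3/DXT5 compression are standard, but actual engine budgets vary):
[code]
# Illustrative texture-memory estimate: a full mip chain adds roughly 1/3
# on top of the base level. Compare uncompressed RGBA8 (4 bytes/texel)
# with BC3/DXT5 block compression (1 byte/texel).

def texture_mib(size, bytes_per_texel):
    base_bytes = size * size * bytes_per_texel
    return base_bytes * 4 / 3 / (1024 ** 2)  # base level + mipmaps

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mib(size, 4):.0f} MiB RGBA8, "
          f"~{texture_mib(size, 1):.0f} MiB BC3")
[/code]
Going from 1K to 4K is a 16x jump per texture, so even with compression a scene using a few hundred unique high-res textures can swallow gigabytes on its own, which is exactly when the FPS cliff described above shows up.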
In-game textures will almost certainly become much higher-res next year, and where that happens, VRAM will become a common bottleneck. I suspect many other things will put pressure on VRAM next year too, such as larger levels and more NPCs.
I'd think it's common sense that if you want the best chance of smoothly running games built for next-gen consoles, you should have a PC that matches or exceeds the specs of those consoles.
As we know, the actual horsepower of those consoles is already going to be lower than that of today's high-end PCs (in terms of teraflops etc.). One of the few areas where the consoles' specs are significantly greater than today's PCs is VRAM: the PS4 will have 8GB of unified memory (RAM + VRAM).
Therefore, as long as it has lots of VRAM, one of today's high-end cards should be able to handle anything 2014 throws at it (on paper, at least).
A fair point. I'm not so sure how much emphasis will be placed on texture quality, though. I played consoles exclusively up until about a year ago, and only noticed how terrible the textures in the games I used to play were once I moved to a desktop environment.
Did you actually think that a GeForce forum would recommend AMD? (ducking)
I've used both. They both have their good and bad points, some of which have been pointed out by others in this thread.
Since you've already purchased a projector, which card(s) to get might be dictated by which brand is more compatible with it. That said, if your projector is not on the officially supported Nvidia device list and you're planning on using HDMI, a number of us have 'tricked' the drivers with custom .INF files to get it working. Without looking at your specific model, as long as it supports the standard HDMI 1.4a formats, I think you'll be covered.
IMHO, the best possible S3D experience might be delivered by dual high-resolution 2D projectors with polarizing filters on the lenses, driven by one dedicated GPU per projector.
AMD/TriDef has a lot more output options and doesn't seem to be 'locked in' to any one technology. I think of their solutions as 'open', built on open industry standards.
Nvidia, on the other hand, likes to limit and restrict what you can do with their cards for marketing and other reasons. Granted, a full set of 3D Vision-approved hardware and software, when it works, looks much better than the popular AMD/TriDef solutions, but I think of Nvidia as more of a 'closed system' that uses only their own technology.
S3D technology: AMD has pretty much 'outsourced' their solution to TriDef. TriDef has a dedicated website and a team of people who create profiles for popular titles, plus a community that provides 'tweaks' and patches to get the best S3D results. However, there are often long delays before a 'beta' profile is released, and even longer before it's really 'usable' or 'playable' without visual defects. Some defects are never fixed.
Nvidia used to have a more visible S3D support role and often offered technical support to developers building for 3D Vision, but they have been rather MIA of late. While we still get a few '3D Vision games' a year, in many if not most cases whatever '3D Vision readiness' rating Nvidia gives a game is a sad joke rather than an accurate statement of its S3D visual quality, and, much like in the AMD/TriDef camp, some issues are never fixed by the developer or Nvidia. This is where a handful of dedicated S3D gamers come to the rescue and fix many games so that they work with 3D Vision. Without their hard work and support, many games would not be playable as-is, or would look far from perfect.
Both camps (AMD and Nvidia) do tend to get at least one S3D-worthy release out a year that requires nothing more than minor patches to look good in S3D.
Performance: because it's a 'single-source solution', this is where Nvidia has an edge. Games that utilize 3D Vision require fewer resources than games running TriDef. An added bonus is that if you add SLI to the mix, games that were developed with SLI in mind perform even better.
If you're looking for the most display compatibility (excluding 3D Vision-branded monitors), go with AMD. If you want performance, single-company support (if and when it's there), and a robust group of talented and dedicated S3D game fixers, go with Nvidia. (The TriDef driver will also work on Nvidia hardware, with limitations and drawbacks, of course.)
I've been gaming on and building my own computers, and have worked in the industry (design, validation and test), since the early 1980s. I've owned countless brands of video cards over the years; most of the brands I've used no longer exist. The major reason I'm invested in Nvidia hardware right now is very likely that the last ATI card I owned died of overheating, because they stuck the cheapest sleeve-bearing fan they could find on a $400+ video card.
[quote="Cookybiscuit"]A fair point. I'm not so sure how much emphasis will be placed on texture quality though, I played console exclusively up until about a year ago, and only noticed how terrible textures are in the games I used to play once I was in a desktop environment.[/quote]That's a good point. I guess it's like AA in that mostly PC gamers care about it.
Though I think we're all so used to very low-res textures that most of us just take them for granted. And devs probably design games in ways that hide the low-res textures most of the time. But maybe that'll change.
Imagine, for example, a game where you walk up to an open book on a table, and you can actually read all of the text on the page. That would be pretty sweet for immersion, and would make a huge difference even to the console experience, but you'd almost never see it happen in today's games. But maybe that sort of thing will become commonplace in next-gen games.
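Just to put an illustrative number on the book example (my own sketch; the page size and the 'readable' texel densities are assumptions, not figures from any game): rendering a page so the text is legible up close needs something approaching print-like pixel density, and that adds up fast.
[code]
# Illustrative sketch: texture memory for a readable ~6x9 inch book page,
# uncompressed RGBA8 (4 bytes/texel) with a full mip chain (+1/3).
# The texel densities below are arbitrary comparison points.

def page_texture_mib(width_in, height_in, texels_per_inch, bytes_per_texel=4):
    texels = (width_in * texels_per_inch) * (height_in * texels_per_inch)
    return texels * bytes_per_texel * 4 / 3 / (1024 ** 2)

for tpi in (50, 150, 300):  # 300 texels/inch is roughly print-like
    print(f"{tpi} texels/inch: ~{page_texture_mib(6, 9, tpi):.1f} MiB for one page")
[/code]
Roughly 25 MiB uncompressed for a single readable page is trivial for an 8GB console but a real cost on the current 512MB-class machines, which is one reason that kind of detail has been so rare.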
3D Vision 2 user here, and when it's working it's bloody amazing; the 3D is out of this world.
Everything is super crisp and just looks stunning. So it's pure awesomeness, thanks mostly to Helix and the other great dudes making these patches.
Of course, I would have liked Nvidia to support their own tech, but they don't give a crap about it any more and there's not a single word from them over here, which is why, among other things, I think they're douchebags. But hey, the 3D and the stability of Nvidia cards are amazing :)
Thanks to everyone for your valuable opinions and experience. At this point I'm 80/20 for Nvidia/AMD.
Cheers
[quote="mbloof"]Did you actually think that a geforce forum would recommend AMD? (ducking)
I've used both. The both have their good and bad points. Some of those have been pointed out by others in this thread.
Since you've already purchased a projector which card(s) to get might be dictated by which brand is more compatible with your projector. With that said, if you projector is not on the officially supported Nvidia device list and your planning on using HDMI, a number of us have 'tricked' the drivers with custom .INF files to get it to work. Without looking at the specific model you have, as long as it supports standard HDMI 1.4a formats, I think you'll be covered.
IMHO what could possibly be the best S3D experience might be delivered by dual high resolution 2D projectors with polarizing filters on the lenses and driven by 1 dedicated GPU per projector.
AMD/TriDef has alot more output options and don't seem to be 'locked in' to any one technology. I think of their solutions as 'open' using open industry standards.
Nvidia on the other hand likes to limit and restrict what you can do with their cards for marketing and other reasons. Granted a full set of 3DVision approved Hw/Sw when it work looks much better than the popular AMD/Tridef solutions but I think of Nvidia as more of a 'closed system' using only their technology.
S3D technology: AMD has pretty much 'outsourced' their solution to TriDef. They have a dedicated website and team of people that work to create profiles for popular titles and have a community that also provides 'tweaks' and patches to get the best S3D results. However, there are often long delays before a 'beta' profile is released and even longer before its really 'usable' or 'playable' without visual defects. Some defects are never fixed.
While Nvidia did have in the past a more visible S3D support role and often offered technical support to developers creating for the 3Dvision solution, they have been rather MIA of late and while we still get a few '3DVision games' a year, in many/most cases whatever '3DVision readyness' Nvidia rates a game as having is a sad joke rather than an accurate statement of S3D visual worthyness and much like the AMD/Tridef camp, some issues are never fixed by the developer+Nvidia. Here's where a handful or less of dedicated S3D gamers come to the rescue to fix many games so that they will work with 3DVision. Without their hard work and support many games would not be playable asis or not look perfect or nearly perfect.
Both camps (AMD and Nvidia) do tend to get at least one S3D worthy release out a year that don't require anything or nothing more than minor patches to look good in S3D.
Performance. Because of the 'single source solution' here's where Nvidia has an edge. Games that utilize 3DVision require less resources than games running Tridef. An added bonus is that if you add SLI to the mix, games that were developed with SLI in mind perform even better.
If your looking for the most display compatibility (excluding 3DVision branded monitors) go with AMD. If you want performance, single company support (if and when its there) and a robust group of talented and dedicated S3D game fixers go with Nvidia. (the Tridef driver will work with (of course) limitations and drawbacks on Nvidia hardware)
I've been gaming/building my own computers and have worked in the industry (design, validate and test) since the early 1980's. I've owned countless brands of video cards over the years. Most of the brands I've used no longer exist. Its very likely that the major reason behind me being invested in Nvidia hardware right now is because the last ATI card I owned died because of overheating due to them sticking the cheapest sleave bearing fan they could find on a $400+ video card.[/quote]
Thanks to everyone for your valuable opnions and experience. I'm at this point 80/20 for Nvidia/AMD.
Cheers