Should I buy a Titan X for a 1440p ROG Swift? Does VRAM usage go up in 3D? SLI issues in 3D?
Hi guys, just looking for some opinions.
So basically I've heard that the ROG Swift does not work properly in SLI, but supposedly Nvidia has identified the problem and is working on a fix. Now, bearing that in mind...
In Canada, 980 SLI and a Titan X cost about the same.
There are games now that require more than 4GB of VRAM (Watch Dogs, a few others) in 2D. Does VRAM usage go up in 3D? The Titan X has 12GB, so I assume that will never be an issue, especially with future games like GTA V, which will most likely use a LOT of VRAM. I will most likely not be using anti-aliasing in 3D; how much VRAM would I save by not using anti-aliasing?
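For reference, the raw render-target cost of MSAA can be ballparked. This is a simplified sketch that ignores deferred-rendering G-buffers and post-process AA like FXAA, so treat the numbers as rough orders of magnitude:
[code]
# Rough MSAA render-target cost at 2560x1440: multisampled color +
# depth targets, plus the resolved 1x color buffer.

def msaa_targets_mib(width, height, samples, bytes_color=4, bytes_depth=4):
    multisampled = width * height * samples * (bytes_color + bytes_depth)
    resolved = width * height * bytes_color  # resolve target stays at 1x
    return (multisampled + resolved) / (1024 ** 2)

for s in (1, 2, 4, 8):
    print(f"{s}x MSAA: ~{msaa_targets_mib(2560, 1440, s):.0f} MiB")
# 1x: ~42 MiB, 2x: ~70 MiB, 4x: ~127 MiB, 8x: ~239 MiB
[/code]
In this simple model, skipping 4x MSAA at 1440p saves on the order of 85 MiB of render targets: noticeable, but small compared to what high-res textures consume.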
Have you guys found that SLI works worse in 3D than in 2D? There are no benchmarks that compare SLI in 3D vs. SLI in 2D.
An overclocked Titan X looks to be just a little slower than 980 SLI (non-overclocked).
What would you recommend: 980 SLI or Titan X?
I have an i7-3930K. Would that bottleneck me in any way?
thanks
SLI is great for 3D, but...
SLI is a joke nowadays. Don't let anyone convince you otherwise. Maybe 1 in 5 AAA games from the last two years have support without issues.
TBH, I think a lot of people gave up on it. If you go into the SLI section of the forums, it's a ghost town, and most of the people who post there are regulars/diehards.
VRAM goes up.
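That fits the mechanics: stereo 3D renders every frame twice, so per-eye render targets get duplicated while textures and geometry stay shared. A rough sketch of that overhead, assuming only a handful of full-screen surfaces double (the exact count varies by engine):
[code]
# Why 3D adds VRAM: the second eye needs its own copies of full-screen
# render targets; textures and meshes are shared between both eyes.

def stereo_extra_mib(width, height, n_targets=4, bytes_per_pixel=4):
    # n_targets is a guess: back buffer, depth buffer, and a couple of
    # intermediate post-process targets.
    return width * height * bytes_per_pixel * n_targets / (1024 ** 2)

print(f"1080p: ~{stereo_extra_mib(1920, 1080):.0f} MiB extra")  # ~32 MiB
print(f"1440p: ~{stereo_extra_mib(2560, 1440):.0f} MiB extra")  # ~56 MiB
[/code]
That is small in absolute terms, which matches reports later in this thread of 3D using only "a bit more" VRAM than 2D.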
I've compared 2D and 3D SLI performance here: http://www.volnapc.com/all-posts/3d-and-sli-performance-tested
As you can see, the scaling in 3D is usually very nice, and better than in 2D. But I would also caution against SLI, though mainly because of the microstuttering.
All multi-GPU solutions are prone to microstuttering, and in some games it's very minor, while in others it's really annoying. For example, I played all of Watch_Dogs in single-GPU mode, because it was just a stuttering mess in SLI. (I didn't even bother trying to play in 3D, because the performance was already pretty lousy in 2D).
Many on this forum disagree, but my belief is that you should get the absolute best single GPU you can afford. And if the performance you get is not quite enough, then - and only then - you should save up and get a 2nd for SLI. SLI on its own isn't a dependable enough solution. It should be treated as the icing on the cake, rather than the cake itself.
That was the strategy I took in 2013 with my Titans (instead of going 680 SLI, I bought one Titan, and later on added another), and it's worked out very well for me. 2 years later, I still have near-top-end performance with my Titan SLI, but whenever SLI isn't desirable (because of microstuttering or bad driver support), I can fall back on a single Titan, which is still a beast. If I had bought 2 680s instead, I would be left today with a mediocre system that struggles with new games.
However, the timing was different for me than for you. In the past 2 years, barely anything has changed in the GPU space. But 2 years from now, the true next generation of Nvidia cards will exist, probably built on a 16nm process, and they'll probably be much more powerful than anything today.
VRAM
When it comes to VRAM: the more the better. Few people agreed with me when I used to say this 2 years ago, but now that the new consoles have hit, everyone realises just how punishing a lack of VRAM can be. It's one of those things that doesn't have much impact until it maxes out at 100%, which is when your FPS takes a nosedive. You basically never want it to max out.
At the very least, you want your new system to keep up with the consoles. They have 8GB combined memory (RAM + VRAM), so getting cards with only 4GB VRAM isn't giving you much of a buffer. PC games' VRAM usage is only going to keep increasing.
The main thing about VRAM is that it will enable you to select high-res textures in games. We're finally at the point where every texture in a game can be shown at 1:1 resolution rather than all blown up and blurry. And this makes a BIG difference. Shadow of Mordor with the Ultra texture DLC is the first game I've seen like this, and it was a joy to behold. It makes me really glad that I invested in 6GB VRAM cards two years ago, even though back then it was a bit overkill.
Microstuttering is real, but I play a lot of games in 3D with my SLI setup and it's not really a problem for me. And yes, 3D does use a bit more VRAM; I don't know why. I can't remember many issues with SLI; the last problem I can recall is Dying Light in 3D, where some objects wouldn't appear. I play mostly AAA games, so maybe that helps... It's true that some games don't use SLI at all; to make them work, Nvidia has to release new drivers with better profiles. FYI: I've had Gigabyte 570 SLI (1.3GB), Asus 680 SLI (4GB), and now MSI 980 SLI (4GB).
Watch Dogs in 3D, maxed out at 1080p with no AA filter, on 680 SLI: no problem. The FPS counter is at the top center. I have since replayed the game on the new 980 SLI for comfort (restart your web browser if the video won't start):
https://www.youtube.com/watch?v=gnlDYu8DA38
My 3D videos:
https://www.youtube.com/channel/UC8i2iKj7QmCXtwGM0x8IG3A
SLI gives you a better framerate, but not 2x; still, it really helps for 3D. One thing to know: a 40fps average on a single card is OK, but in SLI it really isn't, there are too many drops, so be aware of that.
[img]http://images.anandtech.com/graphs/graph9059/72511.png[/img]
http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/11
As for VRAM: the bus is doubled in SLI, so a 980 with "only" 4GB can easily swap textures in and out if needed. Most of the time, VRAM is filled with the textures of the loaded level even if they'll never be used...
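Whatever the exact bus arithmetic, the cost of swapping one texture over PCIe is easy to ballpark. A sketch assuming PCIe 3.0 x16 at roughly 15.75 GB/s of usable bandwidth (an assumption; real streaming throughput is usually lower):
[code]
# Ballpark PCIe transfer time for streaming a texture into VRAM.
# Assumes PCIe 3.0 x16 at ~15.75 GB/s usable; actual throughput varies.

PCIE3_X16_GBPS = 15.75

def transfer_ms(size_mib):
    size_gb = size_mib * (1024 ** 2) / 1e9
    return size_gb / PCIE3_X16_GBPS * 1000

print(f"{transfer_ms(5):.2f} ms")   # typical compressed 2K texture: ~0.33 ms
print(f"{transfer_ms(43):.2f} ms")  # large compressed 8K texture: ~2.86 ms
[/code]
A few milliseconds per large texture is why streaming works fine when only a few textures change per second, and why it collapses into stutter when the whole working set has to thrash in and out.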
Here's some per-game VRAM usage:
[img]http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_Titan_X/images/memory.gif[/img]
[i]"Our results clearly show that 12 GB is overkill for the Titan X. Eight GB of memory would have been more than enough, and, in my opinion, future-proof. Even today, the 6 GB of the two-year-old Titan are too much as 4 GB is enough for all titles to date. Both the Xbox One and PS4 have 8 GB of VRAM of which 5 GB and 4.5 GB are, respectively, available to games, so I seriously doubt developers will exceed that amount any time soon.
Modern games use various memory allocation strategies, and these usually involve loading as much into VRAM as fits even if the texture might never or only rarely be used. Call of Duty: AW seems even worse as it keeps stuffing textures into the memory while you play and not as a level loads, without ever removing anything in the hope that it might need whatever it puts there at some point in the future, which it does not as the FPS on 3 GB cards would otherwise be seriously compromised."[/i]
Source:
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/33.html
My English is not very good, but I hope I've helped.
In the short look I took at the Anandtech review, the Titan X is pretty soundly spanked by SLI 980 in their game tests:
http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review
Having noted that, I agree with eqzitara that SLI is a lost art. There are fewer titles where it works properly, especially in 3D.
Tough call.
[quote]All multi-GPU solutions are prone to microstuttering, and in some games it's very minor, while in others it's really annoying. For example, I played all of Watch_Dogs in single-GPU mode, because it was just a stuttering mess in SLI. (I didn't even bother trying to play in 3D, because the performance was already pretty lousy in 2D). [/quote]
Oh man! I'm sorry to hear that. WatchDogs is one of our best fixes. The solution to the stuttering was to go back to an older driver, 320.49. With that driver, I got full SLI scaling and no stutter. You could have run it with your Titans.
[quote="bo3b"]
Oh man! I'm sorry to hear that. WatchDogs is one of our best fixes. The solution to the stuttering was to go back to an older driver, 320.49. With that driver, I got full SLI scaling and no stutter. You could have run it with your Titans.[/quote]Ha! I didn't know about the driver thing. That's a shame. Oh well, I find 2D quite tolerable if I haven't played 3D in a while (which was the case then). It's mainly when I go immediately from 3D to 2D that my eyes start burning.
But I suspect that there still would have been some microstuttering even with the different driver. I notice it in SLI more often than not (different people have different sensitivities - Dugom for example seems to be less prone to it).
[quote]"Our results clearly show that 12 GB is overkill for the Titan X. Eight GB of memory would have been more than enough, and, in my opinion, future-proof. Even today, the 6 GB of the two-year-old Titan are too much as 4 GB is enough for all titles to date. Both the Xbox One and PS4 have 8 GB of VRAM of which 5 GB and 4.5 GB are, respectively, available to games, so I seriously doubt developers will exceed that amount any time soon.[/quote]
This conclusion really is nonsense, and the person who wrote it should know better. It's basically like that famous (and probably apocryphal) Bill Gates line: "640K ought to be enough for anybody".
For a start, it says that 4GB is enough for all titles, yet the graph shows 4(!) games that contradict that. Those 4 games are just the tip of the iceberg, as we're still at the beginning of the console cycle, and true 'next-gen' games have barely begun to hit the market.
And then he says that he doubts that devs will go above the console specs any time soon. Seriously? Find me a game in the last 5 years that [i]didn't[/i] go over the console specs! PC games almost always have options for more resolution, more AA, higher res textures, etc. And these things keep on climbing on PC even while consoles stay the same. If 4GB VRAM is barely enough to max out today's games, you can bet that it won't stand a chance in a couple of years.
Ok, take a look at this image. (Yes, it's a totally shameless plug of my own upcoming mod)
[IMG]http://i58.tinypic.com/2q2mlo1.jpg[/IMG]
The top image shows you the level of the last generation consoles. Skyrim was designed for machines that had 256-512MB of combined memory, and this blurry travesty was the result.
Modders quickly made high res textures for the PC version. Most of those textures were 2K x 2K resolution, sometimes 4K x 4K, and pushed many people's systems to the limit.
The bottom image is from my upcoming mod. It's a whopping 8K x 4K resolution. I chose this huge size because these murals are both very big in-game and quite interesting (i.e. I probably wouldn't use an 8K texture for something more mundane like a bench or a patch of ground).
But here's the thing: [b][i]This 8K x 4K texture - 16x the resolution of the vanilla version - is still not enough for a 1:1 crisp in-game image![/i][/b] When you walk right up to it in first-person, it still gets a bit stretched and blurred... and this is on my 1080p screen. 4K resolutions and beyond will require even higher-res textures to get optimal crispness.
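To put numbers on what textures like this cost, here's a back-of-envelope estimate of a single texture's VRAM footprint. It's a sketch assuming standard block compression and a full mip chain; the details vary by engine:
[code]
# Rough VRAM footprint of one texture, in MiB.

def texture_mib(width, height, bytes_per_pixel=4, compression_ratio=1.0,
                mipmaps=True):
    # compression_ratio: 1.0 = uncompressed RGBA8, ~4.0 = DXT5/BC3,
    # ~8.0 = DXT1/BC1 (typical block-compression ratios).
    size = width * height * bytes_per_pixel / compression_ratio
    if mipmaps:
        size *= 4 / 3  # a full mip chain adds about a third
    return size / (1024 ** 2)

print(f"{texture_mib(2048, 2048, compression_ratio=4):.1f} MiB")  # 2K BC3: ~5.3
print(f"{texture_mib(8192, 4096, compression_ratio=4):.1f} MiB")  # 8Kx4K BC3: ~42.7
print(f"{texture_mib(8192, 4096):.1f} MiB")                       # 8Kx4K raw: ~170.7
[/code]
A few dozen 8K-class textures in memory at once and a 4GB card is already under pressure, which is exactly the point.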
So you can bet that while the consoles remain at 8GB combined memory for this cycle, PC games (and modders) will continue to push the bar, because there's plenty of juice left in using more VRAM. PC games' hardware requirements will eventually leave the consoles' in the dust, as always. (Just imagine what mods for Fallout 4 or Elder Scrolls 6 will be like!)
There's another important factor: the new consoles are 64bit. Before, consoles were 32bit, so most PC ports would be 32bit. As a result, most of us had way more RAM than our games actually used, and VRAM usage wasn't pushed much either. Skyrim is still 32bit, and no number of mods can change that. But now, games are becoming 64bit, so all that RAM and VRAM potential is finally unlocked for devs to use, and the sky's the limit.
As everyone surely knows from personal experience, the time it takes PC hardware to go from 'overkill' to 'obsolete' isn't very long.
http://i.imgur.com/w036PxI.png
Here's a big series of benchmarks of the Titan X at 1440p (and 4k).
Some games run quite well, but not all. Crysis 3, 40fps. Dead Rising 3, 56fps. Dragon Age Inquisition, 51fps. Far Cry 4, 70fps. Metro Last Light, 70fps. Tomb Raider, 52fps.
Now if you cut those framerates in half to get your 3D framerates, you're looking at 25FPS in Dragon Age, or 35 for Far Cry 4. That's going to suck to play. Granted, these are all with settings maxed and AA on, but even so it doesn't look good. To me, that just proves we don't yet have enough power to play the newest games in 1440p 3D, without sacrificing image quality.
I'm going to bat for SLI here, too. While it's absolutely true (and frustrating) that a lot of games don't scale well in SLI and microstutter is an occasional problem (though I rarely notice it), there are also games that only work properly in 3D+SLI. Not many, but it happens. Saints Row IV won't run in 3D on a single card without issues, for example.
I just don't think 1440p 3D is worth it at the moment. Stick with the 980/970, consider SLI, and stay at 1080p where you can get solid framerates.
[quote="Volnaiskra"]Dugom for example seems to be less prone to it.[/quote]Yep don't care at all, i can play 30fps, no need to get 144fps, active glasses won't let games be perfectly fluid anyway...
Textures are swapped via the bus, and in SLI the bus is doubled, so not being able to fit all of a level's textures in VRAM isn't really an issue! Some textures loaded into VRAM are never used, and some are only useful while you're looking at them, like a wall texture: when you go somewhere else, that wall texture can be swapped out for others, and it won't be used again unless you go back to look at the wall.
Be careful: some games, like Wolfenstein (2014), Doom (2015), and Carmageddon (2015), use a single texture for the whole level, called a MegaTexture, and that can cause trouble!
[quote="Volnaiskra"]
VRAM
When it comes to VRAM: the more the better. Few people agreed with me when I used to say this 2 years ago, but now that the new consoles have hit, everyone realises just how punishing a lack of VRAM can be. It's one of those things that doesn't have much impact until it maxes out at 100%, which is when your FPS takes a nosedive. You basically never want it to max out.
At the very least, you want your new system to keep up with the consoles. They have 8GB combined memory (RAM + VRAM), so getting cards with only 4GB VRAM isn't giving you much of a buffer. PC games' VRAM usage is only going to keep increasing.
The main thing about VRAM is that it will enable you to select high-res textures in games. We're finally at the point where every texture in a game can be shown at 1:1 resolution rather than all blown up and blurry. And this makes a BIG difference. Shadow of Mordor with the Ultra texture DLC is the first game I've seen like this, and it was a joy to behold. It makes me really glad that I invested in 6GB VRAM cards two years ago, even though back then it was a bit overkill. [/quote]
QFT.
Thank you for opening the thread.
I am actually in a quite similar situation. I play 3D Vision on a GTX 680 and never actually bothered to get SLI for it.
Now that I finally want to upgrade my system - most new games don't run well enough in 3D anymore - I also wonder if I should go for 980 SLI or a Titan X. The price would be similar. But considering the issues and problems mentioned with SLI support, as well as the simple fact that 12GB of VRAM is more than 2 x 4GB, I am almost decided to just go for a Titan X. If needed in the future, I will get a second one.
I also think this console generation will have peaked graphically in 2 years, and if there is no major change by then (new consoles, Steam consoles, something completely new, or people coming back to PC, etc.), we could be stuck with mediocre graphics for the foreseeable future, like during the Xbox 360 and PS3 generation. But back then my 2GB GPU was always more than enough for fluid 3D Vision gaming. Of course, I was not playing Surround.
If there were a Titan X with 12GB of VRAM and two GPUs on one card, I guess I would go for that.
[quote="mrorange55"]If there would be a Titan X with 12 VRAM and two GPU processors on one, I guess I would go for that.[/quote]SLI issues will be exactly the same on Bi-GPU card...
[quote="Dugom"][quote="mrorange55"]If there would be a Titan X with 12 VRAM and two GPU processors on one, I guess I would go for that.[/quote]SLI issues will be exactly the same on Bi-GPU card...
[/quote]
I know therefore I said it to show actually that there is no clear answer to the mentionned issue. I follow in my life the adage: Reflect profoundly about the issue first and then decide with your gut-feeling. Always worked for me yet.
So I will go for the Titan X now.
For me the choice was simple from the very beginning - 3D quality on the ROG Swift is amazing, so I have to get the best single-GPU solution to run all those new games. The Titan X is simply the best option at the moment, and it looks like a 980 "Ti" won't even get close. The point of the Titan is that the card is built with the best technology possible, which lets you push the GPU through insane overclocking. Nvidia knows this and restricts the design to the reference one - retailers are not allowed to change it or play with the specs. So you can expect a lot more from this single card than the current results show. Also, the drivers are not the best ones yet - the tests used the latest 980 drivers, and the proper ones are due in a few weeks.
Anyway - my card is about to arrive this weekend (fingers crossed), so I'll post some opinions about the whole mighty combo: ROG Swift with Titan X.
"Product out of Stock" :/
Delivery next weekend. Anybody get lucky?
Volnaiskra, I bet you got it. Anyway, can you send me the direct link to your Skyrim mod when you release it?
Looks great!
Sure, I'll make a note to PM you when it's ready. It won't be ready for a while though; it's a pretty big project. It's 8 large murals, and because the vanilla ones are so useless, I have to do everything from scratch, including researching my own visual references - I've already used photos of Viking woodwork, scientific illustrations of bear skulls, and Rembrandt, to name a few :D Also, it has to compete with my work on Spryke and (as of late) 3Dmigoto fixes.