Zig11727 I agree.
I should have known better than to go on an Nvidia forum and get excited about the new cards.
It seemed so plausible.
So am I the only one buying one on here?
NVidia’s multi-view rendering for wide-view VR headsets sounds interesting - wonder if this technology will be incorporated into 3D Vision? See [url]https://www.roadtovr.com/nvidias-geforce-rtx-cards-bring-new-vr-rendering-features-and-enhancements/[/url], for example.
I have pre-ordered 2080 Ti (November ship date), by the way. I plan to replace factory cooler (air) with hybrid cooler (fluid), when/if Arctic releases appropriate cooler, like I did with my current Pascal Titan X card. I also plan to get either Pimax 8k or StarVR One, when they are eventually released (currently using Vive Pro - and LG’s 4K 55E6 for 3DV).
I might want to buy a 2080 Ti if I buy the StarVR One. But that's a really big if.
As for 3D Vision, I don't know if there's any sense in upgrading from a Titan X Pascal. And I don't like buying a pig in a poke.
And I'm not in any hurry.
CoreX9, custom watercooling (Volkswagen Polo radiator)
i7-8700k @ 4.7
Titan X Pascal with shitty stock cooler
Win7/10
Video: Passive 3D, Full HD 3D @ 60Hz/channel, Denon X1200W / HC5 x 2, Geobox 501 -> eeColor boxes -> polarizers/Omega filters, custom-made silver screen
Occupation: Entrepreneur. Painting/surfacing/construction
Interests/skills:
3D gaming, 3D movies, 3D printing, drums, bass and guitar.
Suomi - FINLAND - perkele
I was going to pre-order a 2080TI but missed out. Logic being that I believe they were expected to be shipped after the benchmarks came out, so I could cancel the preorder if I wasn't happy.
The stumbling block was that I was worried about new drivers not supporting 3D Vision, as nothing had been confirmed, but I believe it's officially listed somewhere as a supported feature.
As it stands, I also need to understand how the 2080 matches up to the 1080TI, as I still might get 1080TI SLI.
But then, SLI does bottleneck in some scenarios - you can see from the modifications DSS made to 3dmigoto that the link can become saturated in some games - so again, how good is NVLink going to be?
I'm excited about the new cards and the new technologies, yes, just too many unknowns at this point.
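On the link-saturation point, a rough back-of-envelope sketch shows why the bridge matters in AFR. The link-bandwidth figures in the comments are approximate public numbers, not official specs or measurements:

```python
# Back-of-envelope: can an SLI bridge carry AFR frame traffic in 3D Vision?

def frame_mb(width, height, bytes_per_pixel=4):
    """Size of one uncompressed RGBA frame in megabytes (decimal)."""
    return width * height * bytes_per_pixel / 1e6

# In 2-way AFR the secondary GPU ships every other frame across the link,
# so at a 120 Hz output (3D Vision: 60 Hz per eye) it sends ~60 frames/s.
transfers_per_sec = 60

for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    needed_gbs = frame_mb(w, h) * transfers_per_sec / 1000
    print(f"{name}: ~{needed_gbs:.2f} GB/s of frame traffic")

# Rough one-direction link capacities for comparison (approximate):
#   classic SLI bridge  ~1 GB/s
#   SLI HB bridge       ~2-4 GB/s
#   NVLink on 2080 Ti   ~25-50 GB/s
```

By this crude estimate a classic bridge is already near its limit at 4K-class frame traffic, which fits the saturation DSS observed, while NVLink has headroom to spare.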
I was going to pre-order a 2080TI but missed out. Logic being that I believe they were expected to be shipped after the benchmarks came out, so I could cancel the preorder if I wasn't happy.
The stumbling block was that was worried about new drivers not supporting 3d vision as nothing had been confirmed, but I believe it's officialy listed somewhere as a supported feature.
As it stands, I also need to understand how the 2080 matches up to the 1080TI, as I still might get 1080TI SLI.
But then, SLI does bottleneck in some scenarios, you can see that with the modifications DSS made to 3dmigoto, the link can become saturated in some games, so again, how good is NVLink going to be?
I'm excited about the new cards and the new technologies, yes, just too many unknowns at this point.
Gigabyte RTX2080TI Gaming OC, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64
https://www.3dmark.com/fs/9529310
@GibsonRed
An extremely pertinent question it has to be said, considering what forum we're on! Let's be honest, everyone's taking a punt purchasing any high-end GPU right now. Good luck with your purchase though, and I hope it works out for you.
@rustyk21
I've placed my bet, and ordered 2 1080 Ti cards (EVGA Black Edition), on import from the US. Combined, that's just over the £1300 mark, which stacks up pretty well with the very best (or most expensive) single 2080 Ti models currently on offer. I'm looking to put a new rig together over the next 2-3 months, since it'll actually take that long for me to afford purchasing all of the component parts. I'm still a believer in SLI (to heck with what the likes of Sora think). Not being the biggest fan of Windows 10, I'll be looking to get an i7-6850k to enable each card to run in a PCI-E x16 + x16 configuration, with the aim of getting at least 10fps extra performance over and above an x8 + x8 alternative. That might just offset some CPU bottlenecking in my view, but I won't know that for sure until sometime later in the year :S I'm realistic about the scaling not being anywhere near optimal using 2 over-powered cards, as I'll just be using a single 3d 1440p monitor, but nevertheless I'm still hoping for scaling somewhere between say 65-85% in the best cases. Also not forgetting that the 1440p pixel count in 3d isn't too far short of 4k in 2d.
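For what it's worth, the pixel arithmetic behind that last point is easy to check, assuming 3D Vision renders each frame twice (once per eye):

```python
# Quick check of the claim that 1440p in 3D isn't far short of 4K in 2D.

pixels_3d_1440p = 2560 * 1440 * 2   # two eye views per displayed frame
pixels_2d_4k    = 3840 * 2160

print(pixels_3d_1440p)                        # 7372800
print(pixels_2d_4k)                           # 8294400
print(f"{pixels_3d_1440p / pixels_2d_4k:.2f}")  # 0.89
```

So 3D at 1440p pushes about 89% of the pixels of 2D 4K - not far short at all.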
[quote="Tothepoint2"]@GibsonRed
An extremely pertinent question it has to be said, considering what forum we're on! Let's be honest, everyone's taking a punt purchasing any high-end GPU right now. Good luck with your purchase though, and I hope it works out for you.
@rustyk21
I've placed my bet, and ordered 2 1080 Ti cards (EVGA Black Edition), on import from the US. Combined, that's just over the £1300 mark, which stacks up pretty well with the very best (or most expensive) single 2080 Ti models currently on offer. I'm looking to put a new rig together over the next 2-3 months, since it'll actually take that long for me to afford purchasing all of the component parts. I'm still a believer in SLI (to heck with what the likes of Sora think). Not being the biggest fan of Windows 10, I'll be looking to get an i7-6850k to enable each card to run in a PCI-E x16 + x16 configuration, with the aim of getting at least 10fps extra performance over and above an x8 + x8 alternative. That might just offset some CPU bottlenecking in my view, but I won't know that for sure until sometime later in the year :S I'm realistic about the scaling not being anywhere near optimal using 2 over-powered cards, as I'll just be using a single 3d 1440p monitor, but nevertheless I'm still hoping for scaling somewhere between say 65-85% in the best cases. Also not forgetting that the 1440p pixel count in 3d isn't too far short of 4k in 2d. [/quote]
Return them if you can.
Sora can be annoying but he/she is correct about SLI. SLI is dying; I had 780 Ti SLI in the past and was so relieved to move from that to a single 980 Ti in 2015, and the situation has only deteriorated further.
I watched an eBay auction of a 1080 Ti FE with an EK waterblock end before my eyes. It had a block from day one (according to the listing), indicating that it was likely not used for mining, and keeping the temps down around 45C across the entire card does wonders for longevity. But it wasn't the condition nor the price of the card that dissuaded me from bidding; it's that SLI support is absolutely abysmal and getting worse. The auction ended at $600. The block by itself is $150, had zero signs of wear, and included an EK backplate. Meaning, this was a $450 1080 Ti that someone just snagged.
I spent 3-4 hours mulling it over, and right now I only have 4-5 games that I can justify getting a second 1080 Ti for, and half of them have zero SLI support:
SLI Supported:
Mass Effect: Andromeda (3D Vision)
The Witcher 3: (3D Vision)
Shadow of the Tomb Raider (3D Vision)
Watch Dogs 2, either 3D at 2560x1440 on PG278Q or 2D, G-Sync, @ 3440x1440 on AW3418DW
No SLI Support:
Mafia 3 Including all DLC (great game IMHO), seeing 70 FPS at 3440x1440
No Man's Sky
Middle Earth: Shadow of War
Forza Horizon 3 and 4
The rest of my games just aren't demanding enough at 3D 2560x1440 or 2D 3440x1440 to warrant more GPU compute.
The problem is, SLI support is down to around 65% of games, with many new titles offering zero support.
Titanfall 2, no SLI support to this day
Any OpenGL title, e.g. Wolfenstein, Doom, The Evil Within, No Man's Sky: no SLI support.
Bethesda usually has great SLI support and scaling, as does Nixxes, the studio behind the Tomb Raider franchise.
I HATE to say it but you're better off returning the 1080 Tis if you can and waiting until we get concrete benchmarks before upgrading.
Timespy is a great indicator of performance, but we truly don't know how 2080 Ti is going to perform, nor do we honestly know whether or not the leaked benches are even real.
This could all be reverse psychology on Nvidia's part, softening our expectations: if we go in expecting, say, a 20% bump and the real benches show more, we're more inclined to buy; whereas if we'd expected a 50% bump over the 1080 Ti and the benches showed 30%, no-one would buy.
You're better off with +30% in 100% of games than +60% in 50% of games and +30% or so in another 15% of games, with stuttering and other problems, having to mess around with Nvidia Inspector (e.g. setting SLI to AFR2 in Metro and dealing with scaling loss). I mean, SLI truly just sucks. And if you're a 3D Vision enthusiast, many if not all of the fixes will note a problem with SLI, e.g. the Witcher 3 fix's water reflections being bugged with SLI, etc.
I don't want to be a downer or a negative nancy, but once you open those 1080 Ti's you may be beyond the point of no return and end up regretting going the SLI route.
The 3rd option is just waiting.
Nvidia is most certainly going to release a version of TU102 with an uncut die. We absolutely are going to see a Titan card this generation, and if history is any indication, we may see one before the end of the year. Titan cards always follow the main launch by 3 months. Watch them release a Titan card for $1200 that is 15% faster than 2080 Ti whilst simultaneously dropping the price of 2080 Ti, 2080 and 2070 by $100 like they did when they released 1080 Ti.
If you can wait further, there's a chance that we will see 7nm Volta late 2019, depending on what AMD does with Navi.
I'm probably going to wait; the only way I won't is if the 2080 Ti is at minimum 40% faster in the aforementioned and future titles (SOTR, Fallout 76 etc.) and I can attain one for $999. That is the only way. Otherwise I'm waiting.
i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703
Follow up:
3D Vision may be possible in Batman: Arkham Knight soon, a single 1080 Ti might not cut it for 60 FPS @ 2560x1440, and the game has no SLI support.
Just another example.
I started playing this game again recently, and honestly, it's really a good game. It's probably the only game that I haven't uninstalled in the past 3 years, along with Planetside 2, another game that doesn't have SLI support (but honestly doesn't need it anymore).
https://forums.geforce.com/default/topic/846640/3d-vision/batman-arkham-knight-3d-report/40/
Doom Eternal is also going to use OpenGL again and also not have SLI support. I mean, how many good games have to come out that either don't hit the FPS you want in 3D Vision or in 2D (e.g. No Man's Sky)?
I'm also hearing that there is an issue with G-Sync and SLI?
https://forums.geforce.com/default/topic/987529/sli-and-gsync-do-gsync-works-while-sli-is-on-/
I don't get why SLI support is supposedly so bad nowadays. When I switched from 660 Ti 2-way SLI to a 980 Ti, the performance boost for me was about 0%. Almost all games in 3D (note) utilised both GPUs almost to the max; at least Afterburner always showed both GPUs peaking at max. Usability was the only thing that improved when moving to a 980 Ti.
Though I remember you always had to hunt for the correct SLI bits to get the scaling up.
Ray Tracing on Or Off: Paul and Kyle Face Off!
https://youtu.be/DynYRiTXR28?t=20m
And Paul so cannot even pretend to be enthusiastic about this shiite.
@xXxStarManxXx
It's important to establish that the notion of 'value for money' is an entirely relative concept. For instance, buying a newly released games console and then being fairly selective about the games that you buy from then on, is by comparison a fraction of the cost of pc gaming. Anybody who games using even a half-decent pc and thinks that they're getting true value for money, needs their head examined. Entities like Steam and other online vendors have certainly put downward pressure on the cost of pc games so that any 'perceived value' is then slowly clawed back over time, but as I say, it's all relative in my opinion.
Sora isn't just annoying, he's arrogant with it. SLI may well be dying for Sora, but it isn't for me. It's interesting that you did a short list of SLI and non-SLI games. It just so happens that I actually like and own most of the games in your former list, and have had no interest in purchasing any of the games in the latter, excluding Middle Earth: Shadow of War which I do own. On my current rig, that particular game performs at around 45-50fps with zero stuttering even though it's not using both GPUs, and much like GTA V, it's perfectly playable and stutter-free even below 60fps. I can honestly say that not one of my games stutter using SLI, and I've also been very careful as to which drivers I use. Both 331.82 on Windows 8 (weighted towards DX9) and 388.71 on Windows 7 (weighted towards DX11) have worked well for me. Incidentally, my racing game of choice is the GTR series from Simbin, which all perform well enough in SLI, and I reckon there's a decent chance that the upcoming GTR3 will do the same. I'll test Doom (2016) on my current pc, if only to remind myself as to how it performs, both with and without SLI in 3D.
My experience of SLI has been extremely good. My current rig that's still shown in my signature was distinctly overkill when I bought it almost 5 years ago, and @ 1080p I can still to this day play the games on your aforementioned list at 60fps in most instances, with a few graphical settings turned down here and there.
Fair play to you, in that you're obviously directly familiar with both the 780 Ti and the 1080 Ti, which in my opinion are the best price/performance ratio high-end cards that Nvidia have produced. I do realise that a single 1080 Ti should suffice at 1440p, but there'll always be some games that are more demanding than others, and therefore would need an extra boost to get them past that 60fps bare minimum. I'm hedging that there's more [u]perceived[/u] price/performance value to be had with a 1080 Ti as opposed to a 2080 Ti, so to me, getting a second 1080 Ti is money well spent at that price point. It's important to stress that word because it is often a matter of perception. I had briefly considered getting 2 1080 Ti cards with water-blocks, but the overall cost didn't justify buying 2 cards at this point in time, so I'm going with air-cooled this time around with no additional OC until that's actually needed.
You say that in your own experience, many games aren't demanding enough using a single 1080 Ti and that's fair enough. That's what should be expected with a brand new or fairly new GPU. However, I'm thinking as to how my system will perform 4 to 5 years from now with 2 GPUs. I'm quite happy to admit that having a second GPU is an indulgence and a gamble, so I'd imagine you certainly wouldn't appreciate me putting a 1050 Ti in as an additional dedicated PhysX renderer, which I'd certainly consider, depending entirely upon just how much my 2 1080 Ti GPUs are graphically taxed in the future or if I ever changed to 3D Surround. With 2 GTX Titans (not Titan X's or Titan XP's mind you) my system can still punch well above its weight @ 1080p in PhysX games such as Killing Floor 2, Warframe, Lords of the Fallen and the entire Metro series, all running at 60fps or above. The only advantage that the original Titan had over the 780 Ti was the amount of VRAM it had, which is why I never felt the need to replace them.
I have no interest in RTX GPUs at 12nm. If (and it's a big if) the performance benchmarks on the 2080 Ti aren't as good as expected, the 1080 Ti may well be in higher demand as a comparatively cheaper alternative, putting upward pressure on its price, which is why I'm buying now rather than waiting for the results to come out. Also bear in mind that the price of a 1080 Ti, along with numerous other GPUs, spiked to around twice its original retail value earlier in the year, due to mining and crypto-currency pressures. So again it makes sense to get 2 now at their current comparative price.
I personally haven't needed to resort to using AFR2 in any of the Metro games to improve SLI scaling at 1080p, unless you're saying that that's a problem that you've experienced at 1440p. Again with Witcher 3, I'm sorry to hear that you've had problems with it, but my experience with Witcher 3 in SLI has been excellent with no problems in 3D at all. In fact, I was shocked at how well it ran in 3D. I expected a huge performance loss on my system, and I fully expected not to be able to play it in 3D. As it turns out, in Novigrad for instance, I still get around 50fps @1080p. It's not ideal, but it's still perfectly playable. The only game so far that I've had to resort to turning the resolution down just a single notch, was ELEX. Once again SLI does work with that game too, at least it does for me. Admittedly, the scaling may not be fantastic, but SLI enables ELEX to run at a resolution and a frame rate that isn't jarring to the point of hindering gameplay. In fact graphically it still looks really good in 3D even at a slightly lower resolution.
Waiting any further isn't an option for me, in that I've waited almost 5 years already. 1080 Tis are the ones for me, fully acknowledging what you've said about a possible RTX Titan emerging this year. I know it's genuine concern on your part given your own experiences with SLI and 3D, but I guess I've been that much luckier with that combination. I'm really not [i]expecting[/i] future games to support SLI, except perhaps Star Citizen and Cyberpunk 2077; if those don't support it, that would definitely piss me off. Other than that though, I regard it as a great bonus and nothing more. I've got a Vive headset that I haven't even taken out of its box yet, which I purchased 2 years ago, so that's another incentive for me not to wait any longer.
I'm really not too fussed about G-Sync to be honest, assuming there is actually an issue with it. I'd disable it, just to retain SLI if it ever came to it.
For your sake, I obviously hope that Volta is eventually worth your wait. Seeing as there's a low probability of a 4K 3D monitor ever appearing on the horizon, that would probably be the only reason for me to justify waiting any longer for a 7nm GPU, let alone two of them. So basically, we couldn't be more opposite on this issue, but there you go. I suspect that whenever these upcoming benchmarks do appear, this forum might just provide a little extra entertainment value all by itself.
It's important to establish that the notion of 'value for money' is an entirely relative concept. For instance, buying a newly released games console and then being fairly selective about the games that you buy from then on, is by comparison a fraction of the cost of pc gaming. Anybody who games using even a half-decent pc and thinks that they're getting true value for money, needs their head examined. Entities like Steam and other online vendors have certainly put downward pressure on the cost of pc games so that any 'perceived value' is then slowly clawed back over time, but as I say, it's all relative in my opinion.
Sora isn't just annoying, he's arrogant with it. SLI may well be dying for Sora, but it isn't for me. It's interesting that you did a short list of SLI and non-SLI games. It just so happens that I actually like and own most of the games in your former list, and have had no interest in purchasing any of the games in the latter, excluding Middle Earth: Shadow of War which I do own. On my current rig, that particular game performs at around 45-50fps with zero stuttering even though it's not using both GPUs, and much like GTA V, it's perfectly playable and stutter-free even below 60fps. I can honestly say that not one of my games stutter using SLI, and I've also been very careful as to which drivers I use. Both 331.82 on Windows 8 (weighted towards DX9) and 388.71 on Windows 7 (weighted towards DX11) have worked well for me. Incidentally, my racing game of choice is the GTR series from Simbin, which all perform well enough in SLI, and I reckon there's a decent chance that the upcoming GTR3 will do the same. I'll test Doom (2016) on my current pc, if only to remind myself as to how it performs, both with and without SLI in 3D.
My experience of SLI has been extremely good. My current rig that's still shown in my signature was distinctly overkill when I bought it almost 5 years ago, and @ 1080p I can still to this day play the games on your aforementioned list at 60fps in most instances, with a few graphical settings turned down here and there.
Fair play to you, in that you're obviously directly familiar with both the 780 Ti and the 1080 Ti, which in my opinion are the best price/performance ratio high-end cards that Nvidia have produced. I do realise that a single 1080 Ti should suffice at 1440p, but there'll always be some games that are more demanding than others, and therefore would need an extra boost to get them past that 60fps bare minimum. I'm hedging that there's more perceived price/performance value to be had with a 1080 Ti as opposed to a 2080 Ti, so to me, getting a second 1080 Ti is money well spent at that price point. It's important to stress that word because it is often a matter of perception. I had briefly considered getting 2 1080 Ti cards with water-blocks, but the overall cost didn't justify buying 2 cards at this point in time, so I'm going with air-cooled this time around with no additional OC until that's actually needed.
You say that in your own experience, many games aren't demanding enough using a single 1080 Ti and that's fair enough. That's what should be expected with a brand new or fairly new GPU. However, I'm thinking as to how my system will perform 4 to 5 years from now with 2 GPUs. I'm quite happy to admit that having a second GPU is an indulgence and a gamble, so I'd imagine you certainly wouldn't appreciate me putting a 1050 Ti in as an additional dedicated PhysX renderer, which I'd certainly consider, depending entirely upon just how much my 2 1080 Ti GPUs are graphically taxed in the future or if I ever changed to 3D Surround. With 2 GTX Titans (not Titan X's or Titan XP's mind you) my system can still punch well above its weight @ 1080p in PhysX games such as Killing Floor 2, Warframe, Lords of the Fallen and the entire Metro series, all running at 60fps or above. The only advantage that the original Titan had over the 780 Ti was the amount of VRAM it had, which is why I never felt the need to replace them.
I have no interest in RTX GPUs at 12nm. If (and it's a big if) the performance benchmarks on the 2080 Ti aren't as good as expected, the 1080 Ti may be well be in higher demand at a comparatively cheaper alternative, hence putting upward pressure on their price which is why I'm buying now rather than waiting for the results to come out. Also bear mind that the price of a 1080 Ti, along with numerous other GPUs, spiked to around twice their original retail value earlier in the year, due to mining and crypto-currency pressures. So again it makes sense to get 2 now at their current comparative price.
I personally haven't needed to resort to using AFR2 in any of the Metro games to improve SLI scaling at 1080p, unless you're saying that that's a problem that you've experienced at 1440p. Again with Witcher 3, I'm sorry to hear that you've had problems with it, but my experience with Witcher 3 in SLI has been excellent with no problems in 3D at all. In fact, I was shocked at how well it ran in 3D. I expected a huge performance loss on my system, and I fully expected not to be able to play it in 3D. As it turns out, in Novigrad for instance, I still get around 50fps @1080p. It's not ideal, but it's still perfectly playable. The only game so far that I've had to resort to turning the resolution down just a single notch, was ELEX. Once again SLI does work with that game too, at least it does for me. Admittedly, the scaling may not be fantastic, but SLI enables ELEX to run at a resolution and a frame rate that isn't jarring to the point of hindering gameplay. In fact graphically it still looks really good in 3D even at a slightly lower resolution.
The option of waiting any further isn't an option for me, in that I've waited almost 5 years already. 1080 Tis are the ones for me, fully acknowledging what you've said about any possible emergence of an RTX Titan this year. I know that it is genuine concern on your part given your own experiences with SLI and 3D, but I guess I've been that much luckier with that combination. I'm really not expecting games in future to support SLI, except perhaps Star Citizen and Cyberpunk 2077, which in that case would definitely piss me off. Other than that though, I regard it as being a great bonus and nothing more. I've got a Vive headset that I haven't even taken out of its box yet, and which I purchased 2 years ago, so that's another incentive for me not to wait any longer.
I'm really not too fussed about G-Sync to be honest, assuming there is actually an issue with it. I'd disable it, just to retain SLI if it ever came to it.
For your sake, I obviously hope that Volta is eventually worth your wait. Seeing as there's a low probability of a 4K 3D monitor ever appearing on the horizon, that would probably be the only reason for me to justify waiting any longer for a 7nm GPU, let alone two of them. So basically, we couldn't be more opposite on this issue, but there you go. I'm sure that whenever these upcoming benchmarks do appear, this forum might just provide a little extra entertainment value all by itself.
There's also the issue of DX12 titles not supporting SLI.
You waited 5 years, you could have waited another 5 months!
I would send them back!
Are you going to try to push 1080 Ti SLI with a 4770k? Because I can tell you right now that my 4930k @ 4.5 GHz in my previous system was a bottleneck with a single 1080 Ti.
Oh and G-Sync is legit.
It would be a hard decision choosing between G-Sync and +80-90% SLI performance in a game, unless the game in question was a 3D Vision title and G-Sync didn't work in it. Way less latency, zero stutter or micro-stutter. If you haven't experienced variable refresh rate monitor technology yet, you honestly don't know what you're missing.
i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703
2080 Launch Update!
Get this, they delayed the 2080 review NDA to the 17th and the 2080 Ti review to the 19th!
I think it's rather ridiculous that there's an NDA to begin with, but this is problematic for a few reasons.
It could mean that they want to shorten the window of time consumers have between seeing abysmal rasterization performance improvements and cancelling their orders, and it could also mean that they are desperately trying to wring as much performance as they can out of the drivers for DX11 benchmarks but aren't really making any headway.
Either way, this is a rather ominous sign that the leaked Timespy benches we've seen thus far are going to be more or less what we can expect on the DX11 side of things as well: an overclocked 2080 that is 3% slower than an overclocked 1080 Ti, and a 2080 Ti, presumably at only +90 MHz (but maybe with another 5% headroom on top of that; Petersen has stated on record that they will do 2100 MHz and that's about it), that is only 18% faster than an overclocked 1080 Ti.
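As a sanity check, the rumor math above works out like this. The percentages are taken straight from the leak, so treat the outputs as speculation, not measured performance:

```python
# Relative performance implied by the leaked Timespy figures quoted above.
# These are rumored numbers, not benchmarks.
baseline = 1.00                      # overclocked 1080 Ti as the reference
oc_2080 = baseline * (1 - 0.03)      # leak: 3% slower than an OC 1080 Ti
oc_2080ti = baseline * (1 + 0.18)    # leak: 18% faster than an OC 1080 Ti
oc_2080ti_max = oc_2080ti * 1.05     # with the rumored extra ~5% clock headroom

print(f"2080:    {oc_2080:.2f}x a 1080 Ti")              # 0.97x
print(f"2080 Ti: {oc_2080ti:.2f}x to {oc_2080ti_max:.2f}x")  # 1.18x to 1.24x
```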
This is absolutely not good. As I said, I don't know why there is a review NDA at all unless you want to sell something to consumers without them making an informed decision. That Shadow of the Tomb Raider, which also launches around the same time, will have Ray Tracing disabled only buttresses the perception that they absolutely want to hide the performance of both rasterization and ray tracing with this launch.
https://www.reddit.com/r/nvidia/comments/9d58y4/nvidia_geforce_rtx_2080_reviews_go_live_on/
Having a look at the collective sentiment there, man, this thing does not look good at all. I can see no other reason for Nvidia delaying the review NDA right up until launch unless they are hoping that a percentage of consumers don't cancel in time and are too lazy to physically return the product once they receive it.
That the reviewers who will actually post reviews one or two days before launch all had to go through a stringent "selection" process only makes this look that much worse:
https://www.overclock.net/forum/225-hardware-news/1707000-hardocp-nvidia-controls-aib-launch-driver-distribution.html
https://www.hardocp.com/article/2018/08/28/nvidia_controls_aib_launch_driver_distribution/
"First and foremost, NVIDIA has demanded that its AIBs tell NVIDIA who will be reviewing the AIB's custom RTX 2080 and 2080 Ti cards. We were forwarded emails from other reviewers, from the AIBs that were asking specifically, at NVIDIA's direction, "Who will be performing the review content?" "What is that person's phone number and email address?" That is a bit odd, as we have never seen this before in 20 years of reviewing video cards. AIBs in the past have been left to pretty much operate their own review campaigns on new video cards, but that seems to have come to an end. From these lists of reviewers submitted to NVIDIA by the AIBs, NVIDIA has put together its own list of "approved reviewers," and sent their approved list back to the AIBs in order to let them know who they are allowed to sample review cards to. Much like NVIDIA exerted control over AIB's and OEM's brands with GPP, it is now exerting control over who the AIB has review its own custom cards.
This is where it gets a bit more interesting, and likely should give you concern with any leaked benchmarks you see on the web. NVIDIA is not allowing its AIBs to distribute drivers with their review cards. For a reviewer to have access, he must first sign NVIDIA's multi-year NDA (which is fine if you are "just" a card reviewer), then he will log into a protected site which is most likely a secured version of GeForce Experience in order to obtain the driver, and download from there into a specific machine with the new RTX card being present."
[quote="xXxStarManxXx"]There's also the issue of DX12 titles not supporting SLI.
[/quote]
LOL, there is also the issue of the LACK of DirectX 12 games, ROFL. The most pointless DX release ever: first we waited years for any games to use it, and now the few that exist benchmark about the same in DX11 as in DX12.
CoreX9 Custom watercooling (Volkswagen Polo radiator)
I7-8700k@4.7
TitanX pascal with shitty stock cooler
Win7/10
Video: Passive 3D fullhd 3D@60hz/channel Denon x1200w / Hc5 x 2 Geobox501 -> eeColorBoxes -> polarizers/omega filters. Custom-made silver screen
Occupation: Entrepreneur. Painting/surfacing/constructions
Interests/skills:
3D gaming, 3D movies, 3D printing, Drums, Bass and guitar.
Suomi - FINLAND - perkele
[quote="xXxStarManxXx"]There's also the issue of DX12 titles not supporting SLI.
...
[/quote]
I'm sure you're already aware that DX12 has explicit multi-GPU instead. Again, it must be supported by the game, but so must SLI.
All the early DX12 games were a mess anyway, AMD promoted it because of Async compute but more often than not, DX12 just dropped the framerates for no gain, particularly for Nvidia users.
I don't think I have a single game that is DX12-only, so if I need to I just use the better-performing DX11 path, and SLI where applicable.
Admittedly once we're in a DX12 only world the landscape will change again, but it's not clear how. Clearly there are plenty of examples of DX12 games that have no mgpu support, but there are others such as Sniper Elite 4 and Gears of War 4 that work brilliantly.
Going forwards, it will all be down to the 'big' game engines and whether they support it or not.
Gigabyte RTX2080TI Gaming OC, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64
https://www.3dmark.com/fs/9529310
[quote="Metal-O-Holic"][quote="xXxStarManxXx"]There's also the issue of DX12 titles not supporting SLI.
[/quote]
LOL, there is also the issue of the LACK of DirectX 12 games, ROFL. The most pointless DX release ever: first we waited years for any games to use it, and now the few that exist benchmark about the same in DX11 as in DX12.[/quote]
I agree, just pointing it out. And Rise of the Tomb Raider runs considerably smoother in DX12 vs DX11 in problematic areas, i.e. Geothermal Valley, on both my 8700k and my 4930k; unfortunately, that's 2D only.
But yeah, in most other games DX12 has been a joke; in BF1 it's garbage. I don't know what happened, I figured DX12 would be more widespread by now. Maybe when the next consoles release and Microsoft starts developing their next-gen games solely on DX12.
[quote="rustyk21"][quote="xXxStarManxXx"]There's also the issue of DX12 titles not supporting SLI.
...
[/quote]
I'm sure you're already aware that DX12 has multi gpu instead. Again, it must be supported, but so must SLI.
All the early DX12 games were a mess anyway, AMD promoted it because of Async compute but more often than not, DX12 just dropped the framerates for no gain, particularly for Nvidia users.
I don't think I have a single game that is Dx12 only, so if I need to I just use the better performing DX11 path and SLI where applicable.
Admittedly once we're in a DX12 only world the landscape will change again, but it's not clear how. Clearly there are plenty of examples of DX12 games that have no mgpu support, but there are others such as Sniper Elite 4 and Gears of War 4 that work brilliantly.
Going forwards, it will all be down to the 'big' game engines and whether they support it or not.[/quote]
Well, if you have a multi-core CPU and are experiencing a bottleneck that DX12 relieves, say in Rise of the Tomb Raider, then the question becomes: do you disable SLI and run DX12 to deal with the CPU bottleneck?
And Gears of War 4, this is NOT a good example.
It was released in Oct. 2016 and SLI support didn't happen until May 1st, 2017. I mean yeah, some of us are playing games 6 months after release, but if it's a multiplayer title, say Titanfall 2 or Star Wars: Battlefront or Call of Duty: Black Ops 3, 6 months can be the end of the game's multiplayer life-cycle. So by the time you can actually run the game at an appreciable frame-rate, the player-base has nearly completely dwindled to nothing.
And that's exactly what I'm talking about. SLI just straight up sucks. It's great for a few niche titles that support it, be it Fallout 4, Rise of the Tomb Raider, Crysis 3 etc. but if it's a multiplayer game, if there is no implementation out of the gate it usually takes them months to implement it and by the time they do in many cases the game is dead.
This was exactly my experience in 2014. I told everyone in various places on the web that I was moving from a laptop to a desktop and going with a single 780 Ti. I was told to wait for Maxwell, as this was literally 6 months before the release of the next architecture. I told them I couldn't wait, I'd waited long enough, and that I would build a PC now (primarily to get away from 680M SLI, lol). So I did, and inevitably, after picking up my Swift PG278Q, falling in love with 3D Vision and realizing that one 780 Ti wasn't enough to push Assassin's Creed: Black Flag and Tomb Raider (2013), I picked up another 780 Ti for SLI. And yeah, sure, those two games ran great, but then it was one problem after another. Dragon Age: Inquisition was released, and there was atrocious stutter with SLI. Batman: Arkham Knight was released: zero SLI support. COD: Black Ops 3: no TAA support with SLI. Planetside 2: no proper SLI support (a single 780 Ti was challenged by this game back then).
I was completely relieved to pull that out in favor of 980 Ti. I've been with X-Fire and SLI since I got into PC Gaming: ATI 5870M X-Fire, GTX 580M SLI, GTX 680M SLI, GTX 780 TI SLI. SLI just straight up sucks.
And going from SLI to a single GPU brought a noticeable improvement in smoothness that's hard to describe. It's not even micro-stutter; it's like latency, frame-timing and G-Sync all being impacted together. A single GPU is absolutely better than SLI, 100% hands down. Also worth mentioning: two 300W GPUs in SLI will heat up your room really quickly. If you don't have great AC, or it isn't naturally cool where you are, you need to think about this before adding a second 300W GPU.
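The room-heating point is easy to quantify: virtually all the power a GPU draws ends up as heat in the room. A quick sketch, where 300W per card is a nominal board-power figure rather than a measured draw:

```python
# Heat dumped into the room by an SLI pair, converted to the BTU/h
# figure that air conditioners are rated in.
def watts_to_btu_per_hour(watts):
    return watts * 3.412  # 1 watt of dissipation = 3.412 BTU/h

gpu_board_power = 300                        # nominal watts per card
sli_heat = watts_to_btu_per_hour(2 * gpu_board_power)
print(round(sli_heat))  # 2047 BTU/h, in the range of a small space heater
```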
So when I see people making the same mistake I did way back in 2014, I am trying to warn them, pointing to my experience. And SLI has only gotten worse, much worse since then.
The stumbling block was that I was worried about new drivers not supporting 3D Vision, as nothing had been confirmed, but I believe it's now officially listed somewhere as a supported feature.
As it stands, I also need to understand how the 2080 matches up to the 1080 Ti, as I still might get 1080 Ti SLI.
But then, SLI does bottleneck in some scenarios. You can see that with the modifications DSS made to 3dmigoto: the link can become saturated in some games. So again, how good is NVLink going to be?
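For a feel of why the old bridge saturates, here's a rough transfer-time sketch. The bandwidth figures (~4 GB/s for an SLI HB bridge, ~50 GB/s per direction for the 2080 Ti's NVLink) are commonly quoted peak numbers, so read the results as orders of magnitude only:

```python
# Time to ship one rendered 1440p frame between GPUs over each link,
# at commonly quoted peak bandwidths. Real throughput will be lower.
frame_bytes = 2560 * 1440 * 4        # one RGBA frame, ~14.7 MB

def transfer_ms(bandwidth_bytes_per_s):
    return frame_bytes / bandwidth_bytes_per_s * 1000

print(f"SLI HB bridge: {transfer_ms(4e9):.2f} ms")   # 3.69 ms
print(f"NVLink:        {transfer_ms(50e9):.2f} ms")  # 0.29 ms
```

At 60 fps the whole frame budget is 16.7 ms, and 3D Vision doubles the pixel load, so nearly 4 ms per transfer on the old bridge is significant; NVLink makes the same transfer almost free.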
I'm excited about the new cards and the new technologies, yes, just too many unknowns at this point.
An extremely pertinent question it has to be said, considering what forum we're on! Let's be honest, everyone's taking a punt purchasing any high-end GPU right now. Good luck with your purchase though, and I hope it works out for you.
@rustyk21
I've placed my bet, and ordered 2 1080 Ti cards (EVGA Black Edition), on import from the US. Combined, that's just over the £1300 mark, which stacks up pretty well against the very best (or most expensive) single 2080 Ti models currently on offer. I'm looking to put a new rig together over the next 2-3 months, since it'll actually take that long for me to afford all of the component parts. I'm still a believer in SLI (to heck with what the likes of Sora think).

Not being the biggest fan of Windows 10, I'll be looking to get an i7-6850k to enable each card to run in a PCI-E x16 + x16 configuration, with the aim of getting at least 10fps extra performance over an x8 + x8 alternative. That might just offset some CPU bottlenecking in my view, but I won't know that for sure until sometime later in the year :S

I'm realistic about the scaling not being anywhere near optimal using 2 over-powered cards, as I'll just be using a single 3D 1440p monitor, but nevertheless I'm still hoping for scaling somewhere between 65-85% in the best cases. Also not forgetting that the 1440p pixel count in 3D isn't too far short of 4K in 2D.
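The scaling hope above can be sanity-checked with trivial arithmetic. The 45 fps single-card baseline here is a made-up figure for illustration, not a benchmark:

```python
# Effective frame rate from adding a second card at a given SLI scaling
# factor. The baseline fps is a hypothetical single-1080 Ti figure.
def sli_fps(single_fps, scaling):
    return single_fps * (1 + scaling)

baseline = 45  # assumed single-card fps in a demanding 3D title
for scaling in (0.65, 0.75, 0.85):
    print(f"{scaling:.0%} scaling -> {sli_fps(baseline, scaling):.2f} fps")
```

So even at the bottom of the hoped-for 65-85% range, a 45 fps title clears the 60 fps bar, which is the whole argument for the second card.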
Intel Core i7 4770k @ 4.4Ghz, 3x GTX Titan, 16GB Tactical Tracer LED, CPU/GPU Dual-Loop Water-Cooled - Driver 331.82 (Win8.0), Driver 388.71 (Win7), DX11.0
harisukro: "You sir, are 'Steely Eyed Missile Man'" (Quote from Apollo 13)
Return them if you can.
Sora can be annoying, but he/she is correct about SLI. SLI is dying. I had 780 Ti SLI in the past, and I was so relieved to move from that to a single 980 Ti in 2015; the situation has only deteriorated further since.
I watched an eBay auction of a 1080 Ti FE with an EK waterblock end before my eyes. It had a block from day one (according to the listing), indicating that it was likely not used for mining, and keeping temps down around 45C across the entire card does wonders for longevity. But it wasn't the condition or the price of the card that dissuaded me from bidding; it's that SLI support is absolutely abysmal and getting worse. The auction ended at $600. The block by itself is $150 and had zero signs of wear, and it included an EK backplate. Meaning, this was a $450 1080 Ti that someone just snagged.
I spent 3-4 hours mulling it over, and right now I only have 4-5 games that I can justify getting a second 1080 Ti for, and half of them have zero SLI support:
SLI Supported:
Mass Effect: Andromeda (3D Vision)
The Witcher 3: (3D Vision)
Shadow of the Tomb Raider (3D Vision)
Watch Dogs 2, either 3D at 2560x1440 on PG278Q or 2D, G-Sync, @ 3440x1440 on AW3418DW
No SLI Support:
Mafia 3, including all DLC (great game IMHO); seeing 70 FPS at 3440x1440
No Man's Sky
Middle Earth: Shadow of War
Forza Horizon 3 and 4
The rest of my games just aren't demanding enough at 3D 2560x1440 or 2D 3440x1440 to warrant more GPU compute.
The problem is, SLI support is down to around 65% of games, with many new titles offering zero support.
Titanfall 2, no SLI support to this day
Any OpenGL title, i.e. Wolfenstein, Doom, The Evil Within, No Man's Sky, no SLI support.
Bethesda usually has great SLI support and scaling, as does Nixxes, behind Tomb Raider franchise.
I HATE to say it, but you're better off returning the 1080 Tis if you can and waiting until we get concrete benchmarks before upgrading.
Timespy is a great indicator of performance, but we truly don't know how 2080 Ti is going to perform, nor do we honestly know whether or not the leaked benches are even real.
This could all be reverse psychology on the part of Nvidia, softening our expectations: if we go in expecting around 20% and real benches show more, we're inclined to buy, whereas if everyone had been expecting a 50% bump over the 1080 Ti and benches showed 30%, no one would buy.
You're better off with +30% in 100% of games than +60% in 50% of games and +30% or so in another 15% of games, with stuttering and other problems, and having to mess around with Nvidia Inspector (i.e. setting SLI to AFR2 in Metro and dealing with the scaling loss). I mean, SLI truly just sucks. And if you're a 3D Vision enthusiast, many if not all of the fixes will note a problem with SLI, i.e. The Witcher 3 fix's water reflections being bugged with SLI, etc.
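That trade-off is just expected-value arithmetic. Using the rough shares from the post (+60% in half the library, +30% in another ~15%, nothing in the rest):

```python
# Average uplift across a game library, weighting each uplift by the
# fraction of games that actually gets it. Shares are the post's rough
# figures, not survey data.
def expected_uplift(cases):
    """cases: (fraction_of_library, uplift) pairs summing to 1.0."""
    return sum(frac * uplift for frac, uplift in cases)

single_gpu = expected_uplift([(1.00, 0.30)])
sli = expected_uplift([(0.50, 0.60), (0.15, 0.30), (0.35, 0.00)])

print(f"single-GPU upgrade: +{single_gpu:.1%} on average")
print(f"SLI:                +{sli:.1%} on average")
```

On raw averages SLI can still edge ahead, which is why the real argument against it is the stutter, latency and support churn rather than the mean frame rate.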
I don't want to be a downer or a negative nancy, but once you open those 1080 Ti's you may be beyond the point of no return and end up regretting going the SLI route.
The 3rd option is just waiting.
Nvidia is most certainly going to release a version of TU102 with an uncut die. We absolutely are going to see a Titan card this generation, and if history is any indication, we may see one before the end of the year. Titan cards always follow the main launch by 3 months. Watch them release a Titan card for $1200 that is 15% faster than 2080 Ti whilst simultaneously dropping the price of 2080 Ti, 2080 and 2070 by $100 like they did when they released 1080 Ti.
If you can wait further, there's a chance that we will see 7nm Volta late 2019, depending on what AMD does with Navi.
I'm probably going to wait. The only way I won't is if the 2080 Ti is at minimum 40% faster in the aforementioned and future titles (SOTR, Fallout 76 etc.) and I can attain one for $999. That is the only way. Otherwise I'm waiting.
i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703
https://www.3dmark.com/compare/fs/14520125/fs/11807761#
3D Vision may be possible in Batman: Arkham Knight soon, a single 1080 Ti might not cut it for 60 FPS @ 2560x1440, and the game has no SLI support.
Just another example.
I started playing this game again recently, and honestly, it's really a good game. It's probably the only game that I haven't uninstalled in the past 3 years, along with Planetside 2, another game that doesn't have SLI support (but honestly doesn't need it anymore).
https://forums.geforce.com/default/topic/846640/3d-vision/batman-arkham-knight-3d-report/40/
Doom Eternal is also going to use OpenGL again and also won't have SLI support. I mean, how many more good games need to come out that can't hit the FPS you want, whether in 3D Vision or in 2D (e.g. No Man's Sky)?
I'm also hearing that there is an issue with G-Sync and SLI?
https://forums.geforce.com/default/topic/987529/sli-and-gsync-do-gsync-works-while-sli-is-on-/
Though I remember you always had to hunt for the correct SLI bits to get the scaling up.
CoreX9, custom watercooling (Volkswagen Polo radiator)
i7-8700k @ 4.7
Titan X Pascal with shitty stock cooler
Win7/10
Video: passive 3D full HD, 3D @ 60Hz/channel, Denon X1200W / 2x HC5 / Geobox 501 -> eeColor boxes -> polarizers/Omega filters / custom-made silver screen
Occupation: Entrepreneur. Painting/surfacing/construction
Interests/skills:
3D gaming, 3D movies, 3D printing, drums, bass and guitar.
Suomi - FINLAND - perkele
https://youtu.be/DynYRiTXR28?t=20m
And Paul can't even pretend to be enthusiastic about this shite.
It's important to establish that the notion of 'value for money' is an entirely relative concept. For instance, buying a newly released games console and then being fairly selective about the games that you buy from then on is, by comparison, a fraction of the cost of PC gaming. Anybody who games on even a half-decent PC and thinks that they're getting true value for money needs their head examined. Entities like Steam and other online vendors have certainly put downward pressure on the cost of PC games, so that any 'perceived value' is then slowly clawed back over time, but as I say, it's all relative in my opinion.
Sora isn't just annoying, he's arrogant with it. SLI may well be dying for Sora, but it isn't for me. It's interesting that you did a short list of SLI and non-SLI games. It just so happens that I actually like and own most of the games in your former list, and have had no interest in purchasing any of the games in the latter, excluding Middle Earth: Shadow of War which I do own. On my current rig, that particular game performs at around 45-50fps with zero stuttering even though it's not using both GPUs, and much like GTA V, it's perfectly playable and stutter-free even below 60fps. I can honestly say that not one of my games stutter using SLI, and I've also been very careful as to which drivers I use. Both 331.82 on Windows 8 (weighted towards DX9) and 388.71 on Windows 7 (weighted towards DX11) have worked well for me. Incidentally, my racing game of choice is the GTR series from Simbin, which all perform well enough in SLI, and I reckon there's a decent chance that the upcoming GTR3 will do the same. I'll test Doom (2016) on my current pc, if only to remind myself as to how it performs, both with and without SLI in 3D.
My experience of SLI has been extremely good. My current rig that's still shown in my signature was distinctly overkill when I bought it almost 5 years ago, and @ 1080p I can still to this day play the games on your aforementioned list at 60fps in most instances, with a few graphical settings turned down here and there.
Fair play to you, in that you're obviously directly familiar with both the 780 Ti and the 1080 Ti, which in my opinion are the best price/performance ratio high-end cards that Nvidia have produced. I do realise that a single 1080 Ti should suffice at 1440p, but there'll always be some games that are more demanding than others, and therefore would need an extra boost to get them past that 60fps bare minimum. I'm hedging that there's more perceived price/performance value to be had with a 1080 Ti as opposed to a 2080 Ti, so to me, getting a second 1080 Ti is money well spent at that price point. It's important to stress that word because it is often a matter of perception. I had briefly considered getting 2 1080 Ti cards with water-blocks, but the overall cost didn't justify buying 2 cards at this point in time, so I'm going with air-cooled this time around with no additional OC until that's actually needed.
You say that in your own experience, many games aren't demanding enough using a single 1080 Ti and that's fair enough. That's what should be expected with a brand new or fairly new GPU. However, I'm thinking as to how my system will perform 4 to 5 years from now with 2 GPUs. I'm quite happy to admit that having a second GPU is an indulgence and a gamble, so I'd imagine you certainly wouldn't appreciate me putting a 1050 Ti in as an additional dedicated PhysX renderer, which I'd certainly consider, depending entirely upon just how much my 2 1080 Ti GPUs are graphically taxed in the future or if I ever changed to 3D Surround. With 2 GTX Titans (not Titan X's or Titan XP's mind you) my system can still punch well above its weight @ 1080p in PhysX games such as Killing Floor 2, Warframe, Lords of the Fallen and the entire Metro series, all running at 60fps or above. The only advantage that the original Titan had over the 780 Ti was the amount of VRAM it had, which is why I never felt the need to replace them.
I have no interest in RTX GPUs at 12nm. If (and it's a big if) the performance benchmarks on the 2080 Ti aren't as good as expected, the 1080 Ti may well be in higher demand as a comparatively cheaper alternative, putting upward pressure on its price, which is why I'm buying now rather than waiting for the results to come out. Also bear in mind that the price of a 1080 Ti, along with numerous other GPUs, spiked to around twice its original retail value earlier in the year due to mining and crypto-currency pressures. So again, it makes sense to get two now at their current comparative price.
I personally haven't needed to resort to using AFR2 in any of the Metro games to improve SLI scaling at 1080p, unless you're saying that that's a problem that you've experienced at 1440p. Again with Witcher 3, I'm sorry to hear that you've had problems with it, but my experience with Witcher 3 in SLI has been excellent with no problems in 3D at all. In fact, I was shocked at how well it ran in 3D. I expected a huge performance loss on my system, and I fully expected not to be able to play it in 3D. As it turns out, in Novigrad for instance, I still get around 50fps @1080p. It's not ideal, but it's still perfectly playable. The only game so far that I've had to resort to turning the resolution down just a single notch, was ELEX. Once again SLI does work with that game too, at least it does for me. Admittedly, the scaling may not be fantastic, but SLI enables ELEX to run at a resolution and a frame rate that isn't jarring to the point of hindering gameplay. In fact graphically it still looks really good in 3D even at a slightly lower resolution.
Waiting any further isn't an option for me, in that I've waited almost 5 years already. 1080 Tis are the ones for me, fully acknowledging what you've said about the possible emergence of an RTX Titan this year. I know it's a genuine concern on your part given your own experiences with SLI and 3D, but I guess I've been that much luckier with that combination. I'm really not expecting future games to support SLI, except perhaps Star Citizen and Cyberpunk 2077, which in that case would definitely piss me off. Other than that though, I regard it as being a great bonus and nothing more. I've got a Vive headset that I haven't even taken out of its box yet, which I purchased 2 years ago, so that's another incentive for me not to wait any longer.
I'm really not too fussed about G-Sync to be honest, assuming there is actually an issue with it. I'd disable it, just to retain SLI if it ever came to it.
For your sake, I obviously hope that Volta is eventually worth your wait. Seeing as there's a low probability of a 4K 3D monitor ever appearing on the horizon, that would probably be the only thing that could actually justify me waiting any longer for a 7nm GPU, let alone two of them. So basically, we couldn't be more opposite on this issue, but there you go. I somehow suspect that whenever these upcoming benchmarks do appear, this forum might just provide a little extra entertainment value all by itself.
Intel Core i7 4770k @ 4.4Ghz, 3x GTX Titan, 16GB Tactical Tracer LED, CPU/GPU Dual-Loop Water-Cooled - Driver 331.82 (Win8.0), Driver 388.71 (Win7), DX11.0
harisukro: "You sir, are 'Steely Eyed Missile Man'" (Quote from Apollo 13)
You waited 5 years, you could have waited another 5 months!
I would send them back!
Are you going to try to push 1080 Ti SLI with a 4770k? Because I can tell you right now that the 4930k @ 4.5 GHz in my previous system was a bottleneck with a single 1080 Ti.
Oh and G-Sync is legit.
It would be a hard decision choosing between G-Sync and +80-90% SLI performance in a game, unless the game in question was a 3D Vision title and G-Sync didn't work in it. Way less latency, zero stutter or micro-stutter. If you haven't experienced variable refresh rate monitor technology yet, you honestly don't know what you're missing.
Get this, they delayed the 2080 review NDA to the 17th and the 2080 Ti review to the 19th!
I mean, to me it's rather ridiculous that there is an NDA to begin with, but this is problematic for a few reasons.
It could mean that they want to shorten the window of time consumers have between seeing abysmal rasterization performance improvements and cancelling their orders, and it could also mean that they are desperately trying to wring as much performance as they can out of the drivers for DX11 benchmarks but aren't really making any headway.
Either way, this is a rather ominous sign that the leaked Time Spy benches we've seen thus far are going to be more or less what we can expect on the DX11 side of things as well: an overclocked 2080 that is 3% slower than an overclocked 1080 Ti, and a 2080 Ti, presumably at only +90 MHz (but maybe with another 5% headroom on top of that; Peterson has stated on record that they will do 2100 MHz and that's about it), that is only 18% faster than an overclocked 1080 Ti.
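Just to lay out what those leaked percentages imply side by side (hypothetical index values, taking the leaked figures at face value):

```python
# Back-of-the-envelope reading of the leaked Time Spy numbers above.
# Baseline: an overclocked 1080 Ti = 100 (arbitrary index units).
# All inputs are the leaked/claimed percentages, not measurements.

oc_1080ti = 100.0
oc_2080   = oc_1080ti * (1 - 0.03)   # leaked: ~3% slower than OC 1080 Ti
oc_2080ti = oc_1080ti * (1 + 0.18)   # leaked: ~18% faster than OC 1080 Ti

# Hypothetical extra ~5% clock headroom on the 2080 Ti (the ~2100 MHz claim)
oc_2080ti_max = oc_2080ti * 1.05

print(f"OC 2080:          {oc_2080:.0f}")
print(f"OC 2080 Ti:       {oc_2080ti:.0f}")
print(f"OC 2080 Ti (+5%): {oc_2080ti_max:.0f}")
```

Even with the optimistic headroom, the 2080 Ti would land around index 124, i.e. under +25% over an overclocked 1080 Ti, which is the gap the whole pricing argument hinges on.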
This is absolutely not good. As I said, I don't know why there is a review NDA at all, unless you want to sell something to consumers without them making an informed decision. That Shadow of the Tomb Raider, which also launches around the same time, will have ray tracing disabled only buttresses the perception that they absolutely want to hide both rasterization and ray tracing performance with this launch.
https://www.reddit.com/r/nvidia/comments/9d58y4/nvidia_geforce_rtx_2080_reviews_go_live_on/
Having a look at the collective sentiment there, man, this thing does not look good at all. I can see no other reason for Nvidia delaying the review NDA right up until launch, unless they are hoping that a percentage of consumers don't cancel in time and are too lazy to physically return the product once they receive it.
That the reviewers who will actually post reviews one or two days before launch all had to go through a stringent "selection" process only makes this look that much worse:
https://www.overclock.net/forum/225-hardware-news/1707000-hardocp-nvidia-controls-aib-launch-driver-distribution.html
https://www.hardocp.com/article/2018/08/28/nvidia_controls_aib_launch_driver_distribution/
"First and foremost, NVIDIA has demanded that its AIBs tell NVIDIA who will be reviewing the AIB's custom RTX 2080 and 2080 Ti cards. We were forwarded emails from other reviewers, from the AIBs that were asking specifically, at NVIDIA's direction, "Who will be performing the review content?" "What is that person's phone number and email address?" That is a bit odd, as we have never seen this before in 20 years of reviewing video cards. AIBs in the past have been left to pretty much operate their own review campaigns on new video cards, but that seems to have come to an end. From these lists of reviewers submitted to NVIDIA by the AIBs, NVIDIA has put together its own list of "approved reviewers," and sent their approved list back to the AIBs in order to let them know who they are allowed to sample review cards to. Much like NVIDIA exerted control over AIB's and OEM's brands with GPP, it is now exerting control over who the AIB has review its own custom cards.
This is where it gets a bit more interesting, and likely should give you concern with any leaked benchmarks you see on the web. NVIDIA is not allowing its AIBs to distribute drivers with their review cards. For a reviewer to have access, he must first sign NVIDIA's multi-year NDA (which is fine if you are "just" a card reviewer), then he will log into a protected site which is most likely a secured version of GeForce Experience in order to obtain the driver, and download from there into a specific machine with the new RTX card being present."
LOL, there is also the issue of the LACK of DirectX 12 games, ROFL.
DX12 is the most pointless DX release. First we waited many years for some games, and
then the few that exist benchmark about the same in DX11 as in DX12.
I'm sure you're already aware that DX12 has multi-GPU instead. Again, it must be supported, but so must SLI.
All the early DX12 games were a mess anyway, AMD promoted it because of Async compute but more often than not, DX12 just dropped the framerates for no gain, particularly for Nvidia users.
I don't think I have a single game that is Dx12 only, so if I need to I just use the better performing DX11 path and SLI where applicable.
Admittedly, once we're in a DX12-only world the landscape will change again, but it's not clear how. Clearly there are plenty of examples of DX12 games with no mGPU support, but there are others, such as Sniper Elite 4 and Gears of War 4, that work brilliantly.
Going forwards, it will all be down to the 'big' game engines and whether they support it or not.
Gigabyte RTX 2080 Ti Gaming OC, i7-6700k ~ 4.4GHz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D Vision kit, Xpand X104 glasses, Corsair HX1000i, Win 10 Pro 64 / Win 7 64
https://www.3dmark.com/fs/9529310
I agree, just pointing it out. And Rise of the Tomb Raider runs considerably smoother in DX12 vs DX11 in problematic areas, e.g. Geothermal Valley, on both my 8700k and 4930k; unfortunately that's 2D only.
But yeah, in most other games DX12 has been a joke; in BF1 it's garbage. I don't know what happened; I figured DX12 would be more widespread by now. Maybe when the next consoles release and Microsoft starts developing their next-gen games solely on DX12.
Well, if you have a multi-core CPU and are experiencing a bottleneck that benefits from DX12, say in Rise of the Tomb Raider, then the question becomes: do you want to disable SLI and run DX12 to deal with the CPU bottleneck?
And Gears of War 4 is NOT a good example.
It was released in Oct. 2016 and SLI support didn't arrive until May 1st, 2017. I mean, yeah, some of us are playing games 6 months after release, but if it's a multiplayer title, say Titanfall 2, Star Wars: Battlefront or Call of Duty: Black Ops 3, 6 months can be the end of the game's multiplayer life-cycle. So by the time you can actually run the game at an appreciable frame rate, the player base has dwindled to nearly nothing.
And that's exactly what I'm talking about. SLI just straight up sucks. It's great for a few niche titles that support it, be it Fallout 4, Rise of the Tomb Raider, Crysis 3 etc. but if it's a multiplayer game, if there is no implementation out of the gate it usually takes them months to implement it and by the time they do in many cases the game is dead.
This was exactly my experience in 2014. I told everyone in various places on the web that I was moving from a laptop to a desktop and going with a single 780 Ti. I was told to wait for Maxwell, as this was literally 6 months before the release of the next architecture. I told them I couldn't wait, I'd waited long enough, and that I would build a PC now (primarily to get away from 680M SLI, lol). So I did, and inevitably, after picking up my Swift PG278Q, falling in love with 3D Vision and realizing that one 780 Ti wasn't enough to push Assassin's Creed: Black Flag, Tomb Raider (2013) etc., I picked up another 780 Ti for SLI. And yeah, sure, those two games ran great, but then it was one problem after another. Dragon Age: Inquisition was released, and there was atrocious stutter with SLI. Batman: Arkham Knight was released: zero SLI support. COD: Black Ops 3: no TAA support with SLI. Planetside 2: no proper SLI support (and a single 780 Ti was challenged by that game back then).
I was completely relieved to pull that out in favor of a 980 Ti. I've been with CrossFire and SLI since I got into PC gaming: ATI 5870M CrossFire, GTX 580M SLI, GTX 680M SLI, GTX 780 Ti SLI. SLI just straight up sucks.
And going from SLI to a single GPU, there was a noticeable improvement in smoothness that's hard to describe. It's not even micro-stutter; it's like latency, frame-timing and G-Sync all being impacted together. A single GPU is absolutely better than SLI, 100%, hands down. Also worth mentioning: two 300W GPUs in SLI will heat up your room really quickly. If you don't have great AC, or it isn't naturally cool where you live, you need to think about this before adding a second 300W GPU.
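The room-heating point is easy to quantify, since essentially all the electrical power a GPU draws ends up as heat in the room (300W per card here is an assumed round figure for a heavily overclocked high-end card under load):

```python
# Rough heat-output estimate for an SLI setup under load.
# Essentially 100% of a GPU's power draw becomes heat in the room.

def watts_to_btu_per_hr(watts):
    """Convert electrical power to heat output; 1 W ~= 3.412 BTU/hr."""
    return watts * 3.412

two_gpus_watts = 2 * 300  # assumed: two ~300 W cards fully loaded
print(f"{two_gpus_watts} W ~= {watts_to_btu_per_hr(two_gpus_watts):.0f} BTU/hr")
```

That's roughly 2000 BTU/hr from the graphics cards alone, in the same ballpark as a small space heater running continuously, before counting the CPU and the rest of the system.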
So when I see people making the same mistake I did way back in 2014, I am trying to warn them, pointing to my experience. And SLI has only gotten worse, much worse since then.