i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703
[quote="xXxStarManxXx"]
[quote="rustyk21"][quote="xXxStarManxXx"]There's also the issue of DX12 titles not supporting SLI.
...
[/quote]
I'm sure you're already aware that DX12 has multi gpu instead. Again, it must be supported, but so must SLI.
All the early DX12 games were a mess anyway, AMD promoted it because of Async compute but more often than not, DX12 just dropped the framerates for no gain, particularly for Nvidia users.
I don't think I have a single game that is Dx12 only, so if I need to I just use the better performing DX11 path and SLI where applicable.
Admittedly once we're in a DX12 only world the landscape will change again, but it's not clear how. Clearly there are plenty of examples of DX12 games that have no mgpu support, but there are others such as Sniper Elite 4 and Gears of War 4 that work brilliantly.
Going forwards, it will all be down to the 'big' game engines and whether they support it or not.[/quote]
Well, if you have a multi-core CPU and are experiencing a CPU bottleneck that DX12 relieves, say in Rise of the Tomb Raider, then the question becomes: do you want to disable SLI for DX12 to deal with the CPU bottleneck?
And Gears of War 4, this is NOT a good example.
......
So when I see people making the same mistake I did way back in 2014, I am trying to warn them, pointing to my experience. And SLI has only gotten worse, much worse since then.
[/quote]
So I agree with you 100% that SLI often isn't there when games are released; it's definitely a major factor, and it's why SLI is definitely not plug-and-play and is more of an enthusiast's thing. I only removed that text to try and make this post shorter!
If you expect it to work on day 1, then yes, at the moment it's often overlooked and sometimes never works.
I also agree that for this reason and others, any single card with comparable performance will always be a better option than SLI, for all sorts of reasons.
BUT.. yes, you knew it was coming.
1. You basically said it doesn't work in DX12. But DX12 can scale with multiple cards. It's the scaling that's the point really, isn't it?
2. Gears was given as an example of brilliant scaling. The fact that it was actually patched back in *could* be seen as a good thing, because it means it's still supported and there's still a demand.
3. ROTR actually scales brilliantly with multiple cards in DX12, so there is no choice to make.
4. I appreciate your experiences, but you need to understand that there are plenty of other people, especially on this forum, who know exactly what they're talking about. You don't need to warn us off; we have our own experiences, do our own research and draw our own conclusions.
5. As a value proposition, all of this and what you've raised is relevant to SLI/mgpu, but see point 4 and relate it to the actual topic we were originally discussing. That's my main point.
Gigabyte RTX2080TI Gaming OC, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64 | https://www.3dmark.com/fs/9529310
https://www.youtube.com/watch?v=bzJpNh21sds&t=210s
@ 3:27 mark "Out of 9 games 3 games we tested in DX12 scaled very well"
The entire video is basically an epitaph for SLI.
Deus Ex Mankind Divided:
Single 1080 Ti: 67 FPS
1080 Ti SLI: 85 FPS
That's a big fat fail. This isn't even 30% scaling (watch the video from the beginning; there are far, far worse). And these are just raw framerate differences: there's no analysis of frame timing, how G-Sync is impacted, or micro-stutter.
14 out of 25 games exhibit some scaling, but if you watch the video, with rare exceptions (Firestrike, Fallout 4, Rise of the Tomb Raider) the scaling is absolutely unacceptable. I'm talking like another 10 FPS; these probably shouldn't even be counted as scaling. Step back and 11 out of 25 games have NO scaling at all, and in maybe three quarters of the games that do scale, the gains are a complete joke. And again, these are only raw framerates: no talk of frame timing, micro-stutter, or the loss / degradation of G-Sync.
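To put a number on it, here's a quick sanity check of what that "scaling" actually amounts to (my own throwaway script; the FPS figures are the ones from the video quoted above):

```python
def sli_scaling(single_fps, sli_fps):
    """Return the percentage FPS gain the second card actually delivers."""
    return (sli_fps / single_fps - 1.0) * 100.0

# Deus Ex: Mankind Divided numbers from the video: 67 FPS single, 85 FPS SLI.
gain = sli_scaling(67, 85)
print(f"Deus Ex MD: {gain:.1f}% gain from the second 1080 Ti")
# A second card costs ~100% more money but returns well under 30% more frames here.
```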
I'm framing this argument against someone who is contemplating buying two 1080 Ti's outright for $1300+ right now as an alternative to a 2080 Ti, not necessarily someone who already has a 1080 Ti and who may be contemplating adding a second one for $550 or so (I passed on one for $600 and it included an EK waterblock and backplate, some $900 of value easily).
Honestly, I wouldn't even pay $550 for this. This is unacceptable. I would wait until next year if I felt I absolutely had to upgrade.
Telling yourself that you're upgrading, paying a bunch of money for it, only to realize that SLI is, as many have noted, in hospice, and then experiencing performance like in the video above: you're going to have extreme remorse, and you'll have $550-700+ of sunk expenditure that could have been saved towards a faster card in 2019.
Hell, if you can wait, I'm positive the next 70-series card on 7nm will be considerably faster than the 2070, and if Nvidia gets enough pushback (see the straw poll above: 80% are not pre-ordering, meaning they are only tricking the total morons) and AMD gets somewhere with Navi as an alternative, then we may see prices come down next generation.
Don't think it will happen?
Look at how Intel responded to Ryzen.
The consumer base had had enough of Intel's bullshit and they had to get real and bring the price / performance ratio back down to sane, acceptable levels.
xXxStarManxXx, for what it's worth, I do appreciate a lot of your anti-SLi sentiment.
I just want to point out that what you might see as absolute truths, especially from youtubers, are not. One has to remember that their primary purpose is infotainment, not great journalism, nor a great deal of understanding.
The majority, if not all, of them do many stupid things, such as benchmarking CPUs at high resolutions instead of the lowest possible resolution, thereby benchmarking the GPU instead of the CPU; or benchmarking memory in GPU-limited scenarios... laughable really...
And of course, benchmarking GPUs in CPU limited scenarios...
1. The majority of SLi benchmarks which show lacklustre performance (non-SLI-compatible games excepted) show bad scaling not because of bad SLi, but because the CPU has become the bottleneck - a single 1080 Ti is simply too powerful to show the muscle of two 1080 Tis in SLi. Yes, even at 4K, and especially in 2D.
For a better comparison, one has to look at either:
a. 8K+ 1080 Ti SLI benchmarks
or
b. a lower tier graphics card, say a 970 SLi setup running games at 4k.
Luckily, techpowerup has a combined graph of ~17 games with the following representation:
[img]https://tpucdn.com/reviews/NVIDIA/GeForce_GTX_970_SLI/images/perfrel_3840.gif[/img]
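The bottleneck argument above can be sketched as a toy model (entirely my own illustration; the function name and the 0.85 scaling factor are made-up placeholders): the framerate you see is capped by whichever of the CPU or the GPUs finishes a frame last.

```python
def effective_fps(cpu_cap_fps, single_gpu_fps, num_gpus, scaling_per_gpu=0.85):
    """FPS you actually observe: the GPUs' combined output, clamped by the CPU."""
    gpu_fps = single_gpu_fps * (1 + (num_gpus - 1) * scaling_per_gpu)
    return min(cpu_cap_fps, gpu_fps)

# A 1080 Ti pair that could render ~148 FPS still shows 100 FPS if the CPU
# tops out at 100 - the "bad scaling" in the benchmark is the CPU, not SLi.
print(effective_fps(cpu_cap_fps=100, single_gpu_fps=80, num_gpus=2))  # 100
print(effective_fps(cpu_cap_fps=100, single_gpu_fps=80, num_gpus=1))  # 80
```

This is why 8K benchmarks, or weaker cards like a 970 pair at 4K, reveal the real scaling: the GPU term stays far below the CPU cap.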
2. 3DV requires double the pixels, so playing at 1440p in 3DV is actually almost the same load as playing 4K in 2D:
2560x1440 x2 (3DV) = 7,372,800 pixels.
3840x2160 (4K) = 8,294,400 pixels.
Therefore, 1440p 3DV with SLi will show great scaling on powerful GPUs, equivalent to the scaling shown at double the resolution, i.e. in this case, 4K.
This means that comparing 2D performance and mocking 1440p scaling, especially with a fake performance cap that is CPU-limited, as in the case of 2x 1080 Ti in SLi even at 4K, is neither a fair nor a correct comparison to make.
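The pixel arithmetic above is easy to verify (trivial script; nothing assumed beyond the resolutions already quoted):

```python
def pixels(width, height, eyes=1):
    # Total pixels rendered per frame; eyes=2 for stereo 3D Vision.
    return width * height * eyes

p_3dv_1440 = pixels(2560, 1440, eyes=2)  # 7,372,800
p_4k_2d = pixels(3840, 2160)             # 8,294,400
print(p_3dv_1440, p_4k_2d, f"ratio {p_3dv_1440 / p_4k_2d:.0%}")
# 1440p stereo is ~89% of the pixel load of 4K 2D.
```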
3. I appreciate what you are saying about getting a single faster card vs 2 in SLi. Unfortunately, the reality of the matter is that most of us are already using the highest or near the highest performing cards - we simply do not have the luxury of getting a single faster card with a good return on investment, i.e. performance vs. price.
4. Since nVidia have pretty much confirmed that 2080Ti performance shall be 1.4x 1080Ti performance (maybe even at best), Tothepoint2's purchase decision betting on the ~1.7x+ performance gain of 2x 1080Ti in SLi in many supported games is a justifiably correct choice for the same money.
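For anyone who wants to plug in their own numbers, a back-of-the-envelope perf-per-dollar check (the 1.4x and 1.7x multipliers are the ones discussed in point 4; the prices are illustrative placeholders, not quotes):

```python
def perf_per_dollar(relative_perf, price_usd):
    """Relative performance per dollar spent; higher is better value."""
    return relative_perf / price_usd

single_2080ti = perf_per_dollar(1.4, 1200)  # hypothetical 2080 Ti price
sli_1080ti = perf_per_dollar(1.7, 2 * 550)  # two used 1080 Tis, well-scaling game
print(f"2080 Ti: {single_2080ti:.2e} perf/$ | 1080 Ti SLI: {sli_1080ti:.2e} perf/$")
```

Of course this only holds in titles where SLi actually delivers the ~1.7x; in an unsupported game the second card contributes nothing.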
All the best.
P.S. Might I humbly suggest that we all make an effort to split text up into many paragraphs, or replace them as much as we can with pictures? As they say - a picture is worth a thousand words.
Otherwise, posts become a wall of text which no-one really wants to read. Of course, I might be being a tad biased as English is my third language ¯\_(ツ)_/¯
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
@xXxStarManxXx
Evidently, you hadn't read what I'd written.
1) I'm putting together an entirely new system from scratch.
2) An i7-6850k is my processor of choice, giving a potential total of 3 GPUs access to 40 PCI-E lanes in a 16/16/8 configuration, on an Asus X-99 Deluxe II motherboard.
3) The i7-6850k is specifically designed with multiple GPU usage in mind.
4) Using the i7-6850k, SLI has been proven to gain benefits both in terms of its performance and in terms of its scaling, resulting in a minimum 10fps increase over and above a comparative SLI setup using either an i7-6700k, an i7-7700k or an i7-8700k.
5) NVidia haven't received a penny from me since 2013. Would you care to mention just how much you've spent on NVidia GPUs over the same period please?
6) I'm not here telling anybody either, how, when, or if, to spend their own money, but you are.
7) You had kept on buying GPUs in SLI configurations that were too powerful for your CPU to handle, causing you endless stuttering and micro-stuttering issues, whereas I have not. That is certainly not NVidia's fault.
8) Another case in point: Dragon Age: Inquisition (3D + SLI = No stuttering on my system. It is the only game that I currently own that is heavily GPU bound, with frame rates dipping to 35fps at times. The only one. It runs slowly, but with zero stuttering. Unlike your own stuttering issues with this self-same title.)
9) I'm quite sure that SLI does suck regarding multiplayer gaming, but then I don't play multi-player games.
I've had to come to the conclusion that unfortunately xXxStarManxXx is making the mistake of going on to forums and arguing a point continually in the vain attempt to make people agree. He thinks he's right, therefore everyone else is either just wrong or ignorant of the facts.
@xXxStarManxXx
I've already said I agree with some of what you say, but you seem to think that you know better than everyone else, and you just don't. It's a shame, because quite a bit of what you've said is good and factual, but then you keep evidencing your own confirmation bias by cherry-picking quotes and links.
Unfortunately it doesn't work like that. Everyone has a different perspective and each of us is a totally different use case for SLI, and indeed for the 2xxx series.
Like I alluded to earlier, I can speak for most when I say he's not telling us anything we don't already know, but because he doesn't think it's worth it, he is unable to see why it's worth it for anyone else.
I could go into pages of posts and analysis of why SLI is worth it for me, but he'd just ignore it and post another poll or youtube video.
Anyway, can't wait for the reviews of the new cards!!
xXxStarManxXx = Vulcan1978
Vulcan1978 is the guy that claimed that helifax stated that 3Dmigoto supported DX12 and then went on to tell him that he needed to work on his English. This is in regard to Rise of the Tomb Raider.
[quote="xXxStarManxXx"][quote="helifax"][quote="didierh"]sorry helifax, big mistake, I'm going to try your fix, sure it's better :-([/quote]
Heh, nothing to worry about there. Problem is the one sentence you wrote there is not related to our fix. Although is interesting and I actually managed to make DX12 work with 3D Vision Automatic (not 3D Vision Direct which is DISABLED from their game if you use DX12) everything is extremely broken!
The official fix is also worse in terms of stereo 3D compared to our community one. There is a whole thread here related to this:
[url]https://forums.geforce.com/default/topic/919928/3d-vision/rise-of-the-tomb-raider-2016-3d-vision-fix/[/url]
In some pages (in the beginning you can actually see what is the difference)
Hope this helps:)[/quote]
Hey dude, no offense, but you need to work on your English. Being forced to use Build 610 or earlier, THERE IS NO DX12 OPTION IN THE LAUNCHER OR IN THE GAME.
I will write in all caps, maybe you can read slowly and respond clearly:
WE ASKED YOU IF YOUR FIX WORKS WITH DX12. APPARENTLY IT DOES NOT.
Your fix SUCKS dude. I did everything correctly, down to the last letter, build 610, install your .nip, install your files to the game's .exe location. Poor Lara Croft's hair is rendering in three different locations and my FPS went from 47 in a certain area that I WANTED TO REVISIT WITH YOUR FIX AND DX12 ENABLED BECAUSE YOU ARE CLEARLY STATING THAT DX12 WORKS WITH YOUR FIX WHEN IT DOESN'T, and now it's 32 FPS with Lara's hair rendering in three different points in space.
So dude, thanks for all of the work, but you just wasted a whole bunch of my time BECAUSE APPARENTLY ENGLISH IS NOT YOUR PRIMARY LANGUAGE.
No offense.
But when we come on here and ask you very clearly.
"DOES YOUR FIX WORK WITH DX12?"
I mean, is this confusing to you?
Now I have to go and un-F U C K my game. Thanks for wasting my time. [/quote]
Hmm, nice guy, so rational and polite
Regarding Turing, it seems that the consumer GPUs will be using GDDR6 from Micron, whereas the professional GPUs will be using the faster GDDR6 from Samsung.
[quote="RAGEdemon"]xXxStarManxXx, for what it's worth, I do appreciate a lot of your anti-SLi sentiment.
I just want to point out that what you might see as absolute truths, especially from youtubers, are not. One has to remember that their primary purpose is infotainment, not really great journalism, nor a great deal of understanding.
The majority, if not all, of them do many stupid things, such as benchmark CPUs at high resolutions instead of the lowest possible resolution thereby benchmarking the GPU instead of the CPU; or benchmark memory in GPU limited scenarios... laughable really...
And of course, benchmarking GPUs in CPU limited scenarios...
1. The majority of SLi benchmarks which show lacklustre performance (non-SLI compatible games excepted), show bad scaling not because of bad SLi, but because the CPU has become the bottleneck - a single 1080 Ti is simply too powerful to show the muscle of 2 1080 Ti in SLi. Yes, even at 4K, and especially in 2D.
For a better comparison, one has to look at either:
a. 8K+ 1080 Ti SLI benchmarks
or
b. a lower tier graphics card, say a 970 SLi setup running games at 4k.
Luckily, techpowerup has a combined graph of ~17 games with the following representation:
[img]https://tpucdn.com/reviews/NVIDIA/GeForce_GTX_970_SLI/images/perfrel_3840.gif[/img]
2. SLi requires double the pixels, so playing at 1440 3DV is actually almost the same as playing 4k in 2D:
2560x1440 x2 (3DV) = 7,372,800 pixels.
3840x2160 (4K) = 8,294,400 pixels.
Therefore, 1440 3DV with SLi will show great scaling on powerful GPUs literally equivalent to scaling shown in double the resolution, i.e in this case, 4K.
This means that comparing 2D performance and mocking 1440p scaling, especially with a fake performance cap which is CPU limited, as in the case of 2x 1080Ti in SLi even at 4k, isn't a fair, nor a correct comparison to make.
3. I appreciate what you are saying about getting a single faster card vs 2 in SLi. Unfortunately, the reality of the matter is that most of us are already using the highest or near the highest performing cards - we simply do not have the luxury of getting a single faster card with a good return on investment, i.e. performance vs. price.
4. Since nVidia have pretty much confirmed that the 2080Ti performance shall be 1.4x 1080Ti performance (maybe even at best), Tothepoint2's purchase decision betting the ~1.7x+ performance gain with 2x 1080Ti in SLi in many supported games is a justifiably correct choice for the same money.
All the best.
P.S. Might I humbly suggest that we all make an effort to split text up into many paragraphs, or replace them as much as we can with pictures? As they say - a picture is worth a thousand words.
Otherwise, posts become a wall of text which no-one really wants to read. Of course, I might be being a tad biased as English is my third language ¯\_(ツ)_/¯ [/quote]
If you actually watch the video, the framerate comparison was done at both 1440p and 4K, both of which mostly negate a CPU bottleneck, with 4K pretty much ruling one out. Tech of Tomorrow's presentation style is different, but he's been around for a long time and is trustworthy.
I am attempting to point out that spending up for another GPU, be it another 1070, 1080, or 1080 Ti, is actually maladaptive: you're better off SAVING the cost of the additional GPU UNTIL you can afford a faster single card. That can mean selling your 1070 and putting the proceeds towards a new or used 1080 Ti, or saving that $500 or whatever you have on hand until you can swing a used 2080 in 2019, or, if you have enough, a new GPU on the 7nm node in late 2019 for $700. All of these are vastly superior decisions to buying a second GPU for SLI.
The diagnosis is in.
SLI is comatose and not recovering.
You might think you're being smart: "Oh hey, look, I have a faster 3DMark synthetic score than a 2080 Ti and Fallout 4 runs great!" I hope you like Fallout 4, because aside from a handful of games that exhibit good scaling, that's what you're going to be limited to playing with that second GPU.
Assassins Creed: Origins, No SLI support
Doom 2016 and Doom Eternal, No SLI support
No Man's Sky, No SLI support
Batman: Arkham Knight, No SLI
Middle Earth: Shadow of War, No SLI
The Evil Within, 1 and 2, No SLI
Forza Horizon 3 and 4 and FM7, No SLI
Only to name a few.
[quote="Tothepoint2"]@xXxStarManxXx
Evidently, you hadn't read what I'd written.
1) I'm putting together an entirely new system from scratch.
2) An i7-6850k is my processor of choice, giving a potential total of 3 GPUs access to 40 PCI-E lanes in a 16/16/8 configuration, on an Asus X-99 Deluxe II motherboard.
3) The i7-6850k is specifically designed with multiple GPU usage in mind.
4) Using the i7-6850k, SLI has been proven to gain benefits both in terms of its performance and in terms of its scaling, resulting in a minimum 10fps increase over and above a comparative SLI setup using either an i7-6700k, an i7-7700k or an i7-8700k.
5) NVidia haven't received a penny from me since 2013. Would you care to mention just how much you've spent on NVidia GPUs over the same period please?
6) I'm not here telling anybody either, how, when, or if, to spend their own money, but you are.
7) You had kept on buying GPUs in SLI configurations that were too powerful for your CPU to handle, causing you endless stuttering and micro-stuttering issues, whereas I have not. That is certainly not NVidia's fault.
8) Another case in point: Dragon Age: Inquisition (3D + SLI = no stuttering on my system. It is the only game that I currently own that is heavily GPU bound, with frame rates dipping to 35fps at times. The only one. It runs slowly, but with zero stuttering. Unlike your own stuttering issues with this self-same title.)
9) I'm quite sure that SLI does suck regarding multiplayer gaming, but then I don't play multi-player games.[/quote]
I was unaware that ATI 5870 X-Fire, 580M SLI, and 680M SLI were powerful graphics cards. And an i7 3920xm @ 4.4 GHz is totally a CPU bottleneck:
https://www.3dmark.com/fs/9382040
As is a 4930k @ 4.5 GHz with both cards x16/x16:
https://www.3dmark.com/fs/5333285
And you're buying all of this new in 2018? Dude, if you want x16/x16, Gigabyte is releasing a Z390 SLI motherboard for the 9th-gen Coffee Lake 9900K with the 1st and 2nd GPU slots at x16:
https://hothardware.com/news/gigabyte-z390-gaming-sli-motherboard-9th-gen-coffee-lake-cpus
You literally can't wait a few months? The i7-6850K is immediately non-future-proof. I mean, dude, do your research before opening your wallet?
[quote="rustyk21"]I've had to come to the conclusion that unfortunately xXxStarManxXx is making the mistake of going on to forums and arguing a point continually in the vain attempt to make people agree. He thinks he's right, therefore everyone else is either just wrong or ignorant of the facts.
@xXxStarManxXx
I've already said I agree with some of what you say, but you seem to think that you know better than everyone else, and you just don't. It's a shame, because quite a bit of what you've said is good and factual, but then you keep evidencing your own confirmation bias by cherry-picking quotes and links.
Unfortunately it doesn't work like that. Everyone has a different perspective and each of us is a totally different use case for SLI, and indeed for the 2xxx series.
Like I alluded to earlier, I can speak for most when I say he's not telling us anything we don't already know, but because he doesn't think it's worth it, he is unable to see why it's worth it for anyone else.
I could go into pages of posts and analysis of why SLI is worth it for me, but he'd just ignore it and post another poll or youtube video.
Anyway, can't wait for the reviews of the new cards!!
[/quote]
[quote="D-Man11"]xXxStarManxXx = Vulcan1978
Vulcan1978 is the guy that claimed that helifax stated that 3Dmigoto supported dx12 and then went on to tell him that he needed to work on his English. This is in regards to Rise of the Tomb Raider
[quote="xXxStarManxXx"][quote="helifax"][quote="didierh"]sorry helifax, big mistake, I'm going to try your fix, sure it's better :-([/quote]
Heh, nothing to worry about there. Problem is the one sentence you wrote there is not related to our fix. Although is interesting and I actually managed to make DX12 work with 3D Vision Automatic (not 3D Vision Direct which is DISABLED from their game if you use DX12) everything is extremely broken!
The official fix is also worse in terms of stereo 3D compared to our community one. There is a whole thread here related to this:
[url]https://forums.geforce.com/default/topic/919928/3d-vision/rise-of-the-tomb-raider-2016-3d-vision-fix/[/url]
In some pages (in the beginning you can actually see what is the difference)
Hope this helps:)[/quote]
Hey dude, no offense, but you need to work on your English. Being forced to use Build 610 or earlier, THERE IS NO DX12 OPTION IN THE LAUNCHER OR IN THE GAME.
I will write in all caps, maybe you can read slowly and respond clearly:
WE ASKED YOU IF YOUR FIX WORKS WITH DX12. APPARENTLY IT DOES NOT.
Your fix SUCKS dude. I did everything correctly, down to the last letter: build 610, install your .nip, install your files to the game's .exe location. Poor Lara Croft's hair is rendering in three different locations and my FPS went from 47 in a certain area that I WANTED TO REVISIT WITH YOUR FIX AND DX12 ENABLED BECAUSE YOU ARE CLEARLY STATING THAT DX12 WORKS WITH YOUR FIX WHEN IT DOESN'T, and now it's 32 FPS with Lara's hair rendering in three different points in space.
So dude, thanks for all of the work, but you just wasted a whole bunch of my time BECAUSE APPARENTLY ENGLISH IS NOT YOUR PRIMARY LANGUAGE.
No offense.
But when we come on here and ask you very clearly.
"DOES YOUR FIX WORK WITH DX12?"
I mean, is this confusing to you?
Now I have to go and un-F U C K my game. Thanks for wasting my time. [/quote]
Hmm, nice guy, so rational and polite[/quote]
When in doubt sling ad hominem attacks!
The following is irrefutable:
SLI is trash
Anyone buying Turing is a retard.
D-Man 11 go fuck yourself.
Rusty go fuck yourself.
I'm done. Don't expect any kindness going forward. I have a great memory, you two are on the shit list.
Yeah, so polite. What about the post is alarming? One follows the instructions for a fix and it fucks their game; hey man, I'm entitled to be upset by that.
RAGEdemon said:xXxStarManxXx, for what it's worth, I do appreciate a lot of your anti-SLi sentiment.
I just want to point out that what you might see as absolute truths, especially from youtubers, are not. One has to remember that their primary purpose is infotainment, not really great journalism, nor a great deal of understanding.
The majority, if not all, of them do many stupid things, such as benchmark CPUs at high resolutions instead of the lowest possible resolution thereby benchmarking the GPU instead of the CPU; or benchmark memory in GPU limited scenarios... laughable really...
And of course, benchmarking GPUs in CPU limited scenarios...
1. The majority of SLi benchmarks which show lacklustre performance (non-SLI compatible games excepted), show bad scaling not because of bad SLi, but because the CPU has become the bottleneck - a single 1080 Ti is simply too powerful to show the muscle of 2 1080 Ti in SLi. Yes, even at 4K, and especially in 2D.
For a better comparison, one has to look at either:
a. 8K+ 1080 Ti SLI benchmarks
or
b. a lower tier graphics card, say a 970 SLi setup running games at 4k.
Luckily, techpowerup has a combined graph of ~17 games showing this (graph not reproduced here).
2. SLi requires double the pixels, so playing at 1440 3DV is actually almost the same as playing 4k in 2D:
2560x1440 x2 (3DV) = 7,372,800 pixels.
3840x2160 (4K) = 8,294,400 pixels.
Therefore, 1440 3DV with SLi will show great scaling on powerful GPUs, literally equivalent to the scaling shown at double the resolution, i.e. in this case, 4K.
This means that comparing 2D performance and mocking 1440p scaling, especially with a fake performance cap which is CPU limited, as in the case of 2x 1080Ti in SLi even at 4k, isn't a fair, nor a correct comparison to make.
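For what it's worth, the pixel arithmetic in point 2 checks out. A minimal sketch (the `pixels` helper is a throwaway illustration, not from any library; the resolutions are the standard 1440p and 4K figures):

```python
# 3D Vision renders the scene once per eye, so 1440p in stereo pushes
# nearly as many pixels per frame as 4K rendered once.

def pixels(width, height, eyes=1):
    """Total pixels rendered per frame; stereo 3D renders once per eye."""
    return width * height * eyes

p_1440_3dv = pixels(2560, 1440, eyes=2)  # 1440p, rendered twice for 3DV
p_4k_2d = pixels(3840, 2160)             # 4K, rendered once

print(p_1440_3dv)  # 7372800
print(p_4k_2d)     # 8294400
# 1440p 3DV is ~89% of the 4K pixel load, hence "almost the same as 4K in 2D"
print(round(p_1440_3dv / p_4k_2d, 2))  # 0.89
```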
3. I appreciate what you are saying about getting a single faster card vs 2 in SLi. Unfortunately, the reality of the matter is that most of us are already using the highest or near the highest performing cards - we simply do not have the luxury of getting a single faster card with a good return on investment, i.e. performance vs. price.
4. Since nVidia have pretty much confirmed that the 2080Ti performance shall be 1.4x 1080Ti performance (and maybe that only at best), Tothepoint2's purchase decision, betting on the ~1.7x+ performance gain with 2x 1080Ti in SLi in many supported games, is a justifiably correct choice for the same money.
All the best.
P.S. Might I humbly suggest that we all make an effort to split text up into many paragraphs, or replace them as much as we can with pictures? As they say - a picture is worth a thousand words.
Otherwise, posts become a wall of text which no-one really wants to read. Of course, I might be being a tad biased as English is my third language ¯\_(ツ)_/¯
If you actually watch the video, the framerate comparison was done at both 1440p and 4K, both of which mostly negate a CPU bottleneck, with 4K pretty much ruling one out. Tech of Tomorrow's presentation style is different, but he's been around for a long time and is trustworthy.
i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703
lol@Vulcan1978 / xXxStarManxXx
Your walls of text are out of control. Yeah, and signing up for an Nvidia account to verbally attack people is justified how?
Most people would have initially asked for help.
Here's his tirade
[url]https://forums.geforce.com/default/topic/905945/3d-vision/rise-of-the-tomb-raider-3d-vision-ready-support-/post/5113748/#5113748[/url]
Assassin's Creed: Origins, No SLI support
Doom 2016 and Doom Eternal, No SLI support
No Man's Sky, No SLI support
Batman: Arkham Knight, No SLI
Middle Earth: Shadow of War, No SLI
The Evil Within, 1 and 2, No SLI
Forza Horizon 3 and 4 and FM7, No SLI
To name only a few.
Let's hear what others have to say about the current state of SLI:
https://us.hardware.info/reviews/8113/17/crossfire-a-sli-anno-2018-is-it-still-worth-it-conclusion
https://www.youtube.com/watch?v=E5LJczxSoQU&t
https://www.youtube.com/watch?v=YcH2ri9YZO4
So your last post basically proves my point.
The post before just shows your true nature.
By the way, you have serious anger issues, I hope it doesn't impact you in the real world.
I know it's impossible for you to see it, but I'm being sincere.
[quote="D-Man11"]lol@Vulcan1978 / xXxStarManxXx
Your walls of text are out of control. Yah and your signing up for a Nvidia account to verbally attack people is justified how?
Most people would have initially asked for help.
Here's his tirade
[url]https://forums.geforce.com/default/topic/905945/3d-vision/rise-of-the-tomb-raider-3d-vision-ready-support-/post/5113748/#5113748[/url][/quote]
So what? Are you 15 or autistic? You spent the time to look at my post history to dig this up?
Hey when a mod / fix author is unresponsive or states that a fix works with DX12 etc. when it actually doesn't and I install their fix following the instructions to the T and I end up having to redownload my game, yeah you know what? I think I'm entitled to some indignation.
This is my last exchange with you buddy. From here on out you just fucking steer clear, I will not deal with you with kid gloves.
Creating account to be a troll LMFAO. No faggot cunt.
[quote="rustyk21"]So your last post basically proves my point.
The post before just shows your true nature.
By the way, you have serious anger issues, I hope it doesn't impact you in the real world.
I know it's impossible for you to see it, but I'm being sincere.[/quote]
^ See above, that applies to you shit in a bowl.
My last post here. Have fun with your delusions! Don't bother responding, because I'm not returning here.
https://strawpoll.com/7568z34g
https://www.3dmark.com/compare/fs/14520125/fs/11807761#
So I agree with you 100% that SLI often isn't there when games are released, it's definitely a major factor and it's why SLI is definitely not plug and play and is more of an enthusiasts thing. I only removed that text to try and make this post shorter!
If you expect it to work on day 1, then yes, at the moment it's often overlooked and sometimes never works.
I also agree that, for this reason among others, any single card with comparable performance will always be a better option than SLI.
BUT.. yes, you knew it was coming.
1. You basically said it doesn't work in DX12, but DX12 can scale with multiple cards. It's the scaling that's the point, really, isn't it?
2. Gears was given as an example of brilliant scaling. The fact that it was actually patched back in *could* be seen as a good thing, because it means it's still supported and there's still a demand.
3. ROTR actually scales brilliantly with multiple cards in DX12, so there is no choice to make.
4. Appreciate your experiences, but you need to understand that there are plenty of other people, especially on this forum that know exactly what they're talking about. You don't need to warn us off, we have our own experiences, do our own research and make our own conclusions.
5. As a value proposition, all of this and what you're raised is relevant to SLI/mgpu, but see point 4 and relate it to the actual topic we were originally discussing. That's my main point.
Gigabyte RTX2080TI Gaming OC, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64 | https://www.3dmark.com/fs/9529310
At the 3:27 mark: "Out of 9 games, 3 games we tested in DX12 scaled very well."
The entire video is basically an epitaph for SLI.
Deus Ex Mankind Divided:
Single 1080 Ti: 67 FPS
1080 Ti SLI: 85 FPS
That's a big fat fail. This isn't even 30% scaling (watch the video from the beginning; there are far, far worse). And these are just raw framerate differences: there's no analysis of frame timing, how G-Sync is impacted, or micro-stutter.
14 out of 25 games exhibit some scaling, but if you watch the video, with rare exceptions (Firestrike, Fallout 4, Rise of the Tomb Raider) the scaling is absolutely unacceptable. I'm talking like another 10 FPS; these probably shouldn't even be counted as scaling. Ultimately, if you step back, 11 out of 25 games have NO SCALING, and in maybe three quarters of the games that do scale, the scaling is a complete joke. And again, only raw framerates here; no talk of frame timing, micro-stutter, or the loss/degradation of G-Sync.
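As a back-of-envelope check of the Deus Ex numbers quoted above (67 FPS single card vs 85 FPS in SLI, as reported from the video; `sli_scaling` is a throwaway helper, not from any benchmarking tool):

```python
# The second card adds (85 - 67) = 18 FPS on top of one card's 67 FPS,
# i.e. ~27% of a card's worth of performance, versus the 100% a
# perfectly scaling second card would add.

def sli_scaling(fps_single, fps_sli):
    """FPS gained from the second card, as a fraction of one card's output."""
    return (fps_sli - fps_single) / fps_single

gain = sli_scaling(67, 85)
print(f"{gain:.0%}")  # 27%
```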
I'm framing this argument against someone who is contemplating buying two 1080 Tis outright for $1300+ right now as an alternative to a 2080 Ti, not necessarily someone who already has a 1080 Ti and may be contemplating adding a second one for $550 or so (I passed on one for $600, and it included an EK waterblock and backplate, some $900 of value easily).
Honestly, I wouldn't even pay $550 for this. This is unacceptable. I would wait until next year if I felt I absolutely had to upgrade.
Telling yourself that you're upgrading, paying a bunch of money towards that, only to realize that SLI is, as many have noted, in hospice, and then experiencing performance like in the video above, you're going to have extreme remorse, plus $550-700+ of sunk expenditure that could have been saved towards a faster card in 2019.
Hell, if you can wait, I'm positive the next 70-series card on 7nm will be considerably faster than the 2070, and if Nvidia gets enough pushback (see the straw poll above: 80% are not pre-ordering, meaning they are only tricking the total morons) and AMD gets somewhere with Navi as an alternative, then we may see prices come down next generation.
Don't think it will happen?
Look at how Intel responded to Ryzen.
The consumer base had had enough of Intel's bullshit and they had to get real and bring the price / performance ratio back down to sane, acceptable levels.
i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703
https://www.3dmark.com/compare/fs/14520125/fs/11807761#
I just want to point out that what you might see as absolute truths, especially from youtubers, are not. One has to remember that their primary purpose is infotainment, not really great journalism, nor a great deal of undrstanding.
The majority, if not all, of them do many stupid things, such as benchmark CPUs at high resolutions instead of the lowest possible resolution thereby benchmarking the GPU instead of the CPU; or benchmark memory in GPU limited scenarios... laughable really...
And of course, benchmarking GPUs in CPU limited scenarios...
1. The majority of SLi benchmarks which show lacklustre performance (non-SLI compatible games excepted), show bad scaling not because of bad SLi, but because the CPU has become the bottleneck - a single 1080 Ti is simply too powerful to show the muscle of 2 1080 Ti in SLi. Yes, even at 4K, and especially in 2D.
For a better comparison, one has to look at either:
a. 8K+ 1080 Ti SLI benchmarks
or
b. a lower tier graphics card, say a 970 SLi setup running games at 4k.
Luckily, techpowerup has a combined graph of ~17 games with the following representation:
2. SLi requires double the pixels, so playing at 1440 3DV is actually almost the same as playing 4k in 2D:
2560x1440 x2 (3DV) = 7,372,800 pixels.
3840x2160 (4K) = 8,294,400 pixels.
Therefore, 1440 3DV with SLi will show great scaling on powerful GPUs literally equivalent to scaling shown in double the resolution, i.e in this case, 4K.
This means that comparing 2D performance and mocking 1440p scaling, especially with a fake performance cap which is CPU limited, as in the case of 2x 1080Ti in SLi even at 4k, isn't a fair, nor a correct comparison to make.
3. I appreciate what you are saying about getting a single faster card vs 2 in SLi. Unfortunately, the reality of the matter is that most of us are already using the highest or near the highest performing cards - we simply do not have the luxury of getting a single faster card with a good return on investment, i.e. performance vs. price.
4. Since nVidia have pretty much confirmed that the 2080Ti performance shall be 1.4x 1080Ti performance (maybe even at best), Tothepoint2's purchase decision betting the ~1.7x+ performance gain with 2x 1080Ti in SLi in many supported games is a justifiably correct choice for the same money.
All the best.
P.S. Might I humbly suggest that we all make an effort to split text up into many paragraphs, or replace them as much as we can with pictures? As they say - a picture is worth a thousand words.
Otherwise, posts become a wall of text which no-one really wants to read. Of course, I might be being a tad biased as English is my third language ¯\_(ツ)_/¯
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
Evidently, you hadn't read what I'd written.
1) I'm putting together an entirely new system from scratch.
2) An i7-6850k is my processor of choice, giving a potential total of 3 GPUs access to 40 PCI-E lanes in a 16/16/8 configuration, on an Asus X-99 Deluxe II motherboard.
3) The i7-6850k is specifically designed with multiple GPU usage in mind.
4) Using the i7-6850k, SLI has been proven to gain benefits both in terms of its performance and in terms of its scaling, resulting in a minimum 10fps increase over and above a comparative SLI setup using either an i7-6700k, an i7-7700k or an i7-8700k.
5) NVidia haven't received a penny from me since 2013. Would you care to mention just how much you've spent on NVidia GPUs over the same period please?
6) I'm not here telling anybody either, how, when, or if, to spend their own money, but you are.
7) You had kept on buying GPUs in SLI configurations that were too powerful for your CPU to handle, causing you endless stuttering and micro-stuttering issues, whereas I have not. That is certainly not NVidia's fault.
8) Another case and point: Dragon Age: Inquisition (3D + SLI = No stuttering on my system. It is the only game that I currently own that is heavily GPU bound with frame rates dipping to 35fps at times. The only one. It runs slowly, but with zero stuttering. Unlike your own stuttering issues with this self same title.)
9) I'm quite sure that SLI does suck regarding multiplayer gaming, but then I don't play multi-player games.
Intel Core i7 4770k @ 4.4Ghz, 3x GTX Titan, 16GB Tactical Tracer LED, CPU/GPU Dual-Loop Water-Cooled - Driver 331.82 (Win8.0), Driver 388.71 (Win7), DX11.0
harisukro: "You sir, are 'Steely Eyed Missile Man'" (Quote from Apollo 13)
@xXxStarManxXx
I've already said I agree with some of what you say, but you seem to think that you know better than everyone else and you just don't. It's shame because quite a bit of what you've said is good and factual, but then you keep evidencing your own confirmation bias by cherry picking quotes and links.
Unfortunately it doesn't work like that. Everyone has a different perspective and each of us is a totally different use case for SLI, and indeed for the 2xxx series.
Like I alluded to earlier, I can speak for most when I say he's not telling us anything we don't already know, but because he doesn't think it's worth it he in unable to see why it's worth it for anyone else.
I could go into pages of posts and analysis of why SLI is worth it for me, but he'd just ignore it and post another poll or youtube video.
Anyway, can't wait for the reviews of the new cards!!
Gigabyte RTX2080TI Gaming OC, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64https://www.3dmark.com/fs/9529310
Vulcan1978 is the guy that claimed that helifax stated that 3Dmigoto supported dx12 and then went on to tell him that he needed to work on his English. This is in regards to Rise of the Tomb Raider
Hmm, nice guy, so rational and polite
If you actually watch the video framerate comparison was done at both 1440p and 4K, both of which mostly negate a CPU bottleneck with 4K pretty much ruling out a CPU bottleneck. Tech of Tomorrow's presentation style is different, but he's been around for a long time and is trustworthy.
I am attempting to point out that actually, spending up for another GPU, be it another 1070, 1080, 1080 Ti, whatever is actually mal-adaptive as youre better off SAVING the cost of the additional GPU UNTIL you can afford a faster single card. That can mean selling your 1070 and putting what you made from that towards a new or used 1080 Ti or saving that $500 or whatever you have on hand until you can swing a used 2080 in 2019 or, if you have enough a new GPU on 7nm node late 2019 for $700. All of which are a vastly superior decision to buying a second GPU for SLI.
The diagonosis is in.
SLI is comatose and not recovering.
You might think youre being smart and "oh hey look I have a faster 3DMark synthetic score than a GTX 2080 Ti and Fallout 4 runs great! I hope you like Fallout 4, because aside from a handful of games that exhibit good scaling, that's what youre going to be limited to playing with that second GPU.
Assassins Creed: Origins, No SLI support
Doom 2016 and Doom Eternal, No SLI support
No Man's Sky, No SLI support
Batman: Arkham Knight, No SLI
Middle Earth: Shadow of War, No SLI
The Evil Within, 1 and 2, No SLI
Forza Horizon 3 and 4 and FM7, No SLI
Only to name a few.
I was unaware that ATI 5870 X-Fire, 580M SLI, and 680M SLI were powerful graphics cards. And an i7 3920xm @ 4.4 GHz is totally a CPU bottleneck:
https://www.3dmark.com/fs/9382040
As is a 4930k @ 4.5 GHz with both cards x16/x16:
https://www.3dmark.com/fs/5333285
And youre buying all of this new in 2018? Dude, if you want x16/x16 Gigabyte is releasing a z390 SLI motherboard for Whiskey Lake / 9900k with 1st and 2nd GPU slots in x16:
https://hothardware.com/news/gigabyte-z390-gaming-sli-motherboard-9th-gen-coffee-lake-cpus
You literally can't wait a few months? The i7-6850K is immediately non-future-proof. I mean, dude, do your research before opening your wallet.
When in doubt, sling ad hominem attacks!
The following is irrefutable:
SLI is trash
Anyone buying Turing is a retard.
D-Man 11 go fuck yourself.
Rusty go fuck yourself.
I'm done. Don't expect any kindness going forward. I have a great memory, you two are on the shit list.
Yeah, so polite. What about the post is alarming? One follows the instructions for a fix and it fucks their game; hey man, I'm entitled to be upset by that.
i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703
https://www.3dmark.com/compare/fs/14520125/fs/11807761#
Your walls of text are out of control. Yeah, and your signing up for an Nvidia account to verbally attack people is justified how?
Most people would have asked for help first.
Here's his tirade:
https://forums.geforce.com/default/topic/905945/3d-vision/rise-of-the-tomb-raider-3d-vision-ready-support-/post/5113748/#5113748
Let's hear what others have to say about the current state of SLI:
https://us.hardware.info/reviews/8113/17/crossfire-a-sli-anno-2018-is-it-still-worth-it-conclusion
The post before just shows your true nature.
By the way, you have serious anger issues. I hope they don't impact you in the real world.
I know it's impossible for you to see it, but I'm being sincere.
Gigabyte RTX 2080 Ti Gaming OC, i7-6700K @ ~4.4 GHz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D Vision kit, Xpand X104 glasses, Corsair HX1000i, Win 10 Pro 64 / Win 7 64
https://www.3dmark.com/fs/9529310
So what? Are you 15, or autistic? You spent the time to dig through my post history to find this?
Hey, when a mod / fix author is unresponsive, or states that a fix works with DX12 when it actually doesn't, and I install their fix following the instructions to a T and end up having to redownload my game, yeah, you know what? I think I'm entitled to some indignation.
This is my last exchange with you, buddy. From here on out, you just fucking steer clear; I will not handle you with kid gloves.
Creating account to be a troll LMFAO. No faggot cunt.
^ See above; that applies to you, shit in a bowl.