I've not noticed stuttering with SLI in 3D.
Each card renders one eye, meaning in some circumstances you'll get a 100% performance boost over a single card. This isn't true in all cases though, especially when performance is CPU-limited.
I've noticed microstuttering sometimes, but it has usually been pretty minor. There was one game where it was pretty terrible, but I can't remember which game it was now.
[quote="Reggaeroman"]I always read you get micro stuttering with SLI. Is this not the case with 3dvision, as each card does one frame? So SLI is the perfect solution for 3dvision? Reading this thread makes me want to try 2 670 as well[/quote]
I've been gaming with SLI since the 6800GT days; micro stuttering is certainly not an inherent SLI feature.
Some games don't support SLI and some scale badly with it; in those cases, micro stuttering or even reduced performance (compared to a single card) can be expected.
Actually, there is a strong link between SLI and microstuttering. You have 2 cards rendering 2 frames, with 2 different origin times. Unless those 2 frames take *exactly* the same amount of time to render, there will be a mismatch between their origin times and their display times, and the driver doesn't always compensate for this well. The result is microstutter.
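To make that concrete, here's a toy simulation (not real driver code; the submit cadence and render times are made-up numbers) of alternate-frame rendering, where the two GPUs take turns and each displayed frame appears at submit time plus that GPU's render time:

```python
def afr_frame_intervals(pace_ms, render_ms, n_frames):
    """Toy AFR model: the CPU submits a frame every pace_ms, the two
    GPUs take alternate frames, and each frame reaches the screen at
    submit time + that GPU's render time."""
    present = [i * pace_ms + render_ms[i % 2] for i in range(n_frames)]
    # Intervals between successive displayed frames:
    return [round(b - a, 1) for a, b in zip(present, present[1:])]

# Matched render times -> perfectly even 16.7 ms intervals:
print(afr_frame_intervals(16.7, (12.0, 12.0), 6))
# A 4 ms mismatch between the cards -> the on-screen intervals oscillate
# 20.7 / 12.7 ms even though average FPS is identical. That's microstutter:
print(afr_frame_intervals(16.7, (12.0, 16.0), 6))
```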
It first came into the public eye with [url="http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking"]this landmark article[/url]. It's worth a read if you want to learn more about it, and don't mind a bit of excruciating techie detail. In particular, that article blew the lid on just how horribly AMD CrossfireX suffered from microstuttering. Nvidia cards were affected too, but to a lesser degree.
Largely as a result of that article and the shitstorm it stirred up, many hardware review sites have started measuring "frame timings", which do a better job of spotting microstuttering than measuring FPS alone. I recommend avoiding reviews that still only measure FPS, if you can help it.
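A quick sketch of why frame timings beat raw FPS, using made-up numbers: two runs can report identical average FPS while one of them stutters badly, and a percentile frame-time metric exposes the difference.

```python
def avg_fps(frame_ms):
    """Average FPS over a run, computed from per-frame times in ms."""
    return 1000.0 * len(frame_ms) / sum(frame_ms)

def p99_ms(frame_ms):
    """99th-percentile frame time -- the slow frames you actually feel."""
    return sorted(frame_ms)[max(0, int(0.99 * len(frame_ms)) - 1)]

smooth = [16.7] * 60           # hypothetical: every frame takes 16.7 ms
stutter = [8.7, 24.7] * 30     # hypothetical: alternating fast/slow frames

# Both runs average ~59.9 FPS, so an FPS-only review calls them equal;
# the 99th-percentile frame time (16.7 ms vs 24.7 ms) tells the real story.
print(avg_fps(smooth), p99_ms(smooth))
print(avg_fps(stutter), p99_ms(stutter))
```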
Apparently, Nvidia were aware of the problem for a number of years, and they eventually released a driver-level solution that mitigates SLI microstuttering to some degree. AMD have since improved CrossfireX as well, following no small outrage from AMD users.
The bottom line is that SLI microstuttering happens. But how much it happens is a foggy area. Some people are sensitive to it, some don't even notice it. Some games are prone to it, some aren't. On most charts that I've seen, SLI performance is still worse nowadays than single-GPU performance when it comes to frame-timings, but usually only by a bit.
Like others, I suspect that it may be less of a problem under 3D Vision, since the 2 frames will tend to be much more similar, so they'll be more likely to take the same amount of time to render.
I'm happy with my SLI setup, though I've definitely noticed microstuttering since making the switch from single-GPU a year ago. But like I said, in most games it's been pretty minor - and I'm usually pretty sensitive to this sort of thing. But I have two Titans which currently do pretty well with most games. I suspect that in a year or two with more demanding games, microstuttering may well increase as they start to chug and splutter more.
I had a brief look around for some graphs showing SLI frame timings for the 670, but I couldn't find any.
So here's an image of my system (Titans) in the Metro Last Light Benchmark, comparing SLI and single GPU performance. Essentially, the black area shows frame variability. A thin black line indicates that there is little variance between one frame and the next.
A fat black area indicates that frame timings are jumping up and down like crazy (those solid black masses on the graphs on the right are actually very tight *zigzags*). In some cases, one frame might take almost 3 times longer to render than the frames directly next to it. In other words: microstutter.
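That "one frame taking almost 3 times longer than the frames directly next to it" can be quantified with a trivial spike metric (the trace below is purely illustrative, not my actual benchmark data):

```python
def worst_neighbour_ratio(frame_ms):
    """Largest ratio between adjacent frame times -- a crude spike detector."""
    return max(max(cur / prev, prev / cur)
               for prev, cur in zip(frame_ms, frame_ms[1:]))

# Hypothetical trace with one spiked frame:
trace = [15.0, 16.0, 44.0, 15.5, 16.2, 15.8]
# 44.0 / 15.5 is about 2.84: that frame took nearly 3x as long as its
# neighbours, which is exactly the kind of zigzag the black masses show.
print(worst_neighbour_ratio(trace))
```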
As you can see, when I ran the game in SLI, I got much better FPS, but also WAY more microstutter. Adding a PhysX card helped, but didn't change the fact that SLI introduces microstutter.
[img]http://postachio-images.s3-website-us-east-1.amazonaws.com/700adbbdfac613bd3a6c994890af82a8/eb95c1d7c0ac450dfe4c2303b4c94f96/w600_5ce8453a8f86576a7f5a2fcde7653b31.png[/img]
Here are two images from the techreport article I linked to earlier (note that they show frame timings, not FPS).
[img]http://techreport.com/r.x/inside-the-second/bc2-gtx570.gif[/img]
[img]http://techreport.com/r.x/inside-the-second/bc2-gtx570sli.gif[/img]
That comb effect in the second one shows you SLI microstuttering very clearly. The SLI system is taking less time to produce frames overall, but there's a distinct difference between the timing of the first card and the second card.
Here's a newer pic from a guru3d review of the 780ti. Compare the relatively smooth blue line (single GPU) to the very spiky dark green line (SLI). The frame timings in SLI are actually quite acceptable, but certainly not as smooth as single GPU. So, despite Nvidia's efforts, the problem still exists.
[img]http://www.guru3d.com/index.php?ct=articles&action=file&id=7529&admin=0a8fcaad6b03da6a6895d1ada2e171002a287bc1[/img]
I'd love to see a comparison between frame timings on a 3D and non-3D system. If I get time, I might try and see if I can do those tests myself.
Very interesting results there. I couldn't help but notice the RED graph in the last pic... Man, AMD's Crossfire is very weird... In some parts it's better than SLI, but it has bigger spikes and overall a bigger delay between GPUs... Hmm. I am REALLY interested to see the above chart from Metro LL in 3D Vision, TBH :D
Very interesting results there. I couldn't but notice the RED graph in the last pic... Man, AMD's Crossfire is very weird... In some parts is better than SLI but has bigger spikes and overall bigger delay between GPUs... Hmm. I am REALLY interested to see the above chart from Metro LL in 3D Vision TBH :D
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
46" Samsung ES7500 3DTV (checkerboard, high FOV as desktop monitor, highly recommend!) - Metro 2033 3D PNG screens - Metro LL filter realism mod - Flugan's Deus Ex:HR Depth changers - Nvidia tech support online form - Nvidia support: 1-800-797-6530
epenny size =/= nerdiness
My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com
(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)