TESTED: 3D SLI vs 2D SLI performance compared
For Tomb Raider, Thief and DXHR, vsync has to be forced off in the driver.

NVIDIA TITAN X (Pascal), Intel Core i7-6900K, Win 10 Pro,
ASUS ROG Rampage V Edition 10, G.Skill RipJaws V 4x 8GB DDR4-3200 CL14-14-14-34,
ASUS ROG Swift PG258Q, ASUS ROG Swift PG278Q, Acer Predator XB280HK, BenQ W710ST

#16
Posted 05/09/2014 05:34 AM   
As far as I can tell, vsync is always on if the 3D Vision driver is enabled, in both 2D and 3D mode.

To confirm, I ran the video stress test in Counter-Strike: Source.
3D -> 60fps
2D -> 120fps
2D, 3D Vision disabled -> 570fps

All values are average fps.

An in-game renderer can report inverted frametime as fps, which doesn't account for time spent waiting for vsync.
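
To make the distinction concrete, here's a minimal sketch (the numbers are illustrative assumptions, not from any particular game):

```cpp
// Minimal sketch of "inverted frametime" vs. the rate actually displayed.
// The numbers are illustrative, not measurements.
#include <cstdio>

int main() {
    const double renderMs  = 1.75;            // engine's own render time per frame
    const double refreshMs = 1000.0 / 120.0;  // ~8.33 ms per refresh at 120 Hz

    // An overlay that just inverts the render time reports ~571 fps...
    double reportedFps = 1000.0 / renderMs;

    // ...but with vsync on, the finished frame is held until the next
    // refresh boundary, so the rate the monitor can show is capped.
    double effectiveMs  = (renderMs < refreshMs) ? refreshMs : renderMs;
    double displayedFps = 1000.0 / effectiveMs;

    printf("reported (inverted frametime): %.0f fps\n", reportedFps);   // ~571
    printf("displayed (vsync-capped):      %.0f fps\n", displayedFps);  // 120
    return 0;
}
```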

Thanks to everybody using my assembler it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?

donations: ulfjalmbrant@hotmail.com

#17
Posted 05/09/2014 08:27 AM   
I don't think that's the case. When Vsync is turned off, I can often see screen tearing in 3D, and I can also feel input lag. Neither would happen if Vsync were still on. In Thief, 3D doesn't even work unless you turn vsync on.


Besides, even if what you are saying were true, I don't see how it makes any difference other than being a technical curiosity. The whole point is to look at performance between 2D and 3D. And the only sensible way to measure that is to first test in 2D (i.e. with the 3D driver disabled), and then in 3D, is it not?

VolnaPC.com - Tips, tweaks, performance comparisons (PhysX card, SLI scaling, etc)

#18
Posted 05/09/2014 02:15 PM   
My surprise is that you can get 60+ average fps in 3D.

I wonder if you get such high values using Fraps together with the benchmark.

If a game is using 3D Vision Automatic, it is clearly limited by vsync with the 3D driver enabled.

Having a system capable of running 3D at 60+ fps doesn't make the 120Hz monitor display more than 60Hz per eye.

I was mostly thinking you are comparing apples to oranges, as you are using vsync in 3D and no vsync in 2D when comparing 3D Vision Automatic games.

I used Counter-Strike: Source as an example of how vsync clearly affects performance in 3D Vision, even though vsync was turned off in the game in all three test runs.
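
For what it's worth, the three CS:S averages above are consistent with a forced-vsync reading, assuming a 120Hz monitor; a quick back-of-the-envelope check:

```cpp
// Back-of-the-envelope check of the three CS:S averages, under the
// assumption of a 120 Hz 3D Vision monitor with vsync forced on.
#include <cstdio>

int main() {
    const double refreshHz = 120.0;
    printf("3D (two eyes share the refresh): %.0f fps\n", refreshHz / 2.0); // 60
    printf("2D, 3D driver on (vsync cap):    %.0f fps\n", refreshHz);       // 120
    printf("2D, 3D driver off:               uncapped (measured 570 fps)\n");
    return 0;
}
```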

Thanks to everybody using my assembler it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?

donations: ulfjalmbrant@hotmail.com

#19
Posted 05/09/2014 02:41 PM   
See if you can beat my CSS benchmark score significantly in 3D Vision.

Thanks to everybody using my assembler it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?

donations: ulfjalmbrant@hotmail.com

#20
Posted 05/09/2014 02:44 PM   
Sleeping Dogs - 1440p, extreme

2D SLI scaling: +93 %
3D SLI scaling: +100 %

Screenshots:
http://abload.de/image.php?img=sleepingdogs_sli_offcwxjt.png
http://abload.de/image.php?img=sleepingdogs_sli_onllb46.png
http://abload.de/image.php?img=sleepingdogs_3d_sli_oaxx09.png
http://abload.de/image.php?img=sleepingdogs_3d_sli_o7vbj8.png
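
For anyone wondering how a figure like "+93 %" is derived, it's the percentage gain of the SLI average fps over the single-card average fps. A quick sketch, with made-up inputs rather than the actual numbers behind the screenshots:

```cpp
// How a scaling figure like "+93 %" is typically derived. The fps
// inputs below are made up for illustration.
#include <cstdio>

double sliScalingPercent(double singleFps, double sliFps) {
    return (sliFps / singleFps - 1.0) * 100.0;
}

int main() {
    printf("+%.0f %%\n", sliScalingPercent(45.0, 86.85)); // prints "+93 %"
    return 0;
}
```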

NVIDIA TITAN X (Pascal), Intel Core i7-6900K, Win 10 Pro,
ASUS ROG Rampage V Edition 10, G.Skill RipJaws V 4x 8GB DDR4-3200 CL14-14-14-34,
ASUS ROG Swift PG258Q, ASUS ROG Swift PG278Q, Acer Predator XB280HK, BenQ W710ST

#21
Posted 05/09/2014 05:05 PM   
[quote="Flugan"]My suprice is that you can get 60+ average performance in 3D.[/quote] And I'm surprised that this surprises you. I've frequently seen games go above 60fps in 3Dvision. I haven't noticed any difference in this regard since I started using 3dvision a year ago. You're the first person who I've ever heard mention such a thing. Perhaps other people can chime in with their experiences. [quote]wonder if you get such high values using fraps together with the benchmark.[/quote]Yes, I always have fraps turned on, displaying the FPS on the LCD panel of my keyboard. So I'm always aware of my FPS. Yes, FRAPS was reporting very similar values to what the benchmarks were. And like I said, there's nothing surprising about them for me, because I'm accustomed to seeing FPS go above 60 when I turn vsync off in a game. [quote]Having a system capable at running 3d at 60+ hz doesn't make the 120hz monitor display at higher than 60hz per eye.[/quote] Of course not. Just like your monitor wasn't actually pumping out 570Hz in Counter strike. What the GPU renders and what the monitor displays are two different things, as I'm sure you'll agree. I'm having a hard time understanding the point you're trying to make. [quote]I was mostly thinking you are comparing apples to oranges as you are using vsync in 3D and no vsync in 2D when comparing in 3D Vision automatic games.[/quote]I think you're mistaken about 3D forcing vsync. And the evidence seems to back me up: I can get 60+fps in various games in 3D, I can clearly see screen tearing without vsync in 3D, and I can clearly see input lag with vsync in 3D. I've no idea why your experience is different to mine. Are you sure vsync is forced off in the NVidia control panel? But anyway, even if you were completely right, I still don't see your point. My aim with the blog was to test "what is the performance drop of going from 2D to 3D?" If 2D is apples and 3D is oranges, then the only possible way to make this test is to compare apples to oranges. What else would you have me do? Leave the 3D drivers on for all of the tests? That would totally defeat the purpose, since 2D gamers do not play their games with 3Dvision drivers. [quote] I used Counter-Strike Source as an example how vsync clearly affects performance in 3D Vision even though vsync is turned off in the game in all three testruns.[/quote]Sure, and on my blog I've got a number of examples that seem to totally refute that: 3D results that aren't capped at 60fps. I don't own Counter Strike, so I can't make the test. But even if I could, I don't see what good it would do. Even if I got the exact same results as you, it still wouldn't negate the results I got from Mafia II, Grid 2, and Bioshock Infinite. The only conclusion I would draw from such a result in Counter Strike is that there's something particular and uniquely weird about Counter Strike. Sorry man, I get the feeling we're maybe talking at cross purposes here, and perhaps not understanding one another. No ill feelings intended.
Flugan said: My surprise is that you can get 60+ average fps in 3D.

And I'm surprised that this surprises you. I've frequently seen games go above 60fps in 3D Vision. I haven't noticed any difference in this regard since I started using 3D Vision a year ago.

You're the first person who I've ever heard mention such a thing. Perhaps other people can chime in with their experiences.

Flugan said: I wonder if you get such high values using Fraps together with the benchmark.
Yes, I always have Fraps turned on, displaying the FPS on the LCD panel of my keyboard. So I'm always aware of my FPS. Yes, Fraps was reporting very similar values to what the benchmarks were. And like I said, there's nothing surprising about them for me, because I'm accustomed to seeing FPS go above 60 when I turn vsync off in a game.



Flugan said: Having a system capable of running 3D at 60+ fps doesn't make the 120Hz monitor display more than 60Hz per eye.

Of course not. Just like your monitor wasn't actually pumping out 570Hz in Counter-Strike. What the GPU renders and what the monitor displays are two different things, as I'm sure you'll agree. I'm having a hard time understanding the point you're trying to make.



Flugan said: I was mostly thinking you are comparing apples to oranges, as you are using vsync in 3D and no vsync in 2D when comparing 3D Vision Automatic games.
I think you're mistaken about 3D forcing vsync. And the evidence seems to back me up: I can get 60+fps in various games in 3D, I can clearly see screen tearing without vsync in 3D, and I can clearly see input lag with vsync in 3D.

I've no idea why your experience is different to mine. Are you sure vsync is forced off in the NVidia control panel?

But anyway, even if you were completely right, I still don't see your point. My aim with the blog was to test "what is the performance drop of going from 2D to 3D?" If 2D is apples and 3D is oranges, then the only possible way to make this test is to compare apples to oranges. What else would you have me do? Leave the 3D drivers on for all of the tests? That would totally defeat the purpose, since 2D gamers do not play their games with 3D Vision drivers.


Flugan said: I used Counter-Strike: Source as an example of how vsync clearly affects performance in 3D Vision, even though vsync was turned off in the game in all three test runs.
Sure, and on my blog I've got a number of examples that seem to totally refute that: 3D results that aren't capped at 60fps.

I don't own Counter-Strike, so I can't run the test. But even if I could, I don't see what good it would do.

Even if I got the exact same results as you, it still wouldn't negate the results I got from Mafia II, Grid 2, and Bioshock Infinite. The only conclusion I would draw from such a result in Counter-Strike is that there's something particular and uniquely weird about Counter-Strike.

Sorry man, I get the feeling we're maybe talking at cross purposes here, and perhaps not understanding one another. No ill feelings intended.

VolnaPC.com - Tips, tweaks, performance comparisons (PhysX card, SLI scaling, etc)

#22
Posted 05/10/2014 09:04 AM   
Flugan, I think you are under the misconception that you need "VSync" for 3D Vision.

There are actually 2 stages to VSync. An excerpt from another one of my posts should clarify why you don't need VSync and why Volnaiskra's results are perfectly valid.


============

GPU output has 2 phases. One phase outputs data from the GPU to 2 buffers. These 2 buffers output data to the screen. Whatever is in these buffers is actually being seen on the screen.

When we talk about Full sync, the gpu is in sync with the buffers, and the buffers are in sync with the screen.

In 3D Vision mode, the driver ensures that the buffers are always in sync with the screen. It does not, however, force the GPU to be in sync with the buffers (unless VSync is enabled).

This means that although the buffers are synced to the screen (each frame of the screen is receiving alternating information from the 2 buffers, for the right and left eyes), the GPU is not synced with the buffers themselves.

This means that although you are perceiving a stereo image with correct views for both the left and right eye, each eye is also perceiving tearing, caused by the GPU not being in sync with the buffers.

I hope I'm clear. This is my understanding. Perhaps someone with more insight would post more information on the subject.
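
To make the two stages concrete, here's a toy model of the idea (my own illustration, not how the driver is actually implemented): scanout stays locked to the refresh while the GPU runs free, and any buffer update that lands mid-scanout shows up as a tear.

```cpp
// Toy model of the two sync stages: scanout (stage 2) is locked to the
// 120 Hz refresh and alternates eyes on schedule; the GPU (stage 1)
// runs free. Any buffer update landing inside a scanout interval
// appears as tearing. Purely illustrative, not the driver's internals.
#include <cstdio>

int main() {
    const double refreshMs = 1000.0 / 120.0; // scanout interval, fixed
    const double renderMs  = 6.0;            // GPU frame time, unsynced

    int tornScans = 0, scans = 0;
    double nextGpuFrame = renderMs;
    for (double t = 0.0; t < 1000.0; t += refreshMs, ++scans) {
        // Count GPU frames completing inside this scanout interval.
        int updates = 0;
        while (nextGpuFrame < t + refreshMs) { nextGpuFrame += renderMs; ++updates; }
        if (updates > 0) ++tornScans; // mid-scan buffer write => visible tear
    }
    printf("%d of %d scanouts tore\n", tornScans, scans);
    return 0;
}
```

With the GPU faster than the refresh and unsynced, nearly every scanout gets a mid-scan buffer write, which matches "tearing happens pretty much every frame" with vsync off.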

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#23
Posted 05/10/2014 02:37 PM   
Vsync is always enabled with 3D on.

https://forums.geforce.com/default/topic/489962/3d-vision/3d-and-vsync/post/3511551/

andrewf@nvidia @ 2011

My point is that a benchmark, as well as Fraps, can show the frametime rather than the time between frames.

With vsync on, the maximum framerate is limited.

With vsync off, tearing happens on pretty much every frame.

Thanks to everybody using my assembler it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?

donations: ulfjalmbrant@hotmail.com

#24
Posted 05/10/2014 05:42 PM   
After further reading, it appears that it has recently become possible to run 3D Vision with vsync off.

Thanks to everybody using my assembler it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?

donations: ulfjalmbrant@hotmail.com

#25
Posted 05/10/2014 05:48 PM   
It has been stated clearly by NVIDIA that 3-way SLI with 3D Vision does not use more than 2 cards, and as far as I know they still have not posted any updated driver that fixes this. The reason you still see increased performance in some games is PhysX being offloaded onto the otherwise unused card. Otherwise there should be no difference between 2-way SLI and 3-way SLI in 3D Vision. I did extensive tests with PhysX-enabled games such as Batman and Metro, and using the 3rd card dedicated to PhysX with 2-way SLI gives better results than 3-way SLI in 3D Vision.
See this thread: https://forums.geforce.com/default/topic/502441/3d-vision/3d-vision-3-way-sli/
and the post from Andrew@nvidia:
https://forums.geforce.com/default/topic/491394/3d-vision/3d-vision-with-3way-sli-is-it-fixed-/post/3522012/#3522012

Videocards: 3xEVGA Geforce GTX660TI SC 3GB in 3way SLI; Processor: Intel Core i7 970 (6 cores) @4.2GHz (200x21 HT ON, TB OFF); Cooler: Thermalright Ultra-120 eXtreme 1366 RT Rev. C; Mainboard: EVGA X58 SLI Classified E760; Memory: 3x4GB DDR3 Mushkin Redline; Storage OS: Samsung 840 Pro SSD; Storage Games: 2xWD Velociraptor in RAID-0; Soundcard: Creative X-Fi Titanium Fatality; Speakers: Klipsch ProMedia 4.1; OS: Windows 8 Pro 64bit with Media Center

#26
Posted 05/10/2014 10:50 PM   
@RAGEdemon: Thanks for the explanation. Intuitively, it seems to make sense that it would work that way.

@Flugan: ok, I see now why you were so resolute about it, since Andrew explicitly states that vsync is forced on.

I only got 3D Vision in 2013, so perhaps the vsync lock was no longer applicable by then, which would explain why I've never noticed vsync being forced on. Out of curiosity, could you point us to the "further reading" you mentioned?

unstrain said: It has been stated clearly by NVIDIA that 3-way SLI with 3D Vision does not use more than 2 cards, and as far as I know they still have not posted any updated driver that fixes this. The reason you still see increased performance in some games is PhysX being offloaded onto the otherwise unused card. Otherwise there should be no difference between 2-way SLI and 3-way SLI in 3D Vision. I did extensive tests with PhysX-enabled games such as Batman and Metro, and using the 3rd card dedicated to PhysX with 2-way SLI gives better results than 3-way SLI in 3D Vision.
See this thread: https://forums.geforce.com/default/topic/502441/3d-vision/3d-vision-3-way-sli/
and the post from Andrew@nvidia:
https://forums.geforce.com/default/topic/491394/3d-vision/3d-vision-with-3way-sli-is-it-fixed-/post/3522012/#3522012
I'm not sure why you mention this, because I only have 2-way SLI on my system. All of the tests I did measured either single-GPU or 2-way SLI.

In the single-GPU tests, I assigned PhysX to the main GPU (rather than leaving it on "automatic") to ensure that only one GPU was doing any work (i.e. to prevent the PhysX from being offloaded to my unused Titan or my 650 Ti).

VolnaPC.com - Tips, tweaks, performance comparisons (PhysX card, SLI scaling, etc)

#27
Posted 05/10/2014 11:21 PM   
I can't find the thread off the top of my head, but it basically stated that vsync was no longer being forced, as of roughly 6 months back or so. The exact date of the change was not specified.

Thanks to everybody using my assembler it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?

donations: ulfjalmbrant@hotmail.com

#28
Posted 05/11/2014 01:48 AM   
[quote="Flugan"]I can't find the thread on the top of my head but it basically stated that vsync was no longer being forces 6 months back or similar. The exact date of the change was not specified.[/quote]Hmm. That implies that while everybody was bemoaning NVidia's total lack of interest in 3D last year, they were quietly introducing fundamental improvements to the technology. Consider my eyebrows raised.
Flugan said: I can't find the thread off the top of my head, but it basically stated that vsync was no longer being forced, as of roughly 6 months back or so. The exact date of the change was not specified.
Hmm. That implies that while everybody was bemoaning NVidia's total lack of interest in 3D last year, they were quietly introducing fundamental improvements to the technology. Consider my eyebrows raised.

VolnaPC.com - Tips, tweaks, performance comparisons (PhysX card, SLI scaling, etc)

#29
Posted 05/11/2014 06:52 AM   
Really enjoyed the blog. I have been cogitating about upgrading to SLI for absolutely ages. The conundrum for me is: once you are in SLI, you sort of have to stay SLI. This is more expensive than buying a new single card. I s'pose you could skip a generation and buy a pair of 'older' cards?

Lord, grant me the serenity to accept the things I cannot change, the courage to change the things I can, and the wisdom to know the difference.
-------------------
Vitals: Windows 7 64bit, i5 2500 @ 4.4ghz, SLI GTX670, 8GB, Viewsonic VX2268WM

Handy Driver Discussion
Helix Mod - community fixes
Bo3b's Shaderhacker School - How to fix 3D in games
3dsolutionsgaming.com - videos, reviews and 3D fixes

#30
Posted 05/11/2014 05:49 PM   