Just an update: it seems like Max Payne is behaving now. I changed the SLI rendering mode to SFR and it also runs well with adaptive vsync. It's much smoother than it was before, but it still has frame rate drops because SFR itself performs poorly.
[quote="munwaal"]
It could be, although I'm using a standard HDMI cable.[/quote]
I could be wrong, but I don't think HDMI is ideal for 3D Vision. I can (and must) use it for Tridef to work on my monitor in side-by-side mode, but I don't think native 3D Vision even works on my monitor without a DVI cable.
[quote]If I use adaptive vsync in Max Payne, the game is jerky. Looking at FRAPS, it seems the frames are all over the place between 50 and 60. This number changes with every frame, meaning it's so fast that the numbers just appear as a blur.[/quote]
That's what you want! Ideally, you want vsync to cap your framerate at 60fps (which will prevent tearing), but not interfere with it when it dips below 60fps (which would otherwise cause abrupt flatline drops to 40 or 30fps, and/or input lag).
Yes, having a variable framerate is jerky and annoying, but it's better than being artificially forced to 30fps.
Short of your system being able to always pump out a smooth minimum of 60fps (which in 3D Vision is 120fps, of course), a variable framerate is the best you're going to get. And until we have G-Sync screens, variable framerates are always going to be annoying, because they introduce duplicated and/or skipped frames (i.e. stutter).
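To make those "flatline drops" concrete, here's a rough sketch (my own illustration, nothing from this thread) of why strict double-buffered vsync snaps you down to 30fps the moment a frame takes even slightly longer than one refresh. It assumes a 60Hz display and that a missed vblank costs a whole extra refresh interval; real drivers and engines vary, and mixed frame pacing is where in-between averages like 40fps come from.
[code]
# Sketch only: effective framerate under strict double-buffered vsync.
# Assumes a 60Hz display; a frame can only be shown on a vblank, so its
# cost rounds up to a whole number of refresh intervals.
import math

REFRESH_HZ = 60.0
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.7ms per refresh

def effective_fps_with_vsync(render_time_ms: float) -> float:
    intervals = math.ceil(render_time_ms / REFRESH_MS)
    return REFRESH_HZ / intervals

for t in (15.0, 17.0, 20.0, 34.0):
    print(f"render {t:5.1f}ms -> {effective_fps_with_vsync(t):.0f}fps")
# render  15.0ms -> 60fps
# render  17.0ms -> 30fps   (only just missed a vblank, and fps halves)
# render  20.0ms -> 30fps
# render  34.0ms -> 20fps
[/code]
Adaptive vsync sidesteps exactly this: below 60fps it switches vsync off, so you get 50-ish fps with a bit of tearing instead of a hard 30.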
If you like, you can try a third-party program to cap your framerate at a number of your choosing, for example 53fps, which will minimise the variability (i.e. your framerate would jiggle slightly around 50-53 rather than jumping around between 50 and 60).
Dxtory does this well. I believe the new versions of Afterburner and EVGA Precision X might have this functionality now too.
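For what it's worth, the basic idea behind those limiters is simple: pad every frame out to a fixed budget so frametimes stop bouncing around. A minimal sketch, purely illustrative (real tools like Dxtory hook the game's Present call rather than running a loop like this):
[code]
# Minimal frame-limiter sketch: pad each frame to a fixed budget so
# frametimes cluster around the cap instead of bouncing between 50 and 60.
# Illustrative only - not how Dxtory/Afterburner are actually implemented.
import time

CAP_FPS = 53.0
FRAME_BUDGET = 1.0 / CAP_FPS  # ~18.9ms per frame

def run_capped(render_one_frame, frames=300):
    deadline = time.perf_counter()
    for _ in range(frames):
        render_one_frame()
        deadline += FRAME_BUDGET
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)   # burn off the rest of this frame's budget
        else:
            deadline = time.perf_counter()  # we're behind; don't try to catch up in a burst
[/code]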
[quote]
Now this I didn't know. I changed this setting and enabled adaptive vsync with Tomb Raider and it's smooth as silk with no tearing in sight, so thanks a lot for this! [/quote]Excellent! Yeah, it's not a very well publicised or well understood setting, that one, which is a shame because it makes a big difference (I've heard of game-breaking bugs in 3D without it set to 1). I don't fully understand its impact myself, but since vsync revolves around keeping pre-rendered frames in the buffer, it seemed relevant.
[quote]
Max Payne is a different story, though, but I'm going to keep on messing around with it. To me it feels like the 2 cards are just not in proper synchronization as far as vsync is concerned. It's just a feeling, though, no way to actually prove it.[/quote]
Have a look at this:
[img]http://postachio-images.s3-website-us-east-1.amazonaws.com/700adbbdfac613bd3a6c994890af82a8/eb95c1d7c0ac450dfe4c2303b4c94f96/5ce8453a8f86576a7f5a2fcde7653b31.png[/img]
It's from a test I did in Metro Last Light (without vsync btw). I was testing how well a PhysX card works (whole blog post [url=http://volnapc.com/how-much-difference-does-a-dedicated-physx-card-make]here[/url]), but what is particularly striking is the ridiculous variability in fps that SLI brings. Look at the top-right chart. That black area is actually a single line, jumping up and down so much (over 50 fps in a fraction of a second at times!) that it looks almost like a solid area.
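If you want to quantify that jitter on your own system rather than eyeball a graph, FRAPS can log per-frame times, and a few lines of scripting will show the spread. A rough sketch (it assumes the frametimes CSV is two columns, frame number and cumulative time in ms, and the filename is just a placeholder):
[code]
# Rough sketch: turn a FRAPS "frametimes" log into per-frame fps and show the spread.
# Assumes two columns (frame number, cumulative time in ms); adjust if your log differs.
import csv

def per_frame_fps(path):
    times = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                      # skip the header row
        for row in reader:
            times.append(float(row[1]))
    # instantaneous fps = 1000 / (gap between consecutive frames)
    return [1000.0 / (b - a) for a, b in zip(times, times[1:]) if b > a]

fps = per_frame_fps("frametimes.csv")     # placeholder filename
print(f"min {min(fps):.0f}fps  max {max(fps):.0f}fps  spread {max(fps) - min(fps):.0f}fps")
[/code]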
So yeah, your hunch about your cards not being perfectly in sync with one another is most likely spot on. We've known this about SLI for a while now, and Nvidia's solution is said to be much smoother than AMD's CrossFireX, but as you can see, the problem is still very real (I should try running MLL in SFR mode).
In the above case, adding a third card as a dedicated PhysX card really helps with the problem, though that of course only works for games that use PhysX, which I believe Max Payne 3 doesn't.
[quote="munwaal"]I'm using FRAPS to measure the FPS, and I'm using MSI Afterburner to monitor the video card usage and ram, as well as the CPU usage.
I'm overclocking both video cards (although I have tried it with both of them on stock settings) and I'm overclocking my i5 2500k to 4.5Ghz. I doubt very much that there's a bottleneck anywhere, since I can really run most games completely maxed out without really breaking a sweat. I also have 8 gigs of ram.
I'm also running it all on a 50 inch LED 3DTV. I am using the Acer HR274H EDID override driver for the TV. My method of 3D is Line Interlaced, which I also use on Tridef 3D without any issues.
I want to use 3D Vision because it can utilize SLI, whereas Tridef can not.[/quote]Thanks for the details and the screenshots. I agree that your system is just about as good as you can buy, and it should not have any problem here.
The back-to-back test of vsync on versus vsync off also demonstrates that it's definitely vsync and not some other problem, because everything runs well with vsync off.
Based on what you've said, it seems to me that the problem has something to do with the EDID override. I don't know much about that mechanism, but it's definitely in the display chain, and related to vsync, because it has to lie to the driver about the display's capabilities.
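To give a rough idea of what an override actually replaces: the EDID block is what tells the driver the display's supported modes, including the refresh rate that vsync will lock to. Here's a small illustrative sketch (byte layout per the public EDID 1.3 spec; the filename is just a placeholder) that pulls the preferred mode out of a raw EDID dump:
[code]
# Illustrative only: read the preferred mode (and the refresh rate the driver
# will believe in) from a raw 128-byte EDID dump. Layout per the EDID 1.3 spec.
def preferred_mode(edid: bytes):
    dtd = edid[54:72]                               # first detailed timing descriptor
    pixel_clock_hz = int.from_bytes(dtd[0:2], "little") * 10_000
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    h_blank  = dtd[3] | ((dtd[4] & 0x0F) << 8)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    v_blank  = dtd[6] | ((dtd[7] & 0x0F) << 8)
    refresh = pixel_clock_hz / ((h_active + h_blank) * (v_active + v_blank))
    return h_active, v_active, refresh

# e.g. with an EDID binary exported by an EDID viewer tool:
# width, height, hz = preferred_mode(open("override.bin", "rb").read())   # placeholder filename
[/code]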
It's also possible, but a lot less likely, that it's Afterburner. From what I've read, it also has the ability to play with vsync and set a maximum framerate limit. If that were set to 40, accidentally or by a bug, it would explain what you're seeing.
So based on that line of reasoning, try two things.
1) Disable Afterburner, and do your back-to-back vSync test. Do this one first, because it's easy.
2) Try disabling the EDID override and test. The goal is to get a double-vision-looking image with proper frame rates. For the purposes of testing, it doesn't matter if you cannot actually see it in 3D.
If you have a second monitor hooked up, send 3D to it, instead of the TV.
Or, try changing to CRT mode if that's possible, to see if you can get a double-vision-looking display with proper framerates.
If that doesn't seem possible, put it in Discover mode (red/cyan anaglyph), as that keeps everything else in the pipeline while using an alternate render output. That way you can disable the EDID override completely, for example, or switch to a normal monitor.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
Hey guys
Thanks for the responses.
Volnaiskra: From what I've read, adding another card to the mix would also re-enable triple buffering. Apparently it works with 3 cards, but not 2. Those graphs you posted are very interesting; it's definitely similar to what I'm experiencing. SFR definitely makes a difference, but the performance is really bad.
bo3b: I removed the EDID driver and let Windows install the standard plug and play monitor driver. I then enabled 3D in the control panel (to work correctly, it requires 3DTV Play, which I don't have and don't want to use because it's crap) and ran Tomb Raider again. The results were rather interesting. Even with 3D turned on in the control panel, but not turned on in-game (I didn't activate it with the hotkey), the frames still went down to 40, as you can see here: http://www.uploadit.org/f/10db334a234fcf0cd703f3eff6fee9b8.jpg
Could it be a driver issue?
Unfortunately I don't have another monitor to swap out and test with.
Thanks a lot guys. I really appreciate all the help. I think I'm going to stick to Tridef for Max Payne 3. I get a much smoother experience than with 3D Vision. I'm really stoked that Tomb Raider is at least working properly now. The game is amazing in 3D.