If OSD says 60fps, is that 30fps per eye or 60fps per eye?
Using EVGA PrecisionX OSD in Witcher 2. I'm getting between 60fps and 90fps on the OSD. Am I really getting 30-45fps?

Details:
Playing on an S2716DG monitor (1440p, G-Sync, 144Hz) with 2x GTX 1070 in SLI.

Thanks.

#1
Posted 07/28/2016 01:45 PM   
I'm not sure about that OSD, but I'm pretty sure FRAPS is per eye. Those look to be per-eye numbers for Witcher 2 on a system like that. What does the framerate feel like to you?

#2
Posted 07/28/2016 02:11 PM   
Same goes for AB (its statistics server) and Precision: think of it as "per eye".

i5 4670K 4.4 GHz H2O, G.Skill 16GB @ 2.4 GHz C10, 2x GTX 970 G1 SLI, AOC G2460PG, G-Sync + 3D Vision 2, Win 7 x64 (SSD), games on RAID-0

#3
Posted 07/28/2016 02:43 PM   
I'll try FRAPS and compare the two.

#4
Posted 07/28/2016 02:43 PM   
Just tested. Same numbers in FRAPS. Feels like less, but maybe that's the strobing effect of the glasses or the fact that G-Sync is disabled when 3D Vision is enabled.

#5
Posted 07/28/2016 02:48 PM   
There will always be some micro stuttering with active 3D, even at 120 fps.
Try disabling SLI to see if that makes a difference. Do you have multiple monitors? Set the second one to be the primary one in the NVIDIA Control Panel, then change it back - it's helped with some frame desync issues for me.

It also doesn't hurt to limit the framerate - a fluctuating framerate will always feel worse to the eye than a stable one. Limiting it will generally limit it on a per-eye basis, so a 60fps limit might be the best choice. Also, try with vsync on and off. Generally on is much better for most games, but occasionally you'll get better results with it off.

I disagree that 3D always has microstutter; that isn't the case in my experience.
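
To illustrate the earlier point about capping the framerate: here is a rough, illustrative Python sketch with made-up fps samples. A steady 60fps per eye has zero frame-time variance, while a 60-90fps swing keeps changing the cadence your eyes see, which is what tends to read as stutter.

[code]
# Rough sketch of why a stable, capped framerate feels smoother than a
# fluctuating one on a 120Hz active 3D display (illustrative numbers only).
import statistics

def frame_times_ms(fps_samples):
    """Convert instantaneous per-eye fps values to frame times in milliseconds."""
    return [1000.0 / fps for fps in fps_samples]

# Fluctuating 60-90 fps per eye vs. a hard 60 fps cap (made-up samples).
fluctuating = frame_times_ms([60, 75, 90, 65, 85, 70, 60, 90])
capped      = frame_times_ms([60] * 8)

for name, times in (("fluctuating", fluctuating), ("capped 60", capped)):
    print(f"{name:>12}: mean {statistics.mean(times):5.2f} ms, "
          f"stdev {statistics.pstdev(times):5.2f} ms")

# The capped run has zero frame-time variance: each eye gets a new image on a
# fixed ~16.7 ms cadence, which is what reads as "smooth".
[/code]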

#7
Posted 07/28/2016 05:28 PM   
Those FPS overlays are just approximations. There are articles about it: frames built, frames displayed, frames dropped, bottlenecking, etc...
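
For illustration, here is a minimal sketch of the kind of analysis those articles do. It assumes a hypothetical log file ("frametimes.txt") with one frame time in milliseconds per line; real FRAPS/FCAT/PresentMon captures use their own formats.

[code]
# Minimal sketch: why a single FPS number hides a lot. "frametimes.txt" is a
# hypothetical log with one frame time (in ms) per line.
import statistics

with open("frametimes.txt") as f:
    frame_ms = [float(line) for line in f if line.strip()]

avg_fps = 1000.0 / statistics.mean(frame_ms)
worst_1pct = sorted(frame_ms)[int(len(frame_ms) * 0.99)]  # 99th percentile frame time

print(f"average FPS:            {avg_fps:.1f}")
print(f"99th percentile frame:  {worst_1pct:.1f} ms "
      f"(equivalent to {1000.0 / worst_1pct:.1f} fps)")
# Two runs can show the same average FPS on an overlay while one has far worse
# spikes -- which is what frames built/displayed/dropped analysis is meant to expose.
[/code]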

#8
Posted 07/28/2016 06:03 PM   
https://www.youtube.com/watch?v=O951b2AQFTY
http://techreport.com/review/24553/inside-the-second-with-nvidia-frame-capture-tools
http://www.geforce.com/hardware/technology/fcat/technology
I have always wondered what the deal is with PCs and projectors.

For as long as I can remember I have gamed only on a projector, and I have never found a game that does not suffer from tearing if vsync is off. It's like rule no. 1:
Turn on vsync.
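
As a back-of-the-envelope illustration of why tearing is unavoidable with vsync off (projector or monitor alike), here is a small sketch; the refresh rate and resolution are just example numbers.

[code]
# Sketch of why tearing appears whenever vsync is off: the display scans out
# top to bottom at a fixed rate, so a buffer flip that lands mid-scanout
# splits the image at whatever line is being drawn.
REFRESH_HZ = 60
LINES = 1080                         # vertical resolution of the output
scanout_ms = 1000.0 / REFRESH_HZ     # ~16.7 ms to draw one full frame

def tear_line(flip_offset_ms):
    """Scanline where the tear appears if the GPU flips flip_offset_ms into scanout."""
    return int((flip_offset_ms % scanout_ms) / scanout_ms * LINES)

for offset in (2.0, 8.3, 14.0):
    print(f"flip {offset:4.1f} ms into scanout -> tear around line {tear_line(offset)}")

# A projector is no different from a monitor here; the huge image just makes
# the tear easier to notice.
[/code]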

Core X9, custom watercooling (Volkswagen Polo radiator)
i7-8700K @ stock
Titan X Pascal with shitty stock cooler
Win7/10
Video: Passive 3D, Full HD 3D @ 60Hz per channel, Denon X1200W, HC5 x 2, GeoBox 501 -> eeColor boxes -> polarizers/Omega filters, custom-made silver screen
Occupation: Entrepreneur. Painting/surfacing/construction.
Interests/skills:
3D gaming, 3D movies, 3D printing, drums, bass and guitar.
Suomi - FINLAND - perkele

#10
Posted 08/07/2016 11:08 AM   
Interestingly, the OSD is telling you both!

You see, the 3D Vision driver will render a frame in time for one eye, and then render the exact same frame in time for the other eye, just from a different perspective. This means that when the OSD says 60FPS and your eyes are seeing 120FPS, you are actually only perceiving 60FPS, because only 60 distinct moments in time are rendered each second!

A way to display and perceive true 120FPS would be to render every frame from a new time snapshot from a different perspective, and it wouldn't even be more demanding in performance - in theory, you could potentially get double the performance for free. I made a thread about it a long time ago:

https://forums.geforce.com/default/topic/572033/3d-vision/please-help-me-fix-the-60fps-120hz-issue-once-and-for-all-/1/

helifax made a tool which showed that it worked great! However, it has never been implemented in the 3D Vision driver as it is closed source.
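
To make the difference concrete, here is a purely illustrative Python sketch of the two render loops being compared; simulate(), render(), and present() are stand-ins, not anything from the actual driver.

[code]
# Illustrative sketch (not the actual 3D Vision driver) of the two approaches.
def simulate(t): return {"time": t}                   # advance game state to time t
def render(state, eye): return (state["time"], eye)   # build one eye's frame
def present(frame): pass                              # hand the frame to the display

def stereo_same_snapshot(seconds=1.0, snapshot_hz=60):
    """What the thread says 3D Vision does: one time sample, rendered twice."""
    t, dt = 0.0, 1.0 / snapshot_hz
    while t < seconds:
        state = simulate(t)
        present(render(state, "left"))
        present(render(state, "right"))   # same moment in time, other eye
        t += dt                           # 60 unique snapshots -> 120 displayed frames/s

def stereo_advancing_time(seconds=1.0, display_hz=120):
    """The proposed alternative: every displayed frame is a new moment in time."""
    t, dt = 0.0, 1.0 / display_hz
    eye = "left"
    while t < seconds:
        state = simulate(t)               # time advances for *each* eye's frame
        present(render(state, eye))
        eye = "right" if eye == "left" else "left"
        t += dt                           # 120 unique snapshots/s at the same per-frame render cost

stereo_same_snapshot()
stereo_advancing_time()
[/code]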

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#11
Posted 08/07/2016 03:40 PM   
[quote="RAGEdemon"] A way to display and perceive true 120FPS would be to render every frame from a new time snapshot from a different perspective, and it wouldn't even be more demanding in performance [/quote] There are many games where going from 120fps in 2D to 60fps per eye in 3D reduce CPU requirements. There are some extreme examples where you can go from 80-90fps in 2D to 60fps in 3D (Castlevania LoS 2 in big zones, for example)! With the method you said, it would probably be a bit more demanding than 120fps in 2D, at least. And there are other problems too. Remember RAGE or Wolfenstein in 3D, which is 30fps per eye but in a sequential way? When time between two eyes is big, it can be nauseating the more you move (the fixes are still good if you play slowly, or if you want to take screenshots). I'm sure that the DOOM fix is a lot better because you can get 60fps per eye, however. It would be cool if 3D Vision monitors had a 120Hz passive mode, to get 120 real fps per eye.
RAGEdemon said:
A way to display and perceive true 120FPS would be to render every frame from a new time snapshot from a different perspective, and it wouldn't even be more demanding in performance


There are many games where going from 120fps in 2D to 60fps per eye in 3D reduces CPU requirements. There are some extreme examples where you can go from 80-90fps in 2D to 60fps in 3D (Castlevania LoS 2 in big zones, for example)!

With the method you describe, it would probably be a bit more demanding than 120fps in 2D, at least.

And there are other problems too. Remember RAGE or Wolfenstein in 3D, which are 30fps per eye but rendered sequentially? When the time gap between the two eyes' frames is big, it can be nauseating the more you move (the fixes are still good if you play slowly, or if you want to take screenshots). I'm sure the DOOM fix is a lot better because you can get 60fps per eye, however.

It would be cool if 3D Vision monitors had a 120Hz passive mode, to get 120 real fps per eye.
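
For a feel of the numbers, here is a tiny sketch of the eye-to-eye time gap in each case (illustrative arithmetic only; the exact behaviour depends on the game and fix).

[code]
# Quick arithmetic on the eye-to-eye time gap for sequentially rendered stereo.
def eye_time_gap_ms(per_eye_fps, same_snapshot):
    """Time difference between the moments shown to the left and right eye."""
    if same_snapshot:
        return 0.0                      # both eyes see the same instant (standard 3D Vision)
    return 1000.0 / (2 * per_eye_fps)   # alternating eyes, each from a newer instant

print("RAGE/Wolfenstein-style, 30 fps per eye:",
      eye_time_gap_ms(30, same_snapshot=False), "ms between eyes")
print("DOOM-style, 60 fps per eye:            ",
      eye_time_gap_ms(60, same_snapshot=False), "ms between eyes")
print("Standard 3D Vision (paired eyes):      ",
      eye_time_gap_ms(60, same_snapshot=True), "ms between eyes")
# The bigger the gap, the more the two eyes disagree about where moving
# objects are, which is what makes fast motion feel nauseating.
[/code]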

CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: Gainward Phoenix 1080 GLH
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com

#12
Posted 08/07/2016 04:05 PM   
That's right!

The CPU overhead for 3D Vision isn't significant IIRC, so if you can play at 120FPS in 2D, you should be able to play at 120FPS in 3D. At low fps, however, it should be programmed to auto-lock back to half-rate (shared time snapshots per eye pair), or someone cleverer than myself can come up with a better solution.

I have not had a chance to play Doom 2016 yet, but if what you say is true, I greatly look forward to the first 60fps-per-eye-per-time-increment experience; assuming this id engine implementation has support for SLI, of course, which id engines have not had before AFAIK...
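
Something like this purely hypothetical logic is what I mean by auto-locking; nothing of the sort exists in the closed-source driver, it's just a sketch of the idea.

[code]
# Hypothetical sketch of the auto-lock idea: advance time for every eye when
# the GPU is fast enough, fall back to paired snapshots when it is not.
TARGET_DISPLAY_HZ = 120
BUDGET_MS = 1000.0 / TARGET_DISPLAY_HZ   # ~8.3 ms per displayed (per-eye) frame

def choose_stereo_mode(recent_frame_times_ms):
    """Pick a stereo pacing mode from recent per-frame render times."""
    avg = sum(recent_frame_times_ms) / len(recent_frame_times_ms)
    if avg <= BUDGET_MS:
        return "advance-time-per-eye"     # 120 unique time samples per second
    return "paired-snapshots"             # classic half-rate: 60 samples, each shown to both eyes

print(choose_stereo_mode([7.5, 8.0, 7.9]))     # fast GPU -> advance time per eye
print(choose_stereo_mode([11.0, 12.5, 10.8]))  # too slow -> lock to paired snapshots
[/code]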

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#13
Posted 08/07/2016 04:33 PM   