Please help me fix the 60FPS @ 120Hz issue once and for all!
helifax said:
mbloof said:It is my understanding that on 3DVision, Tridef and even with many/most applications which do their own 3D rendering, the scene is rendered ONCE and then the 3D driver moves the "camera" for the two viewpoints to create the S3D image we see. (at least in most cases the "heavy hitting" is done once and the details for each view is filled in)



Actually this is not correct.

In stereo 3D you need to set up 2 perspective projections (one for each eye) and DRAW the same scene once per eye using that eye's projection matrix = the same scene rendered 2 times from different perspectives.

Since Fraps is counting the number of draw calls it will say 120 calls per second, thus giving 120fps, which is false since 60 draw calls are for the left eye and 60 for the right eye.

Ofc NOT all things are rendered two times - shadow maps for example (according to nVidia - personally I haven't tried it yet).
Also in deferred rendering some other considerations need to be taken in regard to the G-buffer.
But most of the time you draw the same frame 2 times.
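The two-pass rendering described in the quote above can be sketched as a toy simulation (this is illustrative Python, not real driver code; the function names and the 0.065 m eye separation are my own assumptions):

```python
def eye_offsets(eye_separation=0.065):
    """Camera offsets along the view-space x axis for each eye.
    0.065 m is a typical interpupillary distance (an assumption)."""
    half = eye_separation / 2.0
    return (-half, 0.0, 0.0), (+half, 0.0, 0.0)

def render_stereo(unique_fps):
    """Simulate one second of frame-sequential stereo: every unique
    simulation frame is drawn twice, once per eye, so a counter hooked
    on buffer swaps reports double the unique frame rate."""
    left, right = eye_offsets()
    swaps = 0
    for _frame in range(unique_fps):
        for _offset in (left, right):
            # ... draw the whole scene with the camera shifted by _offset ...
            swaps += 1  # one back-buffer flip per eye view
    return swaps

print(render_stereo(60))  # 60 unique frames -> 120 presented images
```

This is the crux of the disagreement: a counter that hooks presents can read 120 while only 60 unique frames per second reach each eye.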


Actually, I've never seen "fraps" record anything more than 60FPS when 3DPlay/3Dvision is activated since the 3DVision driver actually 'renders' the two different perspectives without the application program or 'fraps' knowing about it.

Will 'fraps' display 120FPS when using a 120Hz monitor and using 3DVision? Since I never use my 120Hz monitor (don't feel like hooking it up) IDK but it appears that the basis of this thread is that fraps is reporting only 60FPS when running 3DVision while the monitor is getting 120FPS (60 unique FPS per eye).

i7-2600K-4.5Ghz/Corsair H100i/8GB/GTX780SC-SLI/Win7-64/1200W-PSU/Samsung 840-500GB SSD/Coolermaster-Tower/Benq 1080ST @ 100"

#46
Posted 08/09/2013 10:47 PM   
mbloof said:
helifax said:
mbloof said:It is my understanding that on 3DVision, Tridef and even with many/most applications which do their own 3D rendering, the scene is rendered ONCE and then the 3D driver moves the "camera" for the two viewpoints to create the S3D image we see. (at least in most cases the "heavy hitting" is done once and the details for each view is filled in)



Actually this is not correct.

In stereo 3D you need to set up 2 perspective projections (one for each eye) and DRAW the same scene once per eye using that eye's projection matrix = the same scene rendered 2 times from different perspectives.

Since Fraps is counting the number of draw calls it will say 120 calls per second, thus giving 120fps, which is false since 60 draw calls are for the left eye and 60 for the right eye.

Ofc NOT all things are rendered two times - shadow maps for example (according to nVidia - personally I haven't tried it yet).
Also in deferred rendering some other considerations need to be taken in regard to the G-buffer.
But most of the time you draw the same frame 2 times.


Actually, I've never seen "fraps" record anything more than 60FPS when 3DPlay/3Dvision is activated since the 3DVision driver actually 'renders' the two different perspectives without the application program or 'fraps' knowing about it.

Will 'fraps' display 120FPS when using a 120Hz monitor and using 3DVision? Since I never use my 120Hz monitor (don't feel like hooking it up) IDK but it appears that the basis of this thread is that fraps is reporting only 60FPS when running 3DVision while the monitor is getting 120FPS (60 unique FPS per eye).



Fraps screenshot in 3D Mode:
http://iforce.co.nz/i/n1f4ylu3.1yw.jpg

(Fraps doesn't capture pictures in 3D :( )

Fraps screenshot in 2D Mode (disabled via CTRL+T)
http://iforce.co.nz/i/2441kn0d.y0j.jpg


Behavior is like above...

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

#47
Posted 08/09/2013 10:59 PM   
Hi mbloof,

I'm afraid your understanding of the basis of this thread is incorrect.

I don't understand why you keep bringing up FRAPS when I have mentioned multiple times that not only has someone else confirmed that unique FPS have halved, but I have demonstrated it through GPU usage measurement under different scenarios in one of my earlier posts. Furthermore, you seem to be overlooking/ignoring a lot of posts in this thread.

I am trying to be engaging but unfortunately, you don't seem to be reciprocating. It is unfortunate that I will not be able to speak with you further unless something changes.

Regretfully,
-- Shahzad.

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#48
Posted 08/09/2013 11:06 PM   
RAGEdemon said:Hi fellas,

There is a lot of misinformation out there.

Opinions may differ but the truth is that there is a noticeable difference between 60FPS and 120FPS. Some people are more sensitive to it than others.

The problem:

3D vision is always locked at half the FPS for the max possible of any given display, e.g. 60 FPS with a 120Hz display.

It is true that the driver shows 60Hz to each eye, but the frame for the other eye is exactly duplicated instead of being the next logical frame.

Let me give an example:
Glasses shutter followed by frame number:
LRLRLRLRLR
1122334455
60FPS @ 120Hz

It should be:
LRLRLRLRL
123456789
Proper 120FPS @ 120Hz


Does that make sense?


It makes sense for 2D... but in stereo you need to see EXACTLY the same thing from TWO perspectives.
Your description above is not correct.

Correct is:

1(LP)1(RP) 2(LP)2(RP) 3(LP)3(RP) etc

LP = Left Perspective
RP = Right Perspective

If you DO :
1(LP)1(LP) 2(LP)2(LP) 3(LP)3(LP) => you get PLAIN 2D (no stereo)

The way OpenGL and DX works is this:

Mono: render in back buffer & swap buffers
Stereo: Render in left_back buffer & render in right_back buffer & swap buffers

The 60 FPS or 60 Hz is ARTIFICIAL. You need to understand how the hardware works:

120 fps with ONE image per refresh means 120 DRAWN images per second.
With 3D Vision and SHUTTER glasses you STILL display at 120 Hz, but the whole stereo image (left + right) is at 60 Hz, since you lose one additional refresh to show the right eye. In order to KEEP a clean image the driver forces VSYNC at 60 fps.
Since in frame-sequential mode you NEVER EVER display both the left and right eye AT the same TIME, you basically get 120 Hz / 2 eyes = 60 Hz per eye. The glasses also work at this frequency. So if you think about it you are still rendering 120 frames.

Ofc you can probably force the driver to break the 60 Hz VSYNC, but I bet you LOSE frames (meaning they are sent to the screen but not shown due to hardware limitations).

I don't think this is a result you want, no?
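The pairing helifax describes can be written out as a tiny sequence generator (a toy sketch in Python; the function name and the (frame, eye) tuple format are my own):

```python
def frame_sequential(pairs_per_second):
    """The sequence described above: 1(LP) 1(RP) 2(LP) 2(RP) ...
    Each simulation frame is presented twice, left perspective then
    right perspective, on consecutive refreshes."""
    seq = []
    for frame in range(1, pairs_per_second + 1):
        seq += [(frame, "LP"), (frame, "RP")]
    return seq

seq = frame_sequential(60)
print(len(seq))                  # 120 refreshes: the panel still flips at 120 Hz
print(len({f for f, _ in seq}))  # but only 60 unique simulation frames
```

So both camps are describing the same numbers: 120 presented images per second, 60 unique simulation frames.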


#49
Posted 08/09/2013 11:10 PM   
Hi Helifax,

Welcome to the discussion :)

helifax said:
It makes sense for 2D... but in stereo you need to see EXACTLY the same thing from TWO perspectives.
Your description above is not correct.


I'm afraid I am going to have to disagree with you there. Please refer to my experiment on the top post of page 3. When you put your 3D glasses on and move an object such as your hand across your field of view, you will notice that your hand is perfect 3D without any artifacts. Each frame your eyes are receiving is not the same for both eyes from different perspectives - they are completely different frames. Not only is it glorious 3D, but it is also 120fps.

helifax said:
Correct is:

1(LP)1(RP) 2(LP)2(RP) 3(LP)3(RP) etc

LP = Left Perspective
RP = Right Perspective

If you DO :
1(LP)1(LP) 2(LP)2(LP) 3(LP)3(LP) => you get PLAIN 2D (no stereo)


Sorry, but I do not know where you got that last line from.
What I am suggesting is in fact:

1(LP) 2(RP) 3(LP) 4(RP) 5(LP) 6(RP).

The rest of your text, I have always been in complete agreement with. :)

Yes, the FPS is 120, but the unique FPS is only 60 as frames are being duplicated from a different perspective for the other eye.

If you have progressive frames where each eye is getting a newly updated frame instead of catching up to the first eye, you will attain true 120FPS albeit each unique frame will only be shown in one eye at a time (exactly the same as the experiment in the post I refer to).

The result will be that the motion will be smoother at 120fps rather than 60fps.

What hardware limitations do you speak of? A 120Hz display is capable of displaying 120 unique frames per second even when using shutter glasses - the unique frames will just be shown to each eye but not both at the same time because a shutter will always be closed for one eye.
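The progressive scheme proposed above can be sketched the same way (again a toy generator with hypothetical names, not anything the driver exposes):

```python
def progressive_stereo(display_hz):
    """The progressive scheme proposed above: advance the simulation on
    every refresh and alternate eyes, so no refresh repeats a frame."""
    return [(frame, "LP" if frame % 2 else "RP")
            for frame in range(1, display_hz + 1)]

seq = progressive_stereo(120)
print(seq[:6])                   # [(1, 'LP'), (2, 'RP'), (3, 'LP'), (4, 'RP'), (5, 'LP'), (6, 'RP')]
print(len({f for f, _ in seq}))  # 120 unique frames in 120 refreshes
```

Compared with the duplicated 1(LP) 1(RP) 2(LP) 2(RP) sequence, each eye still sees 60 images per second, but every image comes from a newer simulation step.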


#50
Posted 08/09/2013 11:32 PM   
RAGEdemon said:
D-Man11 said:Doesn't it go 112233445566 simply because each frame has to be rendered twice. Once for each eye, so you get a 3D perspective. If each frame was different, your brain could not properly process the object from both views because the 2nd view is changed and the object is moved/changed.


No, I'm afraid that is a huge misconception, probably the same misconception that the driver writers believed.

Try it yourself. Force vSync to OFF, and check that you are getting 120fps in a game by using a utility such as FRAPS. You will notice much smoother gameplay and 3D will be perfectly fine - even more immersive due to the higher fluidity, I would say... if it wasn't for the tearing :)



Hmmm, it's NOT like SOMEONE did not suggest using fraps to confirm it for ourselves - why be so hard on mbloof?

BTW he explained to you why the GPU usage went down: the 2nd frame needs very little re-rendering, it's mostly information from the prior image.

Also, I'm not sure any of us comprehend or are "understanding", how you can have two unique/different frames produce a 3D image.

You need to come up with something better than putting my hand in front of my face with my glasses on.



I'm not saying you're wrong, but I do not think you are right.

#51
Posted 08/10/2013 12:29 AM   
D-Man11 said:

You need to come up with something better than putting my hand in front of my face with my glasses on.



Sorry mate, but if you cannot figure out why an object moving across your face with 3D shutter glasses on, with no perceived artifacts, is virtually the same as having slightly progressive non-identical frames delivered to each eye, then I'm afraid I don't have the time, nor the obligation, to nanny-feed you. There are people who get it. Perhaps you should re-read some of my lengthy past posts regarding the matter - it's all there.

It appears that the issue is hard coded into the driver by people who have the same misconception, and no hacks will be forthcoming anytime soon. My reason for starting this thread seems to have reached its end.

For what it's worth, I have proven that games are forced down to and capped at 60fps, and whether you understand it or not, 120fps is possible without artifacts, as proven by my "hand across the face" experiment which you, and probably others, don't understand. Fortunately, there are people who do. Unfortunately, I cannot make it much clearer; these posts are lengthy and time-consuming, I am having to repeat myself often, and in the end they are mostly overlooked by the people who do not understand.

The concept isn't difficult by any means. Perhaps someone who is more creative than I can explain it better.

If there are serious questions by ones who have actually read and tried to internalize some of the explanations, I would be most glad to continue a discussion; but as it currently seems to stand, this thread has come to the end of its tether.

Farewell.


#52
Posted 08/10/2013 12:55 AM   
Even if it did work, how are we going to make this happen?

#53
Posted 08/10/2013 01:03 AM   
Cookybiscuit said:Even if it did work, how are we going to make this happen?


I was hoping for some kind of driver hack which would simply turn off the 60fps cap, and allow unique frames to be rendered for the other eye.


#54
Posted 08/10/2013 01:20 AM   
Cap removed=frames discarded

#55
Posted 08/10/2013 01:26 AM   
Maybe something like GPUView, XPerf, or Visual Studio could be used to confirm the number of frames and whether they are 11223344 or 12345678.

http://graphics.stanford.edu/~mdfisher/GPUView.html

http://msdn.microsoft.com/en-us/library/windows/desktop/jj585574(v=vs.85).aspx

http://msdn.microsoft.com/en-us/library/vstudio/hh873207.aspx

http://msdn.microsoft.com/en-us/library/hh708963.aspx/css

Or what about Nvidia's PerfHUD or Nsight?

https://developer.nvidia.com/nvidia-perfhud

https://developer.nvidia.com/nsight-visual-studio-edition-videos

Note: the newest version of PerfHUD is integrated into Nsight and not available separately.
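Whichever capture tool is used, the check itself is trivial once you have a per-refresh frame identifier (a sketch; `unique_frame_ratio` is a hypothetical helper, not part of any of the tools above):

```python
def unique_frame_ratio(frame_ids):
    """Ratio of unique frame IDs to refreshes in a captured sequence.
    ~0.5 means the duplicated 1-1-2-2-3-3 pattern; ~1.0 means fully
    progressive 1-2-3-4 frames."""
    return len(set(frame_ids)) / len(frame_ids) if frame_ids else 0.0

print(unique_frame_ratio([1, 1, 2, 2, 3, 3, 4, 4]))  # 0.5 -> 60 unique FPS at 120 Hz
print(unique_frame_ratio([1, 2, 3, 4, 5, 6, 7, 8]))  # 1.0 -> true 120 FPS
```

Getting trustworthy per-refresh frame IDs is the hard part; FCAT-style colored overlay bars captured off the video output are one way to obtain them.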

#56
Posted 08/10/2013 06:07 AM   
RAGEdemon said:Hi mbloof,

I'm afraid your understanding of the basis of this thread is incorrect.

I don't understand why you keep bringing up FRAPS when I have mentioned multiple times that not only has someone else confirmed that unique FPS have halved, but I have demonstrated it through GPU usage measurement under different scenarios in one of my earlier posts. Furthermore, you seem to be overlooking/ignoring a lot of posts in this thread.

I am trying to be engaging but unfortunately, you don't seem to be reciprocating. It is unfortunate that I will not be able to to speak with you further unless something changes.

Regretfully,
-- Shahzad.



My mention of 'fraps' stands in for every other frame-rate measuring technique that uses the same technology - they can't actually measure the number of frames coming out of the jack on the back of the card.

So it seems what you're proposing is allowing the application program to "free run" without any vsync restriction? If the game was simply squirting out LRLRLRLRLR images as fast as it can, regardless of the ability of the display to keep up or stay in 'sync' with them, what do you think that'd buy you?

Better response time in FPS games?
Better picture quality?
Better S3D experience?

Sorry, but walking around with a pair of activated S3D shutter glasses on in a real 3D environment is not only silly but proves nothing. Try this instead: using two different systems/display devices, display 3D content. Now, using the glasses from the 1st system/display set, watch the content on the second system/display set. Does it look S3D? Does the background move when you rock side-to-side? How's the image quality?

After painfully watching just about every type of fake 3D method come across the big screen, and after a few years of creating my own S3D images with mono and 3D cameras, I've come to have a fairly good understanding of what makes up a good S3D image, and IMHO your idea(s) won't do anything productive.

I'll agree to disagree and bow out of the discussion.


#57
Posted 08/10/2013 06:10 AM   
Where Fraps measures in the pipeline:

http://images.anandtech.com/doci/6862/ATPipeline_575px.png

From a good article on FCAT:

http://www.anandtech.com/show/6862/fcat-the-evolution-of-frame-interval-benchmarking-part-1

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#58
Posted 08/10/2013 11:53 AM   
Now that 3D Vision is supported in OpenGL on Geforce and you can write custom renderers in DirectX that don't rely on 3D Vision Automatic you can pretty much spoonfeed the exact images you want to the screen while using shutter glasses. I'm not saying it's easy but there is nothing stopping you from creating an application that uses the 3D rendering technique you are proposing. You have to realize that your argument about moving your hand in front of shutter glasses is terrible. 3D Vision glasses especially with lightboost stay open for a significant proportion of time while the backlight is dark to make it easier to see your surroundings. You also can't compare a frozen rendered image to the fluidity of the real world.
Now that 3D Vision is supported in OpenGL on GeForce, and you can write custom renderers in DirectX that don't rely on 3D Vision Automatic, you can pretty much spoon-feed the exact images you want to the screen while using shutter glasses. I'm not saying it's easy, but there is nothing stopping you from creating an application that uses the 3D rendering technique you are proposing.

You have to realize that your argument about moving your hand in front of shutter glasses is terrible. 3D Vision glasses, especially with LightBoost, stay open for a significant proportion of the time while the backlight is dark, which makes it easier to see your surroundings.

You also can't compare a frozen rendered image to the fluidity of the real world.
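For reference, the kind of custom stereo renderer described above boils down to drawing the same scene twice per frame with a per-eye camera offset and an off-axis (asymmetric-frustum) projection. A minimal sketch of that per-eye math, with all names my own and the convergence/IPD parameters as illustrative assumptions:

```python
import math

def eye_view_offsets(ipd):
    """Lateral camera offsets for the left and right eye.
    A stereo renderer draws the scene once with each offset
    applied to the view matrix."""
    half = ipd / 2.0
    return -half, +half

def stereo_frustum(fov_y_deg, aspect, near, convergence, ipd, eye):
    """Off-axis frustum bounds at the near plane for one eye.
    eye = -1 for left, +1 for right. Both frusta are skewed so they
    overlap exactly at the `convergence` (screen) plane, which is the
    standard asymmetric-frustum stereo technique (no toe-in)."""
    top = near * math.tan(math.radians(fov_y_deg) / 2.0)
    bottom = -top
    half_w = top * aspect
    # Horizontal skew: half the eye separation, scaled to the near plane.
    shift = (ipd / 2.0) * near / convergence
    left = -half_w - eye * shift
    right = half_w - eye * shift
    return left, right, bottom, top

# Left and right eye frusta for a 60-degree, 16:9 camera converging at 2 m,
# with a 65 mm interpupillary distance (typical human average).
lf = stereo_frustum(60.0, 16 / 9, 0.1, 2.0, 0.065, -1)
rf = stereo_frustum(60.0, 16 / 9, 0.1, 2.0, 0.065, +1)
```

The two frusta come out as mirror images of each other: the left eye's left edge equals the negated right eye's right edge, and vice versa. In an actual OpenGL renderer each tuple would feed a glFrustum-style projection, with the matching camera translation from eye_view_offsets applied to the view matrix.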

Thanks to everybody using my assembler it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?

donations: ulfjalmbrant@hotmail.com

#59
Posted 08/10/2013 12:38 PM   
Flugan said:Now that 3D Vision is supported in OpenGL on Geforce and you can write custom renderers in DirectX that don't rely on 3D Vision Automatic you can pretty much spoonfeed the exact images you want to the screen while using shutter glasses. I'm not saying it's easy but there is nothing stopping you from creating an application that uses the 3D rendering technique you are proposing.

You have to realize that your argument about moving your hand in front of shutter glasses is terrible. 3D Vision glasses especially with lightboost stay open for a significant proportion of time while the backlight is dark to make it easier to see your surroundings.

You also can't compare a frozen rendered image to the fluidity of the real world.


It is the best example I could come up with given the knowledge and skills I have. Not everyone has LightBoost 2 glasses; for those people it would be quite ideal. Since my experiment is "terrible", you must have a better idea? If you can think of a better example, then please, by all means, put it forth ;-)

I also do not have the technical knowledge to write said software. Do you?
I am a hardware engineer; my software writing skills are quite limited.
Perhaps someone else can. It would certainly be a cool experiment.

I would pay a reasonable sum to have something like this developed. Is anyone interested?

helifax perhaps?

All in the name of science and progress :)

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#60
Posted 08/10/2013 01:03 PM   