Lower FPS with 3D Vision Enabled: FPS drops by about 50% when enabled
I second what Helifax said.
[center]140 FPS - 2D @ 1080p
|
120 FPS - 3D @ 1080p
/ \
60 FPS + 60 FPS - per eye
[/center]
Also, since you are mostly concerned with framerate dropping, you should consider a current projector at 720p. This gives you a HUGE advantage in terms of PC requirements.
I still run only SLI 580s, because the only game where I dip below 60 fps minimum is Metro. On most games I use an FSAA-style technique to reduce jaggies, typically rendering at 1600x900 as a form of AA.
If you have not seen one in person, you should. Just like SLI, you think there will be a terrible price to pay for the lower resolution, but it's just not true. The immersion with the lower hardware requirements is a winning combo.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 - Latest 3Dmigoto Release - Bo3b's School for ShaderHackers
[quote="helifax"]
FALSE!!!! 140fps in 2D = MAX 120fps in 3D (NO MATTER what you get in 2D without VSYNC)
Which in turn is 60fps PER EYE!!!!
In 3D Vision you cannot go above 60FPS and that is per swapBuffers routine which is responsible for both framebuffers (left + right eye) thus 60 fps per eye:)
[/quote]
I guess you aren't good at math, are you? [b]140/2 = 70 FPS[/b] (3D output), which is divided again by [b]1/2[/b] for each of your eyes. [b]120 FPS[/b] is the MAX for [b]3D output from the monitor[/b], which is [b]60FPS/eye[/b], but calculated as [b]2D output that is 240FPS[/b] (120x2=240). The reason is simple - [b]for 3D[/b], your [b]GPU renders each frame twice[/b]. So good luck achieving this!
@WhiteSkyMage: Sorry man, you keep mixing up max frame rate with what each eye sees. No display anywhere does 120 fps [i]per eye[/i]. Helifax was observing that even if you run 140 Hz in 2D, 3D does not allow anything greater than 120 Hz.
Best to keep a civil tone while making your own mistakes. You have one too many divide by twos.
Take a careful look at your stick figure that I updated two posts above.
[quote="bo3b"]@WhiteSkyMage: Sorry man, you keep mixing up max frame rate with what each eye sees. No display anywhere does 120 fps [i]per eye[/i]. Helifax was observing that even if you run 140 Hz in 2D, 3D does not allow anything greater than 120 Hz.
Best to keep a civil tone while making your own mistakes. You have one too many divide by twos.[/quote]
Yes I do - I divide by 2 each time. You are right that no display can offer you 120Hz per eye, and you cannot run 3D at more than 120fps, because 120Hz screens display only 120FPS max in 3D; what you are actually getting is 60FPS per eye, not 120. However, what does that 120FPS in 3D cost in 2D terms? Let me give you a small source:
http://www.tomshardware.co.uk/forum/113838-13-nvidia-help
(read [b]Jay_83[/b])
As I said, the GPU renders each frame TWICE - so those 120FPS in 3D are actually 240FPS of 2D rendering... again, you won't reach that any time soon... By the way, you said as much in one of the posts where I was asking about 3D. This is stereoscopic 3D, dude, it's not a joke... it's simply merciless...
No, you are still confused.
Let's back up. No one anywhere in the world is gaming in 2D at 240Hz. It does not exist. You may be thinking of the fake TV stuff that talks about 240Hz, but all of that is upconverting frame-sync junk. The actual vertical blanking rate is still 60Hz on those TVs.
Let's look at your stick figure again, that I fixed:
[center]
140 FPS - 2D @ 1080p
|
120 FPS - 3D @ 1080p
/ \
60 FPS + 60 FPS - per eye[/center]
There is only ONE divide by two in there.
There is no point in rendering frames faster than the hardware can display them. There is no need to render anything at 240 fps, because there is no hardware anywhere, at any price, that can display that.
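To make the arithmetic concrete, here is a minimal sketch of that single divide-by-two (plain Python; the function name and the assumed 120Hz 3D cap are mine for illustration, not anything from the NVIDIA driver):
[code]
# One divide by two: 2D fps is capped at the panel's 3D refresh,
# then the shutter glasses alternate eyes.
def per_eye_fps(fps_2d, panel_3d_hz=120):
    fps_3d = min(fps_2d, panel_3d_hz)  # 140 fps in 2D -> 120 fps in 3D
    return fps_3d / 2                  # 120 fps in 3D -> 60 fps per eye

print(per_eye_fps(140))  # 60.0 - matches the stick figure above
[/code]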
[quote="bo3b"]No, you are still confused.
Let's back up. No one anywhere in the world is gaming in 2D at 240Hz. It does not exist. You may be thinking of the fake TV stuff that talks about 240Hz, but all of that is upconverting frame-sync junk. The actual vertical blanking rate is still 60Hz on those TVs.
Let's look at your stick figure again, that I fixed:
[center]
140 FPS - 2D @ 1080p
|
120 FPS - 3D @ 1080p
/ \
60 FPS + 60 FPS - per eye[/center]
There is only ONE divide by two in there.
There is no point in rendering frames faster than the hardware can display them. There is no need to render anything at 240 fps, because there is no hardware anywhere, at any price, that can display that. [/quote]
Yup, you are right about that:
- There is [b]no way to display 240 FPS in 2D[/b] (because the screen restricts you to a max of 144FPS if it is 144Hz)
- There is [b]no hardware[/b] that can render at such a high framerate
-> Let's leave the consideration of what each eye gets for now.
Well, that's why it's hard to get perfect 120FPS stereoscopic 3D - GPUs can't render at such high framerates. It's all about performance - you would need 4-way SLI to do that, which you can't, because 3D Vision is limited to 2-way SLI. If you really want, SLI two GTX 690s or two Titan Zs... would be a bit expensive :D
You are, however, again forgetting that the GPU needs to [b][u]render each frame twice[/u][/b] (I think I've said that a couple of times already). You said it yourself - you only [b]divide once[/b]; alright, you divide the 2D FPS output by 2 to get the 3D frame output. But to get the 2D-equivalent framerate (the [b]240FPS of GPU POWER[/b]), you do the opposite - [b]multiply[/b] 120FPS by 2. That's why the 2D-equivalent load is 240FPS, which will never be displayed; only up to your monitor's cap is shown. And you don't need the performance that perfect 3D requires just to play in 2D at 144FPS - you need less.
That's why I say that 140FPS in 2D will be 70FPS in 3D, since you divide by 2. Whether you divide another time to see the frame rate each eye sees is your decision - but 140FPS in 2D is not 120FPS of 3D output; that's completely wrong by all means... You simply need the extra performance of 240FPS in 2D to get your "perfect" stereoscopic 120FPS 3D displayed on your screen.
In this case, the figure for the [b]maximum 3D framerate[/b] (on, let's say, a VG278QE) at [b]144Hz @ 144FPS[/b] would be:
[center][b]144FPS - 2D @ 1080p[/b] (lower boundary [b]60FPS with V-Sync[/b]; [b]30FPS with G-Sync[/b])
|
1/2
|
[b]72FPS - 3D @ 1080p[/b] (lower boundary [b]60FPS[/b] with [b]V-Sync[/b])
(divided once - [b]the GPU renders each frame twice[/b])
[/center]
This figure actually shows how frame rate drops work when you've got only the Nvidia driver pushing 3D and no 3D support from the devs... Sometimes a game is well optimized for 3D; there are different solutions. If the game is really supported and rated "excellent" for 3D, then even at [b]twice[/b] the rendering [b]you will not lose 50%[/b] of your frame rate - but it will still decrease! The point I am making here is that the 50% FPS drop is so big that you will hardly manage to get around it. The performance required is simply tremendous.
That's how I judge whether to buy this generation of GPUs or cancel my build - all benchmarks are done in 2D gaming. I divide each 2D frame rate by 2 to see what framerate I will get in 3D when I switch on 3D Vision at that resolution. In some of those tests framerates go as high as 160FPS at a given resolution; keeping in mind that would be 80FPS in 3D, I would take that GPU, since it sits far above the 60FPS boundary - even in the most demanding scenes the framerate can drop without going below 60FPS, which is crucial when V-Sync is on (G-Sync would not work in 3D, and it's also less useful here - it helps more if you are playing a game at 30-60FPS). As you may know, [b]stutter[/b] starts below [b]60FPS with V-Sync[/b] in both 2D and 3D; that's why 60FPS is really the boundary for 2D AND 3D (with V-Sync), and 30FPS is the boundary for 2D gaming (with G-Sync).
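If it helps, here is that rule of thumb as a small hedged sketch (Python; the ~50% cost and the 60FPS floor are the assumptions from this thread, nothing official):
[code]
# Estimate 3D Vision fps from a 2D benchmark and check the V-Sync floor.
def estimate_3d(fps_2d_benchmark, floor=60):
    est = fps_2d_benchmark / 2        # assume the ~50% cost of 3D
    return est, est >= floor          # is there headroom above 60 fps?

print(estimate_3d(160))  # (80.0, True)  - enough margin, take the GPU
print(estimate_3d(100))  # (50.0, False) - expect stutter under V-Sync
[/code]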
Now, coming back to the frames for each eye: imagine how stuttering feels when you are gaming in 3D... a terrible experience... In this case we would need the "wasted frames" - 120FPS for perfect stereoscopic 3D - and here is where I come to the point where each eye would see 120FPS because of that... So yup, it all adds up to [b]240FPS[/b] again. :D I [b]divided by 2 twice[/b] because [u]I did not count the stutter and the "wasted" frames needed; I admit that's my mistake.[/u]
So yes, a small conclusion - if you want the most fluid 3D motion, 3D is a merciless technology... :D You will need the GPU power of 240FPS (2D frame rate) to achieve it. This is why 3D monitors are "performance hungry". I can only imagine how I would get high FPS at 1440p on the Asus Swift... I would have to cool and overclock those two cards with no mercy :D and given that I wanted to combine all the Nvidia effects together... it looks like I will not only have to switch some of them off but also sacrifice some graphics quality xD Nvidia will either have to enable more SLI support for 3D or improve driver customization for 3D games; otherwise the huge frame drops will be horrific in future games...
Here is another source:
[quote="Foulplay"]
...3D requires huge amounts of additional processing, and unless games use the Power 3D or Crysis 2 3D technique, you will lose roughly 50% performance in most games when switching on 3D because every frame has to be rendered twice. If you get the odd game that really sucks then maybe the game is to blame for bad optimisation and is probably a console port, but most games seem to have the same (roughly 50%) FPS drop in 3D.
[/quote]
[url]https://www.tridef.com/forum/viewtopic.php?f=2&t=2470&view=next[/url]
Read this; it explains it better:
http://hardforum.com/showthread.php?t=1723120
[quote="mdrejhon"]
If you are someone who WANT to get the "120 fps @ 120 Hz" fluid motion effect during nVidia 3D Vision, you need 240fps of GPU power.
This is even with VSYNC on (required for 3D).
With the nVidia 3D shutter glasses, you're sending 60fps to each eye (60/60).
(This does not apply to passive 3D, which display both left/right eye frames at the same time)
However, each eye is temporally displaced by 1/120th of a second, taking turns 1/120th of a second apart.
You are always getting a continuous 120 images per second, even if even images goes to one eye, and odd images go to the other eye.
Generating just 60fps isn't smooth enough for _active_ shutter glasses.
You really need to generate each frame 1/120th of a second apart to get fully fluid motion.
Here's why do you need 240 fps to get the full "120 fps effect" during 3D:
...
[/quote]
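Here is a tiny sketch of the timing point mdrejhon is making (my own toy model, not his code): at 120Hz with active shutters, successive display slots alternate eyes 1/120s apart, so fully fluid motion needs a fresh image in every slot, not one 60Hz frame shown to both eyes:
[code]
# 120 Hz active shutter: display slots alternate eyes, 1/120 s apart.
SLOT = 1 / 120
for i in range(6):
    eye = "left" if i % 2 == 0 else "right"
    print(f"t = {i * SLOT:.4f} s -> {eye} eye (needs a new image)")
[/code]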
My friend... you are confused... really... Read more about it first, then come here and tell me that I don't know math, since it is clearly the other way around...
First of all, there is no 3D Vision at anything higher than 120Hz, NO MATTER HOW MANY REAL HZ your monitor has (though there is at 100, 110 and 120).
Next... 120 frames that YOU RENDER in 2D (with Vblank ON) will give you 60fps in 3D.
There is no 120FPS RENDERING IN 3D!!!!!! or the stuff you are going on about there... The maximum FPS you can get in 3D Vision is 60FPS, which, like I said, is counted per swap of the buffers (which contain BOTH EYES). So 60x2 = the 120 FPS that you would normally get in 2D.
Out of curiosity, have you even tested "the thing" you are saying?? If you can give PROOF that what you are claiming is true, PLEASE DO SHARE and enlighten us all "dummies" around here :)
1x Palit RTX 2080Ti Pro Gaming OC (watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com
(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)
I know, it gets confusing as there are a few things going on that frame rate reporting tools don't understand.
There are a number of games that, WITHOUT VSYNC, will show up in many reporting tools clocking in at over 400 FPS!! Of course no monitor can display that, so what you see is the 2D refresh rate of your monitor, and the extra frames get overwritten so fast that you can't see them.
If you have a 60Hz monitor you'll see 60FPS. If you have a 144Hz monitor you'll see 144FPS. Even while the reporting tool claims that your GPU is rendering 400FPS, you're really only seeing whatever the refresh rate of your monitor is. If the FPS is high enough, you won't see all the screen tearing going on as the GPU constantly updates the screen buffer while it's drawing the image on your screen.
Most (if not all) FPS reporting tools hook onto DX driver calls BEFORE 3DVision does its "magic".
So what happens to that 400FPS when 3DVision is enabled?
The first thing is that whatever FPS reporting tool you're using is going to report 60FPS.
One of the reasons some of the hardware press websites are now prone to measuring the frames as they come out of the CONNECTOR on the back of your computer, on their way to the monitor, is that simply "hooking onto" the DX driver calls does not tell the whole story of what the GPU is doing. None of the aftermarket tools hook onto how often 3DVision creates a frame, or the timing of the GPU sending it to your monitor - they have to hook onto the "chain of software commands" too far ahead of what's actually going on with the hardware driver and GPU.
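As a toy model (my assumption about how such a hook behaves, not code from any real tool), a counter sitting on the game's Present calls sees one swap per stereo pair, so it reads half of what actually leaves the connector:
[code]
# Toy model: an overlay counting the game's Present() calls vs. the
# images leaving the port. Numbers are illustrative.
present_calls_per_sec = 60                       # what the hook counts in 3D
images_at_connector = present_calls_per_sec * 2  # each swap carries left + right
print(present_calls_per_sec, images_at_connector)  # 60 vs. 120
[/code]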
With 3DVision this (60FPS, or 60Hz) is multiplied by 2 to get 120Hz - 60Hz per eye.
Most (if not all) active shutter 3D glasses are designed to open each eye 60 times per second. Some of the modern DLP projectors have a 144Hz rating because of injected "sync" frames: 120Hz (60 per eye) + 24 sync frames per second = 144Hz. DLP shows these visual "sync" frames (optimally) while both shutters are closed and both eyes are dark.
So the math as I understand it works out to: "large 2D FPS number" + 3DVision = 60FPS x 2 = 60Hz per eye (120Hz, or 120FPS, in total).
While there is much debate on the actual "FPS cost" of each 3D solution - some estimates are as little as 15% overhead, others as high as 60-75% - the MIN-FPS number is the one you have to worry about.
Chances are that if you're able to get an AVG-FPS of 60FPS in 3D, or 120FPS or better in 2D, you'll have (IMHO) decent enough gameplay, with a few dips, stutters and dropped frames from time to time that you may or may not notice. Then again, not many "off the shelf" computers can play many (if any) modern games with a 2D AVG-FPS of 120 or better. Heck, I still can't "max out" Crysis 1 in S3D and get a playable frame rate - I have to turn some graphics settings down - and it's a fairly OLD game!
I hope this helps.
i7-2600K-4.5Ghz/Corsair H100i/8GB/GTX780SC-SLI/Win7-64/1200W-PSU/Samsung 840-500GB SSD/Coolermaster-Tower/Benq 1080ST @ 100"
Man, I don't know who's actually confused, but I did not say ANYTHING about 3D Vision at a higher frame rate than 120Hz. I think you misunderstood me there... I was talking about GPU power: for perfectly fluid 120Hz 3D Vision, you need performance that could drive the game at 240FPS in 2D, because of the GPU [b]rendering each frame twice[/b]. However, maybe that's a bit wrong as well... for some it is, and for others it's acceptable.
And as for the frame rate per eye, you see, a lot of people have different opinions. I first thought like you - that the max frame rate for each eye is 60FPS, so 60+60 is 120 in total, which is what you just said - but then I read sources like the ones above, and people say different things... I don't know who to trust, sorry for that... I am also searching for the correct answer on how it all works out.
I will accept it the way you guys say it is - 144Hz max for 2D, 120Hz max for 3D, and each eye gets 60FPS for a total of 120FPS.
Anyhow, FPS per eye doesn't really matter that much; what matters is that the 50% FPS drop which occurs in some games is simply huge, and if that continues, future 3D games will be hard to handle. Driver support and optimization are crucial for 3D Vision... Let's just, you know, wait and see what the next GPUs offer and what we can get out of them.
Thanks, mbloof, for your explanation...
[quote="WhiteSkyMage"]Man, I don't know who's actually confused, but I did not say ANYTHING about 3D vision at higher frame rate than 120Hz. I think you miss understood me there... I was talking about GPU power FPS which, for having perfect fluid motion of 120Hz 3D vision, you need to have a performance which is able to drive a game at 240FPS on 2D, because of the GPU [b]Twice frame rendering[/b]. However that's maybe a bit wrong there...[/quote]
It helps that I'm a photographer and have 3D cameras. :)
For each FRAME the game renders, the 3DVision driver will calculate 2 different viewpoints - one for each eye. Hence you don't need 240FPS to get 60FPS in 3D (120Hz, or 60FPS per eye). Granted, the higher the performance of your gaming rig, the fewer dips and stutters you'll have. If your AVG-FPS or MAX-FPS in 2D is only 60FPS (with Vsync off), you're not going to like turning on 3DVision.
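As a rough sketch of that "two viewpoints per game frame" idea (toy math with an assumed ~65mm eye separation; not the actual 3DVision driver code):
[code]
# One simulation step -> two cameras, offset along the camera's x-axis.
def stereo_eyes(cam_pos, separation=0.065):  # ~65 mm interocular, an assumption
    x, y, z = cam_pos
    h = separation / 2
    return (x - h, y, z), (x + h, y, z)      # left eye, right eye

# The same game state is drawn twice, once per viewpoint:
left, right = stereo_eyes((0.0, 1.7, 0.0))
print(left, right)
[/code]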
240FPS of 2D performance? Of what kind? Is that MIN-FPS, AVG-FPS or MAX-FPS? Generally speaking, a 120FPS average in 2D is a good enough measure most of the time for decent S3D performance. YMMV.
One thing is for sure: I started my PC 3DGaming with a single GTX460 and after 3 GPU upgrades I still want more!! (maybe twin GTX1080ti's in SLI?)
Yeah, I was talking about a max of 240FPS of 2D rendering, but the thing is, when I gamed in 3D at a friend's, the frame rate dropped 50%; luckily he had two GTX 680s and we were playing an MMO, so the stutter wasn't that horrible, but it was still noticeable.
I read that lots of people get this huge frame rate drop, and for me that means that, to make sure I cover the FPS drop and at the same time keep 120Hz (60+60FPS in 3D) with V-Sync and no stutters, I would need to aim at a big 2D frame rate - say 240 (double 120Hz). I count FPS drops here as well, because I see that in some drastic scenes the FPS can actually drop another 50%. So yeah, I was just looking for a way to "cover" the possibility of stutters as much as possible.
So I was thinking about going SLI with two GTX 860s (Maxwell), so that even if I get around 140 FPS average in 2D, I can enjoy less stuttering in 3D. But after seeing so many 3D gamers having the problem of massive FPS drops in 3D Vision, plus stuttering and tearing, I wasn't sure what to do; in the worst case, just to cover all those FPS (which are bound to be lost for some reason), I was thinking of two 880s, or two Titans, or some crazy high-end card I could SLI...
On top of that, I was (and still am) crazy about combining ALL the Nvidia technologies - stuff like TXAA, PhysX, HBAO, ShadowPlay etc. - with 3D Vision on the highest settings. Well, that's really another reason for me to aim at the craziest high-end performance I could get, even with overclocking.