NVIDIA Working on Something Big!
[quote="RAGEdemon"]As usual, misinformation on this board is rampant... Remember, nVidia is trying to hype this thing by providing limited information on purpose. Although the tech has good uses, please don't allow yourselves to be used to spread free marketing for them by propagating misinformation. [/quote]Well I am personally offended by the suggestion that I'm spreading misinformation. You cannot take the same limited information that I've seen and say that you somehow understand better than everyone else and that the rest of us are all NVidia tools. The real answer is that none of us know, we are all guessing based on limited information. Not one of us here has actually seen it in operation. If you don't mind, I'll take the word of NVidia and Anandtech over people on a forum any day. No offense to people on the forum, I get a lot of terrific insight and good suggestions. But before making any solid conclusions I really think we need to at least try it.
RAGEdemon said:As usual, misinformation on this board is rampant...

Remember, nVidia is trying to hype this thing by providing limited information on purpose. Although the tech has good uses, please don't allow yourselves to be used to spread free marketing for them by propagating misinformation.
Well I am personally offended by the suggestion that I'm spreading misinformation.

You cannot take the same limited information that I've seen and say that you somehow understand better than everyone else and that the rest of us are all NVidia tools.

The real answer is that none of us know, we are all guessing based on limited information. Not one of us here has actually seen it in operation.

If you don't mind, I'll take the word of NVidia and Anandtech over people on a forum any day.

No offense to people on the forum, I get a lot of terrific insight and good suggestions. But before making any solid conclusions I really think we need to at least try it.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

Posted 12/18/2013 11:55 PM   
Here's a review that goes into some of the technical details of G-Sync: http://www.anandtech.com/show/7582/nvidia-gsync-review. While the reviewer did most of their testing at 60Hz and a whopping 0% of it in Stereoscopic 3D, some of the details and limitations were fleshed out.

However, it should be noted that my opinion of the author may be jaded by their view that Nvidia Experience is a "step in the right direction".

i7-2600K-4.5Ghz/Corsair H100i/8GB/GTX780SC-SLI/Win7-64/1200W-PSU/Samsung 840-500GB SSD/Coolermaster-Tower/Benq 1080ST @ 100"

Posted 12/19/2013 12:20 AM   
[quote="RAGEdemon"] As long as the FPS the GPU can generate is above the VSync, those red balls in picture 3 will be almost exactly equally spaced. This is due to double (or triple) buffering combined with frame pre-rendering which is 3 by default in most games.[/quote] This pre-buffering is part of the problem. The longer a frame sits in a buffer, the more out of sync with the true animation it is. The more out of sync it is, the more jittery it will look. (unless all the other frames are equally out of sync) You absolutely can't say they will be "almost exactly equally spaced". In some cases, they might be. But in other cases, all sorts of things can happen to add irregularity to the pipeline (rapid mouse movement, explosions, cpu activity, PhysX, a buffered image gets used, SLI playing up, a buffered image gets ignored and flushed, etc.). Surely you must have noticed that old games that are a breeze for your GPU look better running at 60fps than most new games do at 60fps. There's a big difference in the pipeline between a GPU that just barely manages to produce 60fps and one that produces it easily. My diagram was purposely simplified and exxagerated, to make it easy to see the issue. But the principle of that diagram is sound, and describes a real phenomenon: unless the source rate of an animation perfectly matches its output rate, the animation will be imperfect and jittery. The images you linked to only tell half the story: the length of time it takes to process a frame. What's missing is when those frames are actually from. It matters little that each frame takes 3ms to produce if the frames aren't *sourced* exactly one second apart. Anyway, debate is good, as we're all doing our best to figure out the details of how this works. But I agree with bo3b: we can all do without the self-righteous arrogance and condescension, thanks.
RAGEdemon said:
As long as the FPS the GPU can generate is above the VSync, those red balls in picture 3 will be almost exactly equally spaced. This is due to double (or triple) buffering combined with frame pre-rendering which is 3 by default in most games.


This pre-buffering is part of the problem. The longer a frame sits in a buffer, the more out of sync with the true animation it is. The more out of sync it is, the more jittery it will look. (unless all the other frames are equally out of sync)

You absolutely can't say they will be "almost exactly equally spaced". In some cases, they might be. But in other cases, all sorts of things can happen to add irregularity to the pipeline (rapid mouse movement, explosions, cpu activity, PhysX, a buffered image gets used, SLI playing up, a buffered image gets ignored and flushed, etc.). Surely you must have noticed that old games that are a breeze for your GPU look better running at 60fps than most new games do at 60fps. There's a big difference in the pipeline between a GPU that just barely manages to produce 60fps and one that produces it easily.

My diagram was purposely simplified and exaggerated, to make it easy to see the issue. But the principle of that diagram is sound, and describes a real phenomenon: unless the source rate of an animation perfectly matches its output rate, the animation will be imperfect and jittery.

The images you linked to only tell half the story: the length of time it takes to process a frame. What's missing is when those frames are actually from. It matters little that each frame takes 3ms to produce if the frames aren't *sourced* at evenly spaced moments in time.
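
To make this concrete, here's a rough toy sketch (purely illustrative, with made-up numbers; nothing measured from a real game or GPU). Both cases below end up displayed on the same fixed 60Hz vsync grid, but the per-refresh advance of game time is only even when the frames were *sourced* at even intervals:

[code]
# Toy illustration: what the eye perceives as judder is the uneven advance of
# game time from one refresh to the next, which is set by when each frame was
# sourced, not by how long it took to render or how evenly it was scanned out.

REFRESH = 1.0 / 60.0  # fixed vsync interval, in seconds

def game_time_steps(source_times):
    """Game-time advance shown on each refresh, assuming frame i lands on refresh i."""
    return [b - a for a, b in zip(source_times, source_times[1:])]

# Case A: frames sourced at perfectly even intervals -> identical steps (smooth).
even = [i * REFRESH for i in range(8)]

# Case B: same average rate, but sourced unevenly (buffering, CPU spikes, SLI
# pacing, etc.) -> the animation advances by a different amount each refresh (judder).
uneven = [0.000, 0.013, 0.036, 0.048, 0.071, 0.082, 0.105, 0.117]

print("even  :", [round(s * 1000, 1) for s in game_time_steps(even)])    # ~16.7 ms every time
print("uneven:", [round(s * 1000, 1) for s in game_time_steps(uneven)])  # 13, 23, 12, 23... ms
[/code]

Both streams average 60fps and neither one drops a frame, yet only the first looks smooth.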

Anyway, debate is good, as we're all doing our best to figure out the details of how this works. But I agree with bo3b: we can all do without the self-righteous arrogance and condescension, thanks.

VolnaPC.com - Tips, tweaks, performance comparisons (PhysX card, SLI scaling, etc)

Posted 12/19/2013 05:44 AM   
[quote="Volnaiskra"] This pre-buffering is part of the problem. The longer a frame sits in a buffer, the more out of sync with the true animation it is. The more out of sync it is, the more jittery it will look. (unless all the other frames are equally out of sync)[/quote] GSync will use pre-buffering too. I can see your misconception. You think that gsync will allign frames to when they were meant to be displayed by the gpu somehow telling the monitor when the frame render took place in a timeline. This would be an impressive feat but I hate to dissapoint you by saying that this is not the case. The gpu will only communicate to the display after the frame has been rendered that it is ready to transmit. The vsync signal will then transmit as soon as it can after calculating how much the vsync needs to be lagged. There will always be a few frames difference between live game time and what is being shown on the screen even with gsync. [quote="Volnaiskra"]You absolutely can't say they will be "almost exactly equally spaced". In some cases, they might be. [/quote] Yes I can. Each Hz scan will be displaying a different calculated frame, no Hz scan will ever duplicate a previous frame. Absolute worst case scenario, one frame will take exactly 1/60th of a second to complete where the next frame will take an infinitely less time to reproduce, then the display would look something like a 30Hz display. Granted that this is a possibility, but is extremely unlikely (read: never going to happen in the real world). [quote="Volnaiskra"]My diagram was purposely simplified and exxagerated, to make it easy to see the issue. But the principle of that diagram is sound, and describes a real phenomenon: unless the source rate of an animation perfectly matches its output rate, the animation will be imperfect and jittery. [/quote] Your diagram wasn't exaggerating, it was lying. Redraw that animation in a worst case scenario where each red ball is in a column this time, and see how smooth the through line is. Compare it to your first image. Is there a perceivable difference? There is a reason that nvidia showed a drop in fps below vsync to highlight the advantages of gsync. If they had shown vsync vs gsync when the gpu was powerful enough to maintain vsync frame rates, only the most hawk eyed, if anybody at all would have been able to see the difference.
Volnaiskra said:

This pre-buffering is part of the problem. The longer a frame sits in a buffer, the more out of sync with the true animation it is. The more out of sync it is, the more jittery it will look. (unless all the other frames are equally out of sync)


G-Sync will use pre-buffering too. I can see your misconception. You think that G-Sync will align frames to when they were meant to be displayed, with the GPU somehow telling the monitor when in the timeline the frame render took place. That would be an impressive feat, but I hate to disappoint you by saying that this is not the case. The GPU only communicates to the display, after the frame has been rendered, that it is ready to transmit. The vsync signal then transmits as soon as it can, after calculating how much the vsync needs to be lagged. There will always be a few frames' difference between live game time and what is being shown on the screen, even with G-Sync.
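
In other words (a toy model of my reading of the handshake, not anything from an NVIDIA spec; the panel limits are assumed numbers): with a fixed refresh a finished frame waits for the next vsync boundary, whereas with G-Sync the panel scans out as soon as the GPU says the frame is ready, limited only by how fast the panel can physically refresh again:

[code]
# Toy model of the two scan-out policies (my reading of it, not an NVIDIA spec).
# Times are in milliseconds; the panel limits below are assumed values.

VSYNC = 1000.0 / 60.0      # fixed refresh interval
MIN_GAP = 1000.0 / 144.0   # assumed fastest the panel can refresh back-to-back

def vsync_display(render_done):
    """Fixed refresh: each finished frame is shown at the next free vsync boundary."""
    shown = []
    for t in render_done:
        boundary = (int(t / VSYNC) + 1) * VSYNC
        while shown and boundary <= shown[-1]:
            boundary += VSYNC        # previous frame already owns that slot
        shown.append(boundary)
    return shown

def gsync_display(render_done):
    """Variable refresh: shown as soon as the GPU signals 'ready', limited only
    by the panel's minimum refresh-to-refresh gap."""
    shown = []
    for t in render_done:
        earliest = shown[-1] + MIN_GAP if shown else t
        shown.append(max(t, earliest))
    return shown

# Irregular render-completion times (made up for illustration)
render_done = [5.0, 30.0, 38.0, 60.0, 90.0, 97.0]

print("fixed vsync:", [round(t, 1) for t in vsync_display(render_done)])
print("g-sync-ish :", [round(t, 1) for t in gsync_display(render_done)])
[/code]

Note that in both models the frame's content is from whenever the game sampled it; neither scheme stamps frames back onto a game timeline, which is exactly my point above.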

Volnaiskra said:You absolutely can't say they will be "almost exactly equally spaced". In some cases, they might be.


Yes I can. Each Hz scan will display a different calculated frame; no scan will ever duplicate a previous frame. Absolute worst-case scenario: one frame takes exactly 1/60th of a second to complete while the next frame takes almost no time at all to produce; then the display would look something like a 30Hz display. Granted, this is a possibility, but it is extremely unlikely (read: never going to happen in the real world).

Volnaiskra said:My diagram was purposely simplified and exaggerated, to make it easy to see the issue. But the principle of that diagram is sound, and describes a real phenomenon: unless the source rate of an animation perfectly matches its output rate, the animation will be imperfect and jittery.


Your diagram wasn't exaggerating, it was lying. Redraw that animation in a worst case scenario where each red ball is in a column this time, and see how smooth the through line is. Compare it to your first image. Is there a perceivable difference?

There is a reason that Nvidia showed a drop in FPS below vsync to highlight the advantages of G-Sync. If they had shown vsync vs G-Sync when the GPU was powerful enough to maintain vsync frame rates, only the most hawk-eyed, if anybody at all, would have been able to see the difference.

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

Posted 12/20/2013 01:32 AM   
[quote="RAGEdemon"][i]Once again, if you have a powerful enough GPU that is able to attain FPS at the refresh rate set, you will see no difference with GSync.[/i] [/quote]You seem to be confusing us with other people. No one here is arguing this point. The only caveat I've added is that what really matters is [i]minimum [/i]frame rate, not average. I couldn't care less that you can average 70 fps and nicely match your 60Hz monitor- if it drops to 45 during firefights, then it's a poor experience. The ONLY thing I care about is minimum frame rate. And I submit that this where GSync has the potential to help 3D gaming. In 3D, it's not 60fps that we need to sustain, we need to sustain 120fps. If we ever drop below 120fps, we are going to get stutter. In current 3D gaming, sustaining 120fps is typically not possible with high end games like Metro, Bioshock Infinite, Batman with PhysX. Volnaiskra noted earlier that even with essentially the best hardware that you can buy- SLI Titans, 4.5GHz CPU, he cannot sustain 120fps. This is the scenario where GSync has the potential to improve 3D Vision. If we drop below 120, we can delay sync and avoid the stutter during firefights. The lower limit is not presently clear because no one has so far been bothered to give any 3D results. And NVidia has not provided enough information for the 3D Vision case. Maybe it will GSync down to a minimum frame rate of 100Hz, but we cannot really say just yet. Nevertheless, depending upon details, this has real potential and is not some marketing gimmick.
RAGEdemon said:Once again, if you have a powerful enough GPU that is able to attain FPS at the refresh rate set, you will see no difference with GSync.
You seem to be confusing us with other people. No one here is arguing this point.

The only caveat I've added is that what really matters is minimum frame rate, not average. I couldn't care less that you can average 70 fps and nicely match your 60Hz monitor- if it drops to 45 during firefights, then it's a poor experience. The ONLY thing I care about is minimum frame rate.

And I submit that this is where GSync has the potential to help 3D gaming.

In 3D, it's not 60fps that we need to sustain, we need to sustain 120fps. If we ever drop below 120fps, we are going to get stutter.

In current 3D gaming, sustaining 120fps is typically not possible with high end games like Metro, Bioshock Infinite, Batman with PhysX. Volnaiskra noted earlier that even with essentially the best hardware that you can buy- SLI Titans, 4.5GHz CPU, he cannot sustain 120fps.

This is the scenario where GSync has the potential to improve 3D Vision. If we drop below 120, we can delay sync and avoid the stutter during firefights.
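
As a back-of-the-envelope illustration (my arithmetic only; it ignores the open questions of G-Sync's lower refresh limit and whether the shutter glasses could follow a variable refresh at all): at 120Hz the budget per frame is about 8.3ms, and with a fixed refresh a frame that misses its slot waits for the next boundary, so the step for that frame effectively doubles:

[code]
# Rough numbers only (my own arithmetic, not from NVIDIA): at 120 Hz the budget
# per frame is ~8.3 ms. With a fixed refresh, a frame that overruns its slot
# waits for the next boundary; with a variable refresh it would be shown as soon
# as it is done (down to whatever the hardware's lower limit turns out to be).

SLOT = 1000.0 / 120                              # 8.33 ms per refresh at 120 Hz

for render_ms in (7.0, 9.0, 12.0):               # hypothetical frame times near the budget
    slots_over = int(render_ms / SLOT)           # vsync boundaries the frame overshoots
    fixed_ms = (slots_over + 1) * SLOT           # shown at the next boundary after it finishes
    print(f"frame {render_ms:4.1f} ms -> fixed refresh shows it after {fixed_ms:5.2f} ms "
          f"(a {1000.0/fixed_ms:5.1f} Hz-sized step); variable refresh: after {render_ms:4.1f} ms")
[/code]

The point of the numbers: a frame that takes 9ms instead of 8.3ms is already a 60Hz-sized hitch on a fixed 120Hz refresh, which is exactly the stutter that delaying the sync could hide.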

The lower limit is not presently clear because no one has so far been bothered to give any 3D results. And NVidia has not provided enough information for the 3D Vision case. Maybe it will GSync down to a minimum frame rate of 100Hz, but we cannot really say just yet.

Nevertheless, depending upon details, this has real potential and is not some marketing gimmick.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

Posted 12/20/2013 11:37 AM   
As long as TVs don't support this it's worthless, total crap.
Nvidia at it again with their proprietary SHIT!!

I have been an Nvidia user for a very long time but their shite just stinks, every new thing is proprietary CRAP.

Everyone should boycott these fucking DIPSHITS, I've had enough.

AMD Mantle and their cards in every console say fuck you Nvidia, your time has come, it's over!

And these IDIOTS take our money with a smile and keep fucking us over again and again as long as we are stupid enough to keep supporting these ASSHOLES. Look at 3D Vision, they don't give a fuck, so think here people and just give Nvidia the finger and say FUCK YOU and go AMD instead.

Posted 12/21/2013 01:13 PM   
TVs can't support this, it requires special hardware.

Posted 12/21/2013 02:59 PM   
It looks like you have a problem, Mr. Loader.

Posted 12/21/2013 03:20 PM   
[quote="JnLoader"]As long as Tv's dont support this it's wortless, total crap. Nvidia at it again with their propriatary SHIT!! I have been a nvidia user very long but their shite just stinks, every new thing is propriatary CRAP. Everyone should boycot these fucking DIPSHITS, I had enough. AMD Mantle and their cards in evry console says fuck you nvidia your time has come, it's over! And these IDIOTS take our money with a smile and keep fucking us over again and again as long as we are stupid suporting these ASHOLES. Look at 3d vision they dont give a fuck so think here people and just give nvidia the finger and say FUCK YOU and go AMD insteed. [/quote]You're certainly entitled to your opinion and have made it clear numerous times that is the way you feel, then leave this Nvidia forum. Thank you and have a good day.
JnLoader said:As long as TVs don't support this it's worthless, total crap.
Nvidia at it again with their proprietary SHIT!!

I have been an Nvidia user for a very long time but their shite just stinks, every new thing is proprietary CRAP.

Everyone should boycott these fucking DIPSHITS, I've had enough.

AMD Mantle and their cards in every console say fuck you Nvidia, your time has come, it's over!

And these IDIOTS take our money with a smile and keep fucking us over again and again as long as we are stupid enough to keep supporting these ASSHOLES. Look at 3D Vision, they don't give a fuck, so think here people and just give Nvidia the finger and say FUCK YOU and go AMD instead.
You're certainly entitled to your opinion, and you've made it clear numerous times that that is the way you feel. If so, then leave this Nvidia forum. Thank you and have a good day.
Sorry double post.
[quote="JnLoader"]As long as Tv's dont support this it's wortless, total crap. Nvidia at it again with their propriatary SHIT!! I have been a nvidia user very long but their shite just stinks, every new thing is propriatary CRAP. Everyone should boycot these fucking DIPSHITS, I had enough. AMD Mantle and their cards in evry console says fuck you nvidia your time has come, it's over! And these IDIOTS take our money with a smile and keep fucking us over again and again as long as we are stupid suporting these ASHOLES. Look at 3d vision they dont give a fuck so think here people and just give nvidia the finger and say FUCK YOU and go AMD insteed. [/quote] This post is hilarious. And the capper is celebrating mantle. You think if this back to the future stuff takes off (API's were popular in the early 90s and ditched because they SUCKED for the consumer), Nvidia won't release the same stupid thing? Proprietary APIs don't benefit anyone.
JnLoader said:As long as TVs don't support this it's worthless, total crap.
Nvidia at it again with their proprietary SHIT!!

I have been an Nvidia user for a very long time but their shite just stinks, every new thing is proprietary CRAP.

Everyone should boycott these fucking DIPSHITS, I've had enough.

AMD Mantle and their cards in every console say fuck you Nvidia, your time has come, it's over!

And these IDIOTS take our money with a smile and keep fucking us over again and again as long as we are stupid enough to keep supporting these ASSHOLES. Look at 3D Vision, they don't give a fuck, so think here people and just give Nvidia the finger and say FUCK YOU and go AMD instead.


This post is hilarious.

And the capper is celebrating Mantle. Do you think that if this back-to-the-future stuff takes off (proprietary APIs were popular in the early '90s and were ditched because they SUCKED for the consumer), Nvidia won't release the same stupid thing? Proprietary APIs don't benefit anyone.

Posted 12/21/2013 04:23 PM   
[quote="bo3b"] The only caveat I've added is that what really matters is [i]minimum [/i]frame rate, not average. I couldn't care less that you can average 70 fps and nicely match your 60Hz monitor- if it drops to 45 during firefights, then it's a poor experience. The ONLY thing I care about is minimum frame rate. And I submit that this where GSync has the potential to help 3D gaming. In 3D, it's not 60fps that we need to sustain, we need to sustain 120fps. If we ever drop below 120fps, we are going to get stutter. In current 3D gaming, sustaining 120fps is typically not possible with high end games like Metro, Bioshock Infinite, Batman with PhysX. Volnaiskra noted earlier that even with essentially the best hardware that you can buy- SLI Titans, 4.5GHz CPU, he cannot sustain 120fps. This is the scenario where GSync has the potential to improve 3D Vision. If we drop below 120, we can delay sync and avoid the stutter during firefights. The lower limit is not presently clear because no one has so far been bothered to give any 3D results. And NVidia has not provided enough information for the 3D Vision case. Maybe it will GSync down to a minimum frame rate of 100Hz, but we cannot really say just yet. Nevertheless, depending upon details, this has real potential and is not some marketing gimmick.[/quote] bo3b, I see your misconception too. 3D vison doesn't render 120fps. It renders 60 for the first eye. The other 60 for the other eye are the exact same frames but from a different perspective. This costs the GPU a fraction of the power above generating only 60fps. Try monitoring 3D vision with a true 120FPS game such as borderlands 2. You will note that if 100% of the GPU is being utilised to produce 120fps (ie 3D vision disabled), then enabling 3D vision drops the GPU usage to ~60% as the FPS is now capped at 60fps, and the motion in 3D vision is noticeably choppier compared with 3D vision off. I started an entire thread regarding this issue here: [url]https://forums.geforce.com/default/topic/572033/please-help-me-fix-the-60fps-120hz-issue-once-and-for-all-/[/url] I postulated that we could achieve true 120FPS in 3D vision, by each and every frame being different in the timeline. I got an immense amount of backlash for the suggestion, but was slowly able to convince some of the more enlightened folk. Helifax was also working on an OpenGL experiment at the time. His app proved once and for all that it could be done; and the result was glorious. I tried to find a developer to make a better demo, but none could be found. I wholeheartedly agree that gsync isn't just a gimmick if a GPU cannot sustain 60fps (sustain = minimum) in Stereo3D, in which case gsync would be invaluable. I am, however, questioning the cost of such a system. I feel that the money would be better spent on a more powerful GPU / another card for SLI, unless the price for a gsync product was comparatively cheap. It is not a kill-all magic bullet. For reference, I am of the somewhat radical opinion that 60FPS (or even 120FPS for that matter) is just not enough for a smooth gaming experience. I can still see choppiness on a bright screen at even 120fps. I would love to try out a 240Hz display if the arrogant sloths in their ivory towers ever decide that people can perceive more than 120fps. There is debate amongst modern society that higher than 24fps is not even worth while which is plain ridiculous. 
These luddites have been brought up with the myth that the eye can't perceive more than 24fps; and will spew filth against anyone who would try to educate otherwise.
bo3b said:

The only caveat I've added is that what really matters is minimum frame rate, not average. I couldn't care less that you can average 70 fps and nicely match your 60Hz monitor- if it drops to 45 during firefights, then it's a poor experience. The ONLY thing I care about is minimum frame rate.

And I submit that this is where GSync has the potential to help 3D gaming.

In 3D, it's not 60fps that we need to sustain, we need to sustain 120fps. If we ever drop below 120fps, we are going to get stutter.

In current 3D gaming, sustaining 120fps is typically not possible with high end games like Metro, Bioshock Infinite, Batman with PhysX. Volnaiskra noted earlier that even with essentially the best hardware that you can buy- SLI Titans, 4.5GHz CPU, he cannot sustain 120fps.

This is the scenario where GSync has the potential to improve 3D Vision. If we drop below 120, we can delay sync and avoid the stutter during firefights.

The lower limit is not presently clear because no one has so far been bothered to give any 3D results. And NVidia has not provided enough information for the 3D Vision case. Maybe it will GSync down to a minimum frame rate of 100Hz, but we cannot really say just yet.

Nevertheless, depending upon details, this has real potential and is not some marketing gimmick.


bo3b, I see your misconception too.

3D Vision doesn't render 120fps. It renders 60 for the first eye. The other 60 for the other eye are the exact same frames but from a different perspective. This costs the GPU only a fraction more power than generating 60fps alone.

Try monitoring 3D Vision with a true 120FPS game such as Borderlands 2. You will note that if 100% of the GPU is being utilised to produce 120fps (i.e. 3D Vision disabled), then enabling 3D Vision drops the GPU usage to ~60% as the FPS is now capped at 60fps, and the motion in 3D Vision is noticeably choppier compared with 3D Vision off.
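
If anyone wants to check this for themselves, something along these lines will log the GPU load once a second while you toggle 3D Vision in-game (assuming nvidia-smi is on your PATH and your driver's build supports the --query-gpu switches; any monitoring tool such as MSI Afterburner shows the same thing):

[code]
# Quick-and-dirty GPU load logger. Assumes nvidia-smi is on PATH and that this
# driver's build supports --query-gpu; if not, use your usual monitoring tool.
import subprocess
import time

QUERY = ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"]

while True:
    util = subprocess.check_output(QUERY, text=True).strip()
    print(f"{time.strftime('%H:%M:%S')}  GPU utilisation: {util}%")
    time.sleep(1.0)
[/code]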

I started an entire thread regarding this issue here:

https://forums.geforce.com/default/topic/572033/please-help-me-fix-the-60fps-120hz-issue-once-and-for-all-/

I postulated that we could achieve true 120FPS in 3D vision, by each and every frame being different in the timeline. I got an immense amount of backlash for the suggestion, but was slowly able to convince some of the more enlightened folk. Helifax was also working on an OpenGL experiment at the time. His app proved once and for all that it could be done; and the result was glorious. I tried to find a developer to make a better demo, but none could be found.

I wholeheartedly agree that gsync isn't just a gimmick if a GPU cannot sustain 60fps (sustain = minimum) in Stereo3D, in which case gsync would be invaluable.

I am, however, questioning the cost of such a system. I feel that the money would be better spent on a more powerful GPU / another card for SLI, unless the price for a gsync product was comparatively cheap. It is not a kill-all magic bullet.

For reference, I am of the somewhat radical opinion that 60FPS (or even 120FPS, for that matter) is just not enough for a smooth gaming experience. I can still see choppiness on a bright screen even at 120fps. I would love to try out a 240Hz display, if the arrogant sloths in their ivory towers ever decide that people can perceive more than 120fps. There is still debate in modern society over whether anything higher than 24fps is even worthwhile, which is plainly ridiculous. These luddites have been brought up on the myth that the eye can't perceive more than 24fps, and will spew filth against anyone who tries to educate them otherwise.

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

Posted 12/21/2013 05:59 PM   
LOL at you DUDES!

Was just joking so take it easy.

I know neither AMD nor Nvidia gives a shit about us; same as all the others, they just want our money. And now I'll run over and hide at the AMD forum... LOL

Posted 12/21/2013 05:54 PM   
[quote="Pirateguybrush"]TVs can't support this, it requires special hardware.[/quote] Yeah mate I am fully aware of it and said in other threads how much im looking forward to it :) It sounds fantastic this G-Sync tech!
Pirateguybrush said:TVs can't support this, it requires special hardware.


Yeah mate, I am fully aware of it and have said in other threads how much I'm looking forward to it :)
It sounds fantastic, this G-Sync tech!

Posted 12/21/2013 05:56 PM   
[quote="RAGEdemon"][quote="bo3b"] The only caveat I've added is that what really matters is [i]minimum [/i]frame rate, not average. I couldn't care less that you can average 70 fps and nicely match your 60Hz monitor- if it drops to 45 during firefights, then it's a poor experience. The ONLY thing I care about is minimum frame rate. And I submit that this where GSync has the potential to help 3D gaming. In 3D, it's not 60fps that we need to sustain, we need to sustain 120fps. If we ever drop below 120fps, we are going to get stutter. In current 3D gaming, sustaining 120fps is typically not possible with high end games like Metro, Bioshock Infinite, Batman with PhysX. Volnaiskra noted earlier that even with essentially the best hardware that you can buy- SLI Titans, 4.5GHz CPU, he cannot sustain 120fps. This is the scenario where GSync has the potential to improve 3D Vision. If we drop below 120, we can delay sync and avoid the stutter during firefights. The lower limit is not presently clear because no one has so far been bothered to give any 3D results. And NVidia has not provided enough information for the 3D Vision case. Maybe it will GSync down to a minimum frame rate of 100Hz, but we cannot really say just yet. Nevertheless, depending upon details, this has real potential and is not some marketing gimmick.[/quote] bo3b, I see your misconception too. 3D vison doesn't render 120fps. It renders 60 for the first eye. The other 60 for the other eye are the exact same frames but from a different perspective. This costs the GPU a fraction of the power above generating only 60fps. Try monitoring 3D vision with a true 120FPS game such as borderlands 2. You will note that if 100% of the GPU is being utilised to produce 120fps (ie 3D vision disabled), then enabling 3D vision drops the GPU usage to ~60% as the FPS is now capped at 60fps, and the motion in 3D vision is noticeably choppier compared with 3D vision off. I started an entire thread regarding this issue here: [url]https://forums.geforce.com/default/topic/572033/please-help-me-fix-the-60fps-120hz-issue-once-and-for-all-/[/url] I postulated that we could achieve true 120FPS in 3D vision, by each and every frame being different in the timeline. I got an immense amount of backlash for the suggestion, but was slowly able to convince some of the more enlightened folk. Helifax was also working on an OpenGL experiment at the time. His app proved once and for all that it could be done; and the result was glorious. I tried to find a developer to make a better demo, but none could be found. I wholeheartedly agree that gsync isn't just a gimmick if a GPU cannot sustain 60fps (sustain = minimum) in Stereo3D, in which case gsync would be invaluable. I am, however, questioning the cost of such a system. I feel that the money would be better spent on a more powerful GPU / another card for SLI, unless the price for a gsync product was comparatively cheap. It is not a kill-all magic bullet. For reference, I am of the somewhat radical opinion that 60FPS (or even 120FPS for that matter) is just not enough for a smooth gaming experience. I can still see choppiness on a bright screen at even 120fps. I would love to try out a 240Hz display if the arrogant sloths in their ivory towers ever decide that people can perceive more than 120fps. There is debate amongst modern society that higher than 24fps is not even worth while which is plain ridiculous. 
These luddites have been brought up with the myth that the eye can't perceive more than 24fps; and will spew filth against anyone who would try to educate otherwise.[/quote] That may be true with a small percentage of games, but it's much more common for a game to require a doubling of needed power to do 3D @60fps. I'd be great if every game only had a 20 - 40 percent penalty, but that's sure not my experience. It's a rarity.
RAGEdemon said:
bo3b said:

The only caveat I've added is that what really matters is minimum frame rate, not average. I couldn't care less that you can average 70 fps and nicely match your 60Hz monitor- if it drops to 45 during firefights, then it's a poor experience. The ONLY thing I care about is minimum frame rate.

And I submit that this is where GSync has the potential to help 3D gaming.

In 3D, it's not 60fps that we need to sustain, we need to sustain 120fps. If we ever drop below 120fps, we are going to get stutter.

In current 3D gaming, sustaining 120fps is typically not possible with high end games like Metro, Bioshock Infinite, Batman with PhysX. Volnaiskra noted earlier that even with essentially the best hardware that you can buy- SLI Titans, 4.5GHz CPU, he cannot sustain 120fps.

This is the scenario where GSync has the potential to improve 3D Vision. If we drop below 120, we can delay sync and avoid the stutter during firefights.

The lower limit is not presently clear because no one has so far been bothered to give any 3D results. And NVidia has not provided enough information for the 3D Vision case. Maybe it will GSync down to a minimum frame rate of 100Hz, but we cannot really say just yet.

Nevertheless, depending upon details, this has real potential and is not some marketing gimmick.


bo3b, I see your misconception too.

3D Vision doesn't render 120fps. It renders 60 for the first eye. The other 60 for the other eye are the exact same frames but from a different perspective. This costs the GPU only a fraction more power than generating 60fps alone.

Try monitoring 3D Vision with a true 120FPS game such as Borderlands 2. You will note that if 100% of the GPU is being utilised to produce 120fps (i.e. 3D Vision disabled), then enabling 3D Vision drops the GPU usage to ~60% as the FPS is now capped at 60fps, and the motion in 3D Vision is noticeably choppier compared with 3D Vision off.

I started an entire thread regarding this issue here:

https://forums.geforce.com/default/topic/572033/please-help-me-fix-the-60fps-120hz-issue-once-and-for-all-/

I postulated that we could achieve true 120FPS in 3D vision, by each and every frame being different in the timeline. I got an immense amount of backlash for the suggestion, but was slowly able to convince some of the more enlightened folk. Helifax was also working on an OpenGL experiment at the time. His app proved once and for all that it could be done; and the result was glorious. I tried to find a developer to make a better demo, but none could be found.

I wholeheartedly agree that gsync isn't just a gimmick if a GPU cannot sustain 60fps (sustain = minimum) in Stereo3D, in which case gsync would be invaluable.

I am, however, questioning the cost of such a system. I feel that the money would be better spent on a more powerful GPU / another card for SLI, unless the price for a gsync product was comparatively cheap. It is not a kill-all magic bullet.

For reference, I am of the somewhat radical opinion that 60FPS (or even 120FPS, for that matter) is just not enough for a smooth gaming experience. I can still see choppiness on a bright screen even at 120fps. I would love to try out a 240Hz display, if the arrogant sloths in their ivory towers ever decide that people can perceive more than 120fps. There is still debate in modern society over whether anything higher than 24fps is even worthwhile, which is plainly ridiculous. These luddites have been brought up on the myth that the eye can't perceive more than 24fps, and will spew filth against anyone who tries to educate them otherwise.


That may be true for a small percentage of games, but it's much more common for a game to require double the power to do 3D @60fps. It'd be great if every game only had a 20-40 percent penalty, but that's sure not my experience. It's a rarity.

Posted 12/21/2013 06:29 PM   