[quote="b4thman"]I disagree with some things, I have an Acer H5360 projector and I hardly ever play with it, and instead I usually play with my Samsung 2233RZ monitor. Why?... the main reason is "comfortability", and I can think on other reasons that I prefer not to describe to not avoid the main point of this topic.
Anyway the purpose of this thread I think was to discuss about monitors, and I would like to have a clear idea about the real differences among the best monitors, and what is exactly the best choice today (if that is possible), in case anybody wants to buy just now.
It would be also great to have an opinion about OLED related to Nvidia 3d Vision (if that is possible), because OLED offers apparently a real difference in terms of visual quality.[/quote]
I believe that any discussion of getting a new monitor should at least consider a DLP projector. It's not the right choice for everyone, but it should be considered.
Specifically, I'm trying to answer the OP's question about G-Sync, and whether it is worth it or not. And also to avoid having to upgrade his 680 SLI: if he goes with more pixels, he has to upgrade, making it all much more expensive. If he goes with a DLP projector he can stay with cheaper hardware and still have a terrific experience.
But I do agree it's a bit off-topic.
Still, I'd be curious about your "comfortability" issue with the H5360. Maybe the contrast is too low? That's a genuine drawback of projectors.
As for OLEDs, those are pretty promising, with fast switching times. There are some pricey OLED TVs that ought to work with 3D TV Play, but I haven't heard of anyone trying it.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
[quote="Maegirom"]Some doubts I have about g-sync and tearing:
Is it supposed that tearing ONLY happens when the fps are HIGHER than the monitor's refresh rate? 'Cause if this is correct, does the problem is not solved by simply adding a frame capper to the game?
SO, if tearing only happens at HIGER framerates, and a capper is the solution, g sync has no sense, right?
AND if the gpu framerate is LOWER than the refresh rate I don't need any solution to avoid tearing neither capper, v sync or g sync, right?
Ilumine me please.[/quote]
[s]That's right, tearing only happens at higher than the refresh rate, as you draw multiple images while the screen is up.[/s] With vSync off, you get tearing. With vSync on, the tearing is solved by locking the fastest rate to the refresh. But that causes stutter if you ever slip below the refresh rate: you'll skip an entire frame, so two frames in a row will be identical - stutter.
G-Sync is more oriented around the latter, the stutter. But it also allows you to run at a higher refresh rate. So if before you could run at 125 fps with dips to 75 fps, you'd have to choose between two evils: run at 120 to mostly avoid tearing, or run at 75 with vsync to avoid stutter.
With G-Sync you don't have to choose. You set a 120 Hz refresh, and if the frame rate ever slips below that, the frame is simply delayed. So no need for vsync, and no stutter. This works down to a minimum frame rate, generally 30 fps, below which G-Sync is disabled and you get stutter if it slips that low.
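If it helps to see the arithmetic, here's a toy timing model - made-up render times, not any real graphics API - comparing a fixed 120 Hz refresh with vsync against G-Sync-style variable refresh:
[code]
# Toy timing model: fixed 120 Hz refresh with vsync vs. variable refresh.
# All render times are made up for illustration; this is not a real API.
import math

REFRESH_MS = 1000 / 120   # one scanout every ~8.33 ms

def vsync_presents(render_ms):
    # With vsync, a finished frame waits for the next scanout boundary.
    t, out = 0.0, []
    for r in render_ms:
        t += r                                      # frame finishes rendering
        t = math.ceil(t / REFRESH_MS) * REFRESH_MS  # wait for the next vblank
        out.append(t)
    return out

def gsync_presents(render_ms):
    # With variable refresh, the panel scans out when the frame is ready
    # (there is a minimum rate, ~30 fps, which this toy model ignores).
    t, out = 0.0, []
    for r in render_ms:
        t += r
        out.append(t)
    return out

# A game hovering near 120 fps that dips: 8 ms frames with some 12 ms spikes.
frames = [8, 8, 12, 8, 12, 12, 8, 8]
for name, times in (("vsync", vsync_presents(frames)),
                    ("gsync", gsync_presents(frames))):
    print(name, [round(b - a, 2) for a, b in zip(times, times[1:])])

# vsync: every 12 ms frame misses its vblank and waits for the next one, so
# the previous image stays up for 16.67 ms (shown twice): that's the stutter.
# gsync: the gaps simply track the render times: uneven, but nothing doubles.
[/code]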
More in-depth discussion:
http://www.anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness
Does that help?
Edit: Sorry, brain fade there - tearing happens when slower or faster than refresh when vsync is off. When faster, you can get multiple tears, as seen here:
http://www.anandtech.com/show/7582/nvidia-gsync-review
When slower, you can still get a single tear, which can still be bad, especially as you rotate.
Then, in a game where I know the fps won't reach the refresh rate (let's say Metro 2033, Crysis, etc.), I know I'll have no tearing, so there is no point in activating V-Sync OR G-Sync, right?
And even if my GPU can reach the refresh rate in other games, a frame capper would do the trick, wouldn't it?
I mean, if the fps doesn't go beyond the refresh rate (due to the GPU's low performance OR a frame capper), I don't need any "synchro" system (V-Sync or G-Sync), right?
Personally, I have ALWAYS played with V-Sync OFF, so I have never experienced lag or stuttering from it. AND in older games where the fps goes far beyond the monitor's refresh rate, I use frame cappers at 120. Honestly, I still don't understand when or why to use G-Sync.
Sorry - wall of text incoming :-(
[quote="Maegirom"]Then, in a game where I know that fps won't reach the refresh rate (let's say metro 2033, crysis, etc ) I know I'll have no tearing, so there is no point in activate v sync and NEITHER g-sync, right?[/quote]
I think you may have mixed up which problems are fixed by which features.
Stuttering is caused when the GPU fails to render a full image ("frame") fast enough to keep up with the refresh rate of your monitor. When this happens, the card spits out the last fully rendered image again, so you get two of the same image. This causes input lag for your mouse, and your brain sees the image hitching and stuttering - so you end up feeling the stutter in your aim, and you can see it happen. This can occur at any framerate (more often at lower framerates), and especially when your framerate is fluctuating. G-Sync cures this by altering the refresh rate of the monitor to match the output of the graphics card - and the results are pretty much perfect.
Your mouse input never suffers, you don't get stuttering, and FPS games in particular are far more enjoyable - you can be more accurate with your shots. This is because games won't track your input devices while frames are duplicated, and G-Sync stops this happening.
Tearing is caused by the FPS exceeding the refresh rate of the monitor. G-Sync avoids this because it sets the refresh rate of the screen to match the GPU output. Also, you get rid of tearing just by using any 120Hz monitor anyway - unless you can still run your games faster than 120 FPS. Tearing can be "fixed" with V-Sync or framerate capping, but that doesn't get rid of stuttering.
If the fps doesn't go beyond the refresh rate (due to the GPU's low performance OR a frame capper), you do still need a synchronisation system - because any GPU will still produce out-of-sync frames unless it can constantly supply 60 or 120 FPS for the entirety of gameplay without fail. That isn't going to happen unless you really go overboard with a huge single GPU and mostly play older games, or you turn down your graphics settings.
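To picture why a frame capper alone doesn't line you up with the screen, here's a rough toy calculation (hypothetical timings, nothing from a real driver or game):
[code]
# Toy sketch: a frame capper limits how often you flip buffers, but nothing
# ties the flip to the display's scanout. Hypothetical numbers throughout.
REFRESH_MS = 1000 / 120    # the panel scans out every ~8.333 ms
CAP_MS     = 1000 / 115    # a "115 fps" cap: ~8.696 ms between flips

for n in range(1, 9):
    flip = n * CAP_MS                 # when the capped game flips buffers
    phase = flip % REFRESH_MS         # where in the scanout that flip lands
    print(f"flip {n}: {100 * phase / REFRESH_MS:5.1f}% down the screen")

# The flip point drifts through the scanout, so with vsync off the buffer
# swap keeps landing mid-scan: a visible tear line crawling up the image.
[/code]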
You still want G-Sync because it stops the frame stuttering caused by frames being rendered twice when your card can't output them fast enough. The added smoothness of the image is great, but what can't be overstated is how much it improves the smoothness of the mouse input. It makes FPS games feel brand new, and for me it made a noticeable improvement in my online play. I'm basically a free kill, but playing with G-Sync actually enabled me to start being competitive.
You may believe you have never experienced lag or stuttering, but you have probably just got used to the way "normal" monitors work. The problem is that we all got used to lag and stuttering, and some of us are more or less sensitive to it than others.
I really started to notice it when I started using SLI a few years ago, and since using an ROG SWIFT for the last 3 months, I can't stand gaming on my girlfriend's PC because her monitor kills my eyes with its slow response time and all the stuttering - which for some reason is worse in WoW than in other games.
I'm not saying everyone *should* rush out and get G-Sync, but it is the way all monitors should work, and it's only because lazy engineers applied CRT timing principles to LCD technology that we have this problem in the first place. It's also why AMD and the other guys came up with FreeSync. Like 3D and VR, it's something you can't even start to appreciate until you have a go with it and have time to adapt - and then try going back afterwards. That makes it a hard sell for NVidia and the monitor manufacturers.
Now that I've used it, I'm not going back. I'm not hard-core enough to forgo playing 2D games, so G-Sync keeps me happy enough when 3D isn't possible.
There's a bit of a misconception here: screen tearing also occurs below a display's refresh rate.
Wolfenstein: The New Order, which uses a form of Adaptive V-Sync, is a good example of this. Console games also suffer badly from screen tearing, particularly those on the PS3 and X360, and console games are almost always far below the display's refresh rate.
I think these last 2 posts have convinced me. I think I'm going to go for the Acer XB270HA monitor. Thanks.
Btw FOULPLAY99, did you say you use SLI? Me too. How is the experience with SLI + G-Sync? Any problems?
I find G-Sync a bit over the top. I'm used to getting high frame rates anyway, and with high frame rates combined with G-Sync I sometimes find games hard to play because everything moves so quickly. Battlefield 4 is like this: I'm seeing well over 100 fps, but combined with G-Sync it's hard to play, lol. Or maybe I'm just too slow?
i7 4930K @ 4.4GHz
Asus P9X79 Pro
3-Way SLI Titan Black @ 1400MHz skyn3t VBIOS (Hardvolt Mod)
Mushkin Redline @ 2200MHz 32GB
Asus Xonar U7 Echelon Soundcard
Samsung Pro 256 GB SSD Games
Samsung Evo 256 GB SSD Windows 8.1 Pro
Samsung Evo 256 GB SSD Windows 7 Ultimate
Asus ROG Swift 1440p 144Hz G-Sync
PSU Corsair AX1500i
Astro A50 Wireless Headset
Corsair 800D Case Custom Waterloop
[quote="Maegirom"]I think these last 2 post have convinced me. I think I'm going to go to the Acer XB270ha monitor. Thanks.
Btw FOULPLAY99, Have you said you use sli? Me too. How is the experience with sli+g sync? Any problem? [/quote]
I sold my SLI 670s and my ASUS VG278H when the 980s came out, and went to the ROG SWIFT and a single 980. Probably just as well, considering the SLI issues reported with that monitor in one of the other threads.
[quote="logzz"]Gsync I find a bit over the top, im used to getting high frame rates anyway and high frame rates combined with gsync I find the game hard to play sometimes because everything moves so quick. Battlefeild 4 is like this, im seeing well over 100fps but combined with gsync its hard to play lol or maybe im just to slow?[/quote]
I found G-Sync odd to get used to at first; it almost felt too smooth, which is odd because it's supposed to be smooth. I guess that's just us getting used to the lack of stuttering.
I'm quite certain he's referring to
https://forums.geforce.com/default/topic/787889/3d-vision/3d-vision-and-sli-not-working-correctly-for-asus-rog-swift-pg278q-/
[quote="JunkieXcel"]There's a bit of a misconception here: screen tearing also occurs below a display's refresh rate.
Wolfenstein: The New Order, which uses a form of Adaptive V-Sync, is a good example of this. Console games also suffer badly from screen tearing, particularly those on the PS3 and X360, and console games are almost always far below the display's refresh rate. [/quote]
Yes, sorry, I edited my post above. Tearing definitely happens when slower than refresh as well.
If you can change game settings to sustain 120 fps as the [i]minimum[/i] frame rate, then with vsync on you'll have the G-Sync experience in 3D.
I'd just love to be able to run slower frame rates, more eye-candy, but in 3D.
[quote="bo3b"]
I'd just love to be able to run slower frame rates, more eye-candy[/quote]
And then, in motion, you'll get less eye candy than the lowest settings played at 120 fps would give you.
Fewer details. Fewer 3D "wow" moments, for example in explosions. It's amazing when all the particles and debris are visible in 3D. And you know what? You wouldn't even know what you're missing!
If you change the settings, you look for a difference. You stop and search for improvements. You can see the difference right away, everywhere. But all of this is judged by static image quality, and games tend to have a little motion in them ;).
On the other side there is the quality of motion: the level of detail clearly visible while tracking an object with your eyes. This is not visible everywhere, and is therefore harder to judge.
This also affects your fun level, subconsciously. You may focus on something like driving your car or aiming, and miss some things. You'll have more fun - you'll be aware of that - but you might not even know why you liked a particular scene. This includes the sense of speed in driving games, and more sources of fun (crucial when a game bores you in a particular section!), for example from simply exploring the world, immersing yourself in it, and admiring all the scenery more.
Greater immersion. This one is important, and it takes other shapes too: a stable 100 fps in BF4 makes demolishing everything with a tank so much more fun and immersive - those pieces of concrete, those particles, all the little details on everything in motion.
There is also another thing: you don't have a "beep sound" to let you know your frame rate just dropped below the desired limit.
You can have drops below 60 fps, or whatever your refresh rate is, making it even harder to spot the difference.
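Here's a minimal sketch of the "beep sound" I wish we had - a made-up render loop (not any real engine) that just complains when a frame misses the 120 Hz budget:
[code]
# Made-up render loop that flags frames which miss a 120 Hz budget.
import random, time

BUDGET_S = 1 / 120                    # 120 Hz target: ~8.3 ms per frame

def render_frame():
    # Stand-in for real rendering work; occasionally spikes like a busy scene.
    time.sleep(random.choice([0.005, 0.005, 0.005, 0.012]))

last = time.perf_counter()
for frame in range(60):
    render_frame()
    now = time.perf_counter()
    frame_ms = (now - last) * 1000
    if frame_ms > BUDGET_S * 1000:    # missed the budget: this would stutter
        print(f"frame {frame}: {frame_ms:.1f} ms - dropped below 120 fps!")
    last = now
[/code]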
This topic has fascinated me for more than a decade, and still, even now, I can miss the difference. This mechanism needed just a few months to affect me - the person who warns other gamers of this deceitful phenomenon. Still, I have a brain just like everybody else, prone to the same distractions as everybody.
[b]Every time[/b] I was surprised at how unaware I was of what I had been missing, when I finally went back to gaming with crystal clarity in motion.
Every damn time. I felt like a fool for making the same mistake again, but it also makes the topic even more interesting. :)
I'm starting to think I'll never learn to defend against it. It took only 10 minutes of playing BF4 without a perfect frame rate, and I had already started to forget what I was losing. This is one sneaky, deceitful phenomenon :D
No question that G-Sync is the real deal, and it makes a difference. I played enough twitch shooters to get a good sense of the value it provides.
But if it's a cage match between 3D and G-Sync, there is no chance in hell that I'm giving up 3D to play in smooth 2D. When I played in 2D with G-Sync I thought "this is nice" - it felt smooth, very enjoyable. But 3D often makes me laugh out loud with joy.
That's why I want both, and am impatient with people saying "it can't be done."
I totally agree. 3D gives you a unique experience in exploration, discovery and fun. I have spent hours in Far Cry 3 (a jewel thanks to the Helix patch) just exploring and enjoying the environment.
I have finally ordered the XB270HA. I hope to get it in a few days.
I think I'll take advantage of both 3D and G-Sync. When I play alone I prefer 3D, but playing "more competitively" online, I think (I hope) G-Sync will make the difference.