[quote="logzz"]Is G-Sync really this good, lol? I can't stand any sort of V-Sync, I find it a waste of time. Now, I can't comment on G-Sync because I have not tried it. In some games I'm getting 120 fps at 120 Hz, so I can't see G-Sync doing that much. I will give it a go when I get the Swift for sure, but to be fair I have not been interested in it at all. Maybe that will change. Haha, maybe G-Sync is going to be another thing that dies like 3D Vision and PhysX. I guess that's why I don't have much faith in it: NVidia don't seem interested in backing anything they bring out, and after a year or so it dies a sad death.[/quote]
A big YES IT IS, at least that's what all the previews and reviews say :)
I always play with V-Sync off, as I hate stutter and lag more than tearing. But I really hate tearing as well, since it totally rips the picture apart. After living with this all these years, finally saying goodbye to it will be the best thing ever, so next week can't come soon enough :D
And I can't see it going away, as this is a total game-changer. It's already a smashing success, and every gamer who has tried it once will never go back to V-Sync or tearing. G-Sync is the future!
Just to make sure: is the ROG Swift monitor able to use G-Sync when 3D Vision is enabled? As Bo3b said, it isn't possible with the G-Sync Kit. But how about the Asus monitor? Is G-Sync working in 3D Vision with the ROG Swift? Thanks.
Hi Sam,
If I'm not mistaken, several people have already confirmed that whenever G-Sync is enabled by the user, the monitor switches off 3D Vision, and vice versa.
Yep, I expect there is no difference on the Swift, because this is at the driver level. NVidia Control Panel:
[img]https://forums.geforce.com/cmd/default/download-comment-attachment/61408/[/img]
I'm super bummed, because 3D is where I really need G-Sync because of the low frame rates.
Do you think this can be fixed with a driver update? Or is it a hardware issue?
With working 3D Vision AND G-Sync, I would order the monitor within the minute :-)
Best, Sam
I don't think anyone knows for sure. I've seen a lot of people guess that they are mutually incompatible technologies, but my guess is that they could both run at once in principle. Whether this specific hardware can or not is an open question.
For example, I know for a fact that V1 3D Vision glasses will shutter as low as 85 Hz (42.5 Hz per eye). They used to allow that on CRTs; not sure if they still do. We know that the glasses will run up to 120 Hz as their normal frequency. Getting up to 144 Hz would be logical, but maybe they are too slow. Why can't we already use 3D Vision at 144 Hz? Not clear.
If we had that 85-144 Hz range for G-Sync, that would still be a pretty good active range. Not as good as the 30-144 of 2D, but still good.
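To spell out the per-eye arithmetic behind those numbers, here's a throwaway sketch (the `per_eye_hz` function name is just illustrative; it assumes the usual alternate-frame shuttering where each eye gets every other frame):

```python
def per_eye_hz(panel_hz: float) -> float:
    # Active-shutter glasses alternate left/right frames,
    # so each eye sees half the panel's refresh rate.
    return panel_hz / 2.0

# The hypothetical 85-144 Hz G-Sync window for 3D discussed above
print(per_eye_hz(85), per_eye_hz(144))   # 42.5 72.0
```

So a variable 85-144 Hz panel range would mean each eye varies between 42.5 and 72 Hz, which is why flicker at the bottom end is the concern.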
They probably don't want to go below 85 Hz for flicker reasons, although I think that is largely overblown, and in any case it should be a user preference, not some hard-coded thing that is 'good for everyone.'
But the real question, the real unknown, is whether the glasses can do [i]dynamic[/i] shuttering, where the frequency changes all the time as it does with G-Sync. If they have some sort of fixed timing circuit, then the current tech would not match. My reading on how the glasses work suggests that it's just an IR signal for on/off, not something complicated with a timing circuit.
If that's true, then there is no technical reason why it wouldn't work; it's just some ill-founded fear of flicker that prevents them both working at once. Or perhaps driver-level design flaws or something like that.
If all that is true, it would be possible with a driver update. But, like with all things NVidia, don't hold your breath. I'd like to buy one too, but there is no chance I will do that unless it supports both 3D and G-Sync at the same time.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
3D Vision impressions, and a side-by-side comparison with an IPS panel, here...
[url]http://forums.overclockers.co.uk/showthread.php?p=26685844&posted=1#post26685844[/url]
BTW, just got the screen, and let me tell you, it's the best thing I have ever purchased!
It totally rules and changes the game: true 8-bit colours, great contrast, and super smooth together with G-Sync. It's a wonder, no doubt about it!
ROG SWIFT, the best monitor ever made. Thanks Asus and nVidia for G-Sync, it's a total game-changer, I love it :D
3D Vision and GSync will not work together.