[quote="Shinra358"]I don't get any of these issues either. Vsync input delay? Never heard of it nor experienced it.[/quote]
I have, in a hundred different games. Most games aren't usable with it turned on at the power of video card I can usually afford, which is medium/high.
Hmm... looking at the specification, it appears that G-Sync is not purely a dynamic 'the display will follow whatever random refresh rate the GPU tosses out' sort of thing (i.e., recall the old NEC "MultiSync" monitors of the 1980s-1990s) but rather an expanded list of possibilities (if I read it right). So instead of just 30/60/120Hz, there are a few other refresh rates tossed in there.
With the hardware most of us have and the games many of us play, I'll agree it's mostly a non-issue for us. But if the concept can be applied to active shutter glasses, then LCD users might not be stuck with 15/30/60/120 rates, and in some cases could get smoother gameplay.
On the other hand, those of us with DLP projectors are SOL, as G-Sync would have to be built into the projector itself, along with some way of controlling the speed of the color wheel. I'm not knowledgeable enough about TI's DLP technology to know whether that's even a viable option.
I've always really hated irregular framerates. But maybe what I was hating all along is the stutter caused by a non-60Hz image being displayed on a 60Hz screen?
I hope so. Because it seems that with G-Sync, we're entering a world of permanently irregular framerates.
[quote="Shinra358"]I don't get any of these issues either. Vsync input delay? Never heard of it nor experienced it.[/quote] It's a very real thing. If you try a mouse-based shooter with vsync off, and then on, you'll notice the difference. Moving in the former feels instantaneous. Moving in the latter doesn't.
It might not be applicable to projectors, and it's probably much harder to detect with a gamepad. But with a mouse, it's enough to make many gamers detest vsync. (Not me: I detest screen tearing even more, so I always leave vsync on.)
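If you want a feel for where that lag comes from, here's a little toy Python simulation of it. The numbers are my own inventions (a 60Hz display and a 10ms render time), and real drivers buffer more frames than this, but it shows the basic effect: with vsync on, a finished frame sits around waiting for the next refresh tick.
[code]
# Toy model of input-to-photon latency, vsync on vs. off.
# Assumptions (mine, not Nvidia's): 60Hz display, 10ms render time,
# input sampled at the start of rendering, single-buffered timing.
import random

REFRESH = 1000.0 / 60.0   # ms per refresh tick (~16.7ms)
RENDER = 10.0             # ms to render one frame (invented figure)

def latency_vsync_off(start):
    # Scan-out begins the moment the frame is done (hence tearing).
    return RENDER

def latency_vsync_on(start):
    # The finished frame waits in the back buffer for the next tick.
    finish = start + RENDER
    next_tick = (finish // REFRESH + 1.0) * REFRESH
    return next_tick - start

samples = [random.uniform(0.0, REFRESH) for _ in range(10000)]
avg = lambda f: sum(f(t) for t in samples) / len(samples)
print("vsync off: %.1f ms average" % avg(latency_vsync_off))
print("vsync on:  %.1f ms average" % avg(latency_vsync_on))
[/code]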
[quote="Cookybiscuit"]Another Nvidia technology that had hope yet is rendered completely worthless because its proprietary.[/quote]
At least this will work on all games, unlike something like TXAA or HBAO which gets supported by like 3 games per year.
[quote="mbloof"]Humm.... looking at the specification it appears....................instead of 30/60/120hz there's a few other refresh rates tossed in there.[/quote] If so, that certainly makes their marketing rhetoric seem very misleading. Is there any way to know which ones, or how many extra ones there are?
[s]Reading over it, I'm not sure what they're trying to achieve. A 120Hz display doesn't have any noticeable input lag anyway; if they're planning to move the refresh rate along with the framerate, all that's going to do is introduce input lag, since you'll be down to getting new frames once every 60th of a second rather than every 120th, or whatever.[/s]
Reading over it some more... it seems like the frame is created, then the GPU tells the monitor to refresh; there's no wait time. So in theory, it's like having an infinite refresh rate, since, assuming there are no delays, the frame is shown upon its creation. It's an interesting concept, but I have my doubts as to its use outside of 60Hz displays.
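To make that concrete, here's a rough Python sketch of the difference (my own toy frame times, not anything from Nvidia's docs): with a fixed 60Hz refresh a finished frame has to wait for the next tick, while with the G-Sync approach it goes out as soon as it's done.
[code]
# When does each finished frame actually reach the screen?
# Fixed 60Hz vs. "refresh the moment the frame is ready".
# The frame times below are invented for illustration.
REFRESH = 1000.0 / 60.0
frame_times_ms = [12.0, 19.0, 15.0, 22.0, 14.0]

t = 0.0
for ft in frame_times_ms:
    t += ft                                     # the frame finishes rendering here
    next_tick = (t // REFRESH + 1.0) * REFRESH  # fixed refresh: wait for the tick
    print("done %6.1f ms -> fixed: %6.1f ms, g-sync-style: %6.1f ms (saved %4.1f)"
          % (t, next_tick, t, next_tick - t))
[/code]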
[quote="Volnaiskra"]I've always really hated irregular framerates. But maybe what I was hating all along is the stutter caused by a non-60hz image being displayed on a 60hz screen?[/quote]
I think maybe so because i've been a gamer for a long time now and i've only recently realized just how choppy un-synced content is. It hit me when i enabled it in Prince of Persia (2008), all the menu art animations were so smooth looking. Now in Metro Last Light, which is unusually playable with v-sync on, it definitely helps the immersion a bit.
Follow-up comment on motion interpolation: Just occurred to me that motion interpolation has been so good on my Sony and now Samsung, that id bet its accurate enough that it could be used to double the 3D refresh rate to 240hz. LCD can't switch that fast im assuming, but maybe with OLED.
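For what it's worth, the dumbest possible version of what those TVs do looks like this in Python/NumPy: a straight cross-fade between two frames. Real sets estimate motion vectors rather than blending, so treat this purely as a concept sketch.
[code]
# Naive "in-between frame" generation: cross-fade two real frames.
# Real TV interpolation is motion-vector based; this is just the concept.
import numpy as np

frame_a = np.zeros((4, 4)); frame_a[1, 1] = 1.0  # bright pixel at one spot
frame_b = np.zeros((4, 4)); frame_b[1, 2] = 1.0  # same pixel, moved right

tween = 0.5 * frame_a + 0.5 * frame_b            # the synthesized middle frame
print(tween)  # note the ghosting: energy smeared across both positions
[/code]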
I have to admit it's a bit confusing and/or misleading.
G-Sync refresh rate: 30-144Hz
2D refresh rate: 60, 85, 100, 120, 144Hz
Considering an LCD is persistent, they could list 0.05-144Hz as the refresh rate. Not sure why the 2D rate(s) would be different. (I was thinking of the FIXED 2D rates applying to everything but 3D Vision, but now realize that marketing mumbo-jumbo long ago equated '3D' with non-stereoscopic graphics.)
Sorry, confusion on my part. It appears that *some kind of content* will operate at 30-144Hz, which could be a very big deal for some.
[quote="mbloof"]I have to admit its a bit confusing and/or missleading.
G-Sync refresh rate: 30-144hz
2D refresh rate: 60, 85, 100, 120, 144hz
Considering a LCD is persistent they could list 0.05-144Hz as the refresh rate. Not sure why the 2D rate(s) would be different - (I was thinking of the FIXED 2D rates applying to everything but 3DVision but now realize that marketing mumbojumbo long ago equated '3D' to non-stereoscopic graphics).
Sorry, confusion on my part. It appears that *some kind of content* will operate at 30-144Hz which could be a very big deal for some.[/quote]Oh, ok. So it seems that any kind of GPU-powered content will be 30-144 (but Microsoft Office will have to make do with 60, 85, 100, 120, 144).
The more I think about it, the more I like this idea. It always struck me as odd that I've been able to tell the difference between 60fps and 59fps almost every single time there's an fps dip.
It didn't seem right that I'd notice such a tiny change in speed. But of course, now that I think about it, what I was noticing was not a change in speed but completely dropped and/or doubled frames (i.e., a bona fide glitch in the image, rather than just a tiny slowdown).
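You can actually count those glitches with a few lines of Python. Assuming the display simply shows the newest finished frame at each 60Hz tick (a simplification of what the driver really does), a 59fps source forces one repeated frame roughly every second:
[code]
# 59fps content on a 60Hz screen: about once a second a frame repeats,
# and that repeat is the hitch you see. (Simplified presentation model.)
SOURCE_FPS, DISPLAY_HZ = 59.0, 60.0

prev = -1
for tick in range(180):                          # three seconds of refreshes
    frame = int(tick / DISPLAY_HZ * SOURCE_FPS)  # newest finished source frame
    if frame == prev:
        print("refresh %3d: frame %3d shown twice <- visible stutter" % (tick, frame))
    prev = frame
[/code]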
All-new idea, and it seems terrific. I jumped the gun earlier talking about projectors; looking at it a little deeper, it should work for projectors too. It's changing when the buffer is shown, not changing anything about the native refresh hardware, like the spinning color wheel.
It looks to me like it will only ever slow the sync, never speed it up, to avoid serious problems with the physics of displays. Minimum frame rate on a monitor is 30fps, at which point it gives up and allows stuttering.
So the frame rate is variable, but only slows down to match GPU output, and caps at the native refresh rate.
No reason it wouldn't work on projectors too, although currently it appears to require DisplayPort.
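A minimal Python sketch of that clamping behavior, using the 30-144 figures from the spec sheet quoted above (what happens outside that window is my reading of the article, not anything official):
[code]
# Panel follows the GPU between a floor and its native maximum.
# 30/144 come from the quoted spec; out-of-range behavior is my
# interpretation, not a confirmed implementation detail.
MIN_HZ, MAX_HZ = 30.0, 144.0

def panel_refresh(gpu_fps):
    if gpu_fps >= MAX_HZ:
        return MAX_HZ     # capped at native refresh
    if gpu_fps < MIN_HZ:
        return MIN_HZ     # below the floor it gives up and stutter returns
    return gpu_fps        # in between, the display simply follows the GPU

for fps in (20, 45, 59, 90, 144, 200):
    print("%3d fps -> panel refreshes at %5.1f Hz" % (fps, panel_refresh(fps)))
[/code]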
Pretty good writeup:
[url]http://www.anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness[/url]
From there, check out this slide. 3D Vision in a slide, from NVidia! Always nice to know we aren't dead yet. More to the point, G-Sync is supposed to help 3D Vision play in the same way. That also means the glasses' infrared sync will have to be variable to match.
[img]http://images.anandtech.com/doci/7436/GEFORCE-G-SYNC-Performance_Chart.jpg[/img]
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
That Anandtech article has me convinced. I hope they release a DIY kit for my monitor. Otherwise, that VG248QE is yet another upgrade that's going to tempt me... :/
But why is 3D Vision only 100 or 120Hz?
This was a no-brainer, very easy to understand, as every PC gamer knows what the hell stutter/lag/tearing is, and finally getting rid of it is a dream come true. And you heard it from the geniuses themselves: the BIGGEST thing to happen since HD, blah blah!
This is totally revolutionary in every way, and for us 3D gamers it will be a hell of a lot better than we could ever dream of. As said, all this lag/stutter is the worst thing we have ever had to put up with, but now we will finally get rid of it.
Together with the other good stuff Nvidia showed, it just shows how frikking awesome they are and always have been.
AMD Mantle, blah blah... who the fuck is AMD, heh.
[quote="Volnaiskra"]That Anandtech article has me convinced. I hope they release a DIY kit for my monitor. Otherwise, that VG248QE is yet another upgrade that's going to tempt me... :/
But why is 3Dvision only 100 or 120Hz? [/quote]Pretty sure those are native monitor refresh rates. So, if you have 100Hz monitor it will still work. For 2D for example, it goes as low as 60Hz, which would be a normal monitor maximum rate. G-Sync would still help in your example of hitting 59 fps.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
But why doesn't it say
3D Vision: 100-120Hz (or, perhaps, 60-120Hz)?
It sounds to me like 3D Vision will remain at a static refresh rate (either 100 or 120), but I don't understand why.
[quote="Volnaiskra"]But why doesn't it say
3D Vision: 100-120hz (or, perhaps, 60-120hz)?
It sounds to me like 3Dvision will remain on a static refresh rate (either 100 or 120), but I don't understand why.[/quote]
Because even as low as 100Hz you can see the flicker plain as day, you can try it yourself by making a custom resolution in the Nvidia control panel.
I still don't see its value on 120/144Hz displays, though that said I obviously haven't tried it. I think the impact of this would be massive at 60Hz, but apparently it only works on TN displays currently.
Yeah. It doesn't seem like it'll really help with 3D Vision using shutter glasses. Though it would be excellent, I imagine, for polarized, glasses-free, or VR 3D.
Also, totally unrelated to 3D, but it also seems awesome for emulating arcade games, which all run at different refresh rates. Native Hz emulation.
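A quick Python illustration of why that matters, with commonly cited native rates for a few boards (figures vary by revision, so take them as approximate): none of these divide evenly into 60Hz, so a fixed panel has to drop or repeat frames, while a variable-refresh panel could just run at the game's own rate.
[code]
# Classic arcade boards ran at odd native rates; a fixed 60Hz panel can't
# show them cleanly. The rates below are commonly cited approximations.
ARCADE_RATES = {"Pac-Man": 60.61, "Mortal Kombat": 54.71, "R-Type": 55.02}

for game, hz in ARCADE_RATES.items():
    print("%-14s %.2f Hz -> one frame every %.3f ms (vs 16.667 ms at 60 Hz)"
          % (game, hz, 1000.0 / hz))
[/code]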