http://www.neogaf.com/forum/showpost.php?p=86533216&postcount=1003
There's an Nvidia rep posting live in the neogaf thread and he mentions that 3D Vision is even better with G-Sync. So that's a major relief.
[url]http://www.geforce.com/whats-new/articles/introducing-nvidia-g-sync-revolutionary-ultra-smooth-stutter-free-gaming[/url]
Should also be noted it's going to be sold as a DIY module that most likely will be compatible with existing 3D Vision monitors.
As someone who never expected them to show VR, this probably exceeds my expectations. This is pretty cool.
EDIT: Carmack is going to be on stage with Sweeney and the Dice guy in a little bit. Hopefully something like this will be allowed to get added onto the Rift. That way they could ship with a 90-120Hz panel but wouldn't have lag/tearing if the framerate drops now and then (huge no-nos for VR).
WTF. I don't "get" exactly what this G-Sync actually does. I don't get screen tearing or stuttering, etc., and I'm fine with vsync (adaptive). Is it mainly a latency issue? I don't play competitive MP and often use a projector, which has more delay than my BenQ monitors. Sounds like a bunch of shit. No mention of 3D or VR. BOOOOO
[quote="Conan481"]WTF. I don't "get" exactly what this gsync actually does? I don't get screen tearing or stutering, etc and I'm fine with vsync (adaptive) Is it mainly a latency issue? I don't play competitive MP and often use a projector which has more delay then my benq mointors. Sounds like a bunch of Shit. No mention of 3D or VR. BOOOOO [/quote]
This GSync has the potential - if it works like they say it does - to be one of the biggest innovations in the PC market in a long while.
Basically, VSync and screen tearing will be a thing of the past.
Rather than forcing your GPU to sync frames with your monitor's refresh rate, your monitor syncs its refresh rate to whatever you're capable of rendering.
So if you can only push out 47 fps, your monitor's refresh rate will be 47Hz, and it will dynamically adapt to whatever your GPU is capable of.
This means that VSync will be completely unneeded, which means there's no need to deal with the sloppy input lag it introduces.
It also means that screen tearing will be entirely gone, as the screen will always show every individual frame with no need to cut a frame short or display it torn.
This is pretty huge. It solves one of the big issues in gaming of "30fps vs 60fps" as well as "sluggish Vsync with no tears" or "fluid responsive controls with tearing."
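If it helps to make the "47 fps becomes 47Hz" point concrete, here's a tiny back-of-the-envelope Python sketch (my own illustrative numbers and helper functions, not anything from Nvidia) comparing when frames actually reach the screen with plain VSync on a 60Hz panel versus a display that refreshes whenever a frame is ready:
[code]
# Purely illustrative sketch (assumed numbers, not Nvidia's implementation):
# a GPU rendering a steady 47 fps, shown on a fixed 60Hz panel with VSync
# versus on a display that refreshes the moment each frame is ready.

RENDER_TIME = 1.0 / 47        # ~21.3 ms per frame from the GPU
REFRESH_INTERVAL = 1.0 / 60   # fixed 60Hz scanout tick (~16.7 ms)
NUM_FRAMES = 8

def vsync_display_times():
    """With VSync, a finished frame waits for the next fixed refresh tick."""
    times = []
    for i in range(1, NUM_FRAMES + 1):
        ready = i * RENDER_TIME
        ticks = -(-ready // REFRESH_INTERVAL)   # ceiling: round up to the next 60Hz tick
        times.append(ticks * REFRESH_INTERVAL)
    return times

def variable_refresh_display_times():
    """With G-Sync-style variable refresh, the panel refreshes when the frame is ready."""
    return [i * RENDER_TIME for i in range(1, NUM_FRAMES + 1)]

for label, times in (("vsync @ 60Hz", vsync_display_times()),
                     ("variable refresh", variable_refresh_display_times())):
    gaps_ms = [round((b - a) * 1000, 1) for a, b in zip(times, times[1:])]
    print(label, "frame-to-frame gaps:", gaps_ms)
[/code]
The VSync run prints an uneven mix of ~16.7ms and ~33.3ms frame-to-frame gaps (the stutter people complain about), while the variable-refresh run is a flat ~21.3ms, i.e. a steady 47Hz.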
What if the super secret reveal (that's related to G-Sync) is that they've licensed it to Oculus to be included in the Rift?
Really looking forward to them finishing lunch so we can hear the roundtable with Carmack. I want to see just how much relevance he thinks it'll have for VR.
I thought the tearing was because you were getting 'too many' FPS, not too little ... either way it does nothing for me as it can't compete with my 65" 3D DLP. :)
It has a low persistence mode that's superior to even the LightBoost hack. I wish my Asus was the monitor they were going to let you DIY upgrade. Might have to just ditch it for a new one next year.
P.S. Low persistence mode is something that Carmack has been harping on as being essential for VR. So in a perfect world, they either come up with something similar, or they can source a panel that's compatible (he already tweeted that none of the current panels under consideration would work with it).
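Rough back-of-the-envelope for why persistence matters so much (my own numbers, just to illustrate the idea): perceived smear while your eye tracks motion is roughly how long each frame stays lit times how fast the object moves across the screen.
[code]
# Rough rule of thumb (an assumption on my part, not from the article):
# perceived blur while eye-tracking ~= frame persistence * motion speed.

def blur_width_px(persistence_ms, motion_px_per_s):
    """Approximate smear width in pixels for eye-tracked motion."""
    return persistence_ms / 1000.0 * motion_px_per_s

MOTION = 1000  # a fast pan: 1000 pixels per second

for label, persistence_ms in [("60Hz sample-and-hold", 16.7),
                              ("120Hz sample-and-hold", 8.3),
                              ("~1.5ms strobed backlight (LightBoost-like)", 1.5)]:
    print(f"{label}: ~{blur_width_px(persistence_ms, MOTION):.1f} px of smear")
[/code]
That's the gap Carmack keeps pointing at: even 120Hz sample-and-hold smears several pixels, while a short strobe gets you close to one.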
That 3D Vision "comment" doesn't make sense to me. "Superior 3D quality." Prob. just marketing BS. The good news is they will probably be rolling out more 120/144Hz 2D/3D monitors. Though it might not last, since they've got to pay Nvidia for this chip + a 3D Vision license...
Not saying it is or whatever, but it's important to note that Nvidia does have research + development going toward making an HMD.
Co-founder of helixmod.blog.com
If you like one of my helixmod patches and want to donate, you can send to me through PayPal - eqzitara@yahoo.com
Maybe not just fluff. If I understand this right, they will also sync with LightBoost on the right monitors, and so that will sync with the actual frames delivered as well. Should be dramatically better for blur. This would help 3D also, on monitors.
[s]For projectors this seems like it won't work. The spinning wheel isn't going to be able to spin up and down to match the frame rates.[/s]
OK, now G-Sync at least qualifies as 'something big.'
Edit: projectors will work too. It doesn't affect the native refresh rate.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
[quote="eqzitara"]
Not saying it is or whatever, but it's important to note that Nvidia does have research + development going toward making an HMD.[/quote]
Oh no doubt. That prototype is pretty cool too.
I'm not sure how to take that quote, but this is definitely what Rein was talking about. They had a round table with Sweeney and Carmack and both said it was the most significant advancement in display tech since the transition to HD screens. This is definitely what the tweet was about.
Bonus points for Mark Rein. LOL. During the roundtable discussion, he asked Johan Andersson how much Dice was paid to incorporate Mantle tech. A moment of humor in a great discussion.
[quote="Paul33993"]http://www.neogaf.com/forum/showpost.php?p=86533216&postcount=1003
There's an Nvidia rep posting live in the neogaf thread and he mentions that 3D Vision is even better with G-Sync. So that's a major relief.[/quote]
I wonder how 3D would be better beyond v-sync + no lag?
Btw: if they added toggleable frame interpolation that had no lag, that would be sweet. It would smooth out YouTube videos as well as games, helping frame rate and making them look even more realistic.
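For what it's worth, the "no lag" part is the hard bit. Here's a toy Python sketch (a naive 50/50 blend of adjacent frames, nothing like the motion-compensated interpolation real TVs do) just to show where the inserted frames would slot into the stream:
[code]
# Toy frame-rate doubling by blending adjacent frames.
# Real smooth-motion interpolation is motion-compensated and far more involved;
# this naive average only shows where the inserted frames would go.

def blend(frame_a, frame_b):
    """Average two frames pixel by pixel (frames are flat lists of 0-255 values)."""
    return [(a + b) // 2 for a, b in zip(frame_a, frame_b)]

def double_framerate(frames):
    """Insert one blended frame between every pair of source frames."""
    out = []
    for cur, nxt in zip(frames, frames[1:]):
        out.append(cur)
        out.append(blend(cur, nxt))
    out.append(frames[-1])
    return out

# three tiny 4-pixel "frames" in -> five frames out
source = [[0, 0, 0, 0], [100, 100, 100, 100], [200, 200, 200, 200]]
print(double_framerate(source))
[/code]
Note that even this trivial version has to wait for the *next* frame before it can build the in-between one, which is exactly where the lag comes from.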
Is G-Sync working on something big?
Gigabyte Z370 Gaming 7 - 32GB RAM - i9-9900K - GigaByte Aorus Extreme Gaming 2080TI (single) - Game Blaster Z - Windows 10 X64 build #17763.195 - Define R6 Blackout Case - Corsair H110i GTX - Sandisk 1TB (OS) - SanDisk 2TB SSD (Games) - Seagate EXOs 8 and 12 TB drives - Samsung UN46c7000 HD TV - Samsung UN55HU9000 UHD TV - Currently using ACER PASSIVE EDID override on 3D TVs - LG 55
46" Samsung ES7500 3DTV (checkerboard, high FOV as desktop monitor, highly recommend!) - Metro 2033 3D PNG screens - Metro LL filter realism mod - Flugan's Deus Ex:HR Depth changers - Nvidia tech support online form - Nvidia support: 1-800-797-6530