PCPer's live stream about the G-SYNC and 3D Vision connection.
[quote="Paul33993"][quote="Shinra358"]I've seen that not all games are compatible with it. That's what I meant by "another feature that doesn't work with everything".[/quote] The only games that aren't compatible are terrible console ports that are locked at 30fps or have their animation tied to framerate. Both of those scenarios are extremely rare and not even worth worrying about. Of course there needs to be an asterisk explaining why not everything will work, but 99.999 of the games will work no problem.[/quote] Well considering 3Dvision, Geforce Experience, 3D Video Player, SLI, Streaming, Shadowplay, etc incompatibilities...........................................................................
Paul33993 said:
Shinra358 said:I've seen that not all games are compatible with it. That's what I meant by "another feature that doesn't work with everything".


The only games that aren't compatible are terrible console ports that are locked at 30fps or have their animation tied to framerate. Both of those scenarios are extremely rare and not even worth worrying about. Of course there needs to be an asterisk explaining why not everything will work, but 99.999% of the games will work no problem.


Well, considering the 3D Vision, GeForce Experience, 3D Video Player, SLI, Streaming, ShadowPlay, etc. incompatibilities...

Model: Clevo P570WM Laptop
GPU: GeForce GTX 980M ~8GB GDDR5
CPU: Intel Core i7-4960X CPU +4.2GHz (12 CPUs)
Memory: 32GB Corsair Vengeance DDR3L 1600MHz, 4x8GB
OS: Microsoft Windows 7 Ultimate

#16
Posted 10/23/2013 06:46 PM   
How it works has already been explained, and it's pretty simple. Quite frankly, it's stunning that LCDs didn't do this from day one. Instead of copying CRT limitations (which existed for valid technical reasons), they should have exploited LCD advantages from the start. They didn't. Thankfully somebody has finally questioned why LCDs were being treated like CRTs. So I don't really see what this has to do with anything you just listed. It'll be incompatible with games with a locked framerate, which are thankfully very rare. This is just something that should have happened eons ago.

#17
Posted 10/23/2013 07:36 PM   
[quote="mdrejhon"]Nope, due to flicker issues of variable-rate shutter glasses. It's not technically impossible, but definitely not on a VG248QE. [/quote]What makes you say that? NVidia glasses simply respond to the infra-red from the pyramid, which is triggered by a USB command. The glasses could easily run at 100Hz. In fact, I've personally run them at 85Hz with no trouble. The VG248QE also supports a 100Hz refresh.
mdrejhon said:Nope, due to flicker issues of variable-rate shutter glasses.
It's not technically impossible, but definitely not on a VG248QE.
What makes you say that? NVidia glasses simply respond to the infra-red from the pyramid, which is triggered by a USB command. The glasses could easily run at 100Hz. In fact, I've personally run them at 85Hz with no trouble.

The VG248QE also supports a 100Hz refresh.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#18
Posted 10/24/2013 06:13 AM   
Shutter glasses run at 60Hz per eye. G-SYNC framerate drops, e.g. to 30fps, would cause the shutter glasses to run at only 30Hz. And imagine the erratic flicker as the shutter rate jumped up and down, 30Hz->40Hz->60Hz->35Hz->55Hz, over a one-second period.
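
For concreteness, here is the arithmetic behind that claim as a few lines of Python (illustrative only; it assumes, as frame-sequential 3D requires, that the per-eye shutter rate would have to track the game framerate):

# Each game frame yields one view per eye, so the per-eye shutter rate
# tracks the game framerate, and the monitor must refresh at twice that
# rate to show the L and R views on consecutive refreshes.
for fps in (30, 40, 60, 35, 55):       # the fluctuating framerates above
    per_eye_hz = fps                   # one new view per eye per game frame
    refresh_hz = 2 * fps               # L and R on consecutive refreshes
    print(f"{fps} fps -> shutters at {per_eye_hz} Hz per eye "
          f"(monitor at {refresh_hz} Hz)")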

However, there ARE other opportunities to reduce 3D Vision latency and improve 3D quality.
-- Faster frame delivery times to the monitor (1/144sec) regardless of refresh rate, while the monitor keeps operating the 3D glasses at a fixed frequency.
-- G-SYNC possibly behaving as an on-board (monitor) triplebuffer (x 2 = six buffers), so that frames can be delivered as quickly as possible from the GPU to monitor, while the monitor continues to operate the 3D glasses at a fixed frequency (see the sketch after this list).
-- Better sync of time of left eye (shutter open time period) to game engine time, and better sync of right eye (shutter open time period) to game engine time (independently of each other). Right now in older 3D Vision, both frames are rendered at the same time, but are presented to your eyes one-after-the-other. This creates vision depth-change side effects when strafing left versus strafing right. This can be fixed with G-SYNC.
-- Better strobe backlight and better Y-axis-compensated panel overdrive, to reduce banding artifacts in crosstalk (e.g. more crosstalk/ghosting at the bottom edge of the screen is common on older 3D Vision monitors. This is because of less time for LCD pixel transitions before the glasses shutters open/backlight gets strobed). Better Y-axis compensated overdrive can help smooth out the pixel freshness differences a bit.
etc.
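
The six-buffer idea above is speculation about what the G-SYNC module's on-board memory could do. As a rough illustration only (names and structure are mine, not NVIDIA's), here is a minimal Python sketch of the general shape: the GPU fills small per-eye queues as fast as it renders, while the monitor drains them at a fixed shutter cadence and always scans out the newest complete frame:

from collections import deque

# Illustrative model only: the monitor holds a small queue per eye that
# the GPU can fill as fast as it renders ("triplebuffer x 2 = six buffers"),
# while the glasses/backlight keep a fixed shutter cadence.
class PerEyeFrameStore:
    def __init__(self, depth=3):
        self.queues = {"L": deque(maxlen=depth), "R": deque(maxlen=depth)}

    def deliver(self, eye, frame):
        # GPU side: push whenever a frame finishes; never blocks.
        self.queues[eye].append(frame)

    def scanout(self, eye):
        # Monitor side: called at the fixed shutter cadence.
        q = self.queues[eye]
        if not q:
            return None    # nothing new arrived: repeat the last frame
        newest = q.pop()   # newest delivered frame wins
        q.clear()          # anything older is now stale
        return newest

store = PerEyeFrameStore()
store.deliver("L", "frame42-left")
store.deliver("R", "frame42-right")
print(store.scanout("L"), store.scanout("R"))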

Basically, G-SYNC will help 3D in other ways, but the frequency of the 3D shutters will remain fixed. The 3D with G-SYNC will be much, much, much better, I predict, yes. You may have variable-rate frame delivery to the monitor. But the shutters in the glasses won't be dynamically variable rate.

(My knowledge above comes from a Casio 1000fps high-speed camera pointed through shutter glasses, plus playing with LightBoost monitors using different motion tests at www.testufo.com, especially in full screen mode, including the Flicker Test and the Blur Trail test in Yellow/Blue mode, which immediately revealed the basic Y-axis-compensated overdrive algorithm in earlier LightBoost monitors. There's lots of room for improvement.)

#19
Posted 10/24/2013 11:00 AM   
[quote="mdrejhon"]Basically, G-SYNC will help 3D in other ways, but the frequency of the 3D shutters will remain fixed. The 3D with G-SYNC will be much, much, much better, I predict, yes. You may have variable-rate frame delivery to the monitor. But the shutters in the glasses won't be dynamically variable rate.[/quote]Well, I totally disagree. The NVidia shutter glasses already know how to do variable frequency. As I said before, I've personally run them on monitors that were refreshing at 85Hz. That makes 42.5Hz for each eye. When I said they would run from 144Hz to 100Hz, that's using the monitor reference. Of course each eye gets half of that rate. Going down to 100Hz, 50Hz per eye would not be a problem for flicker. Nvidia really does not like to go down to 30Hz per eye, which is why the early info suggests 100Hz as the minimum.
mdrejhon said:Basically, G-SYNC will help 3D in other ways, but the frequency of the 3D shutters will remain fixed. The 3D with G-SYNC will be much, much, much better, I predict, yes. You may have variable-rate frame delivery to the monitor. But the shutters in the glasses won't be dynamically variable rate.
Well, I totally disagree. The NVidia shutter glasses already know how to do variable frequency. As I said before, I've personally run them on monitors that were refreshing at 85Hz. That makes 42.5Hz for each eye.

When I said they would run from 144Hz to 100Hz, that's using the monitor reference. Of course each eye gets half of that rate. Going down to 100Hz, 50Hz per eye would not be a problem for flicker. Nvidia really does not like to go down to 30Hz per eye, which is why the early info suggests 100Hz as the minimum.
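
As a quick check of that arithmetic, in Python: frame-sequential glasses give each eye half the monitor's refresh rate.

for refresh_hz in (144, 120, 100, 85):
    # each eye sees every other refresh, so the per-eye rate is refresh / 2
    print(f"{refresh_hz} Hz refresh -> {refresh_hz / 2:.1f} Hz per eye")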

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#20
Posted 10/24/2013 11:11 AM   
bo3b said:
mdrejhon said:Basically, G-SYNC will help 3D in other ways, but the frequency of the 3D shutters will remain fixed. The 3D with G-SYNC will be much, much, much better, I predict, yes. You may have variable-rate frame delivery to the monitor. But the shutters in the glasses won't be dynamically variable rate.
Well, I totally disagree. The NVidia shutter glasses already know how to do variable frequency. As I said before, I've personally run them on monitors that were refreshing at 85Hz. That makes 42.5Hz for each eye.
I know, but that's not the point.
I know the shutter glasses can do it.
And yes, it is technically/theoretically possible.
But it's not going to be enabled with THIS generation of G-SYNC with existing shutter glasses.

Imagine the game running 33fps->45fps->60fps->37fps->44fps, very rapidly, within a single 1/4-second period; modulating the refresh rate that quickly is going to create visible flicker. And you can't do continuously dynamic variable refresh with shutter glasses without objectionable flicker, since:
- Shutter glasses require LightBoost to look good
- G-SYNC's LightBoost is fixed-refresh-rate according to John Carmack (http://www.blurbusters.com/confirmed-nvidia-g-sync-includes-a-strobe-backlight-upgrade/)

Therefore, since LightBoost can't be combined with G-SYNC, you are not going to get variable refresh rate shutter glasses unless you want poor non-LightBoost 3DVision1 crosstalk COMBINED with erratic flickering in the shutter glasses. A good user experience is not possible. The experience would be poor enough that nVidia would obviously not include continuously dynamic variable-rate shutter operation in this round of the G-SYNC release.
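
A toy calculation makes the strobe problem concrete (the 1.5ms pulse width is a made-up number, not LightBoost's actual timing): with a fixed-length strobe pulse per refresh, perceived brightness is roughly pulse_width / refresh_period, so a swinging refresh rate means a swinging brightness.

PULSE_MS = 1.5  # hypothetical fixed strobe pulse per refresh

# Refresh rates tracking the 33->45->60->37->44 fps example (x2 for 3D):
for refresh_hz in (66, 90, 120, 74, 88):
    period_ms = 1000.0 / refresh_hz
    duty = PULSE_MS / period_ms   # fraction of each period the backlight is lit
    print(f"{refresh_hz:>3} Hz -> duty {duty:6.1%} (perceived brightness tracks this)")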

But, definitely, G-SYNC 3D will be vastly superior, and might have found a way to do variable-framerate 3D (on the frame-delivery chain side of things, from cable to monitor), but with fixed-rate shutters *and* fixed-rate LightBoost enabled.

#21
Posted 10/24/2013 12:23 PM   
@mdrejhon

I have a question about LightBoost monitors that you might be able to answer. Sent you a PM.

Gigabyte Gaming 5 Z170X, i7-6700K @ 4.4GHz, Asus RTX 2080 Ti Strix OC, 16GB DDR4 Corsair Vengeance 2666, LG 60uh8500 and 49ub8500 passive 4K 3D EDID, Dell S2716DG.

#22
Posted 10/24/2013 01:20 PM   
[quote="mdrejhon"]Imagine the game running 33fps->45fps->60fps->37fps->44fps; very rapidly, in a single 1/4sec period; that is going to create visible flicker if you modulate the variable refresh rate quickly. [/quote]This doesn't make sense to me. Why would varying the shuttering make flicker? It's already flickering at 50Hz per eye, and NVidia is not going to drop the refresh rate to 30fps in 3D Vision mode to avoid that obvious flicker at 15Hz per eye. My take on their announcement is that they won't allow 3D Vision to drop below 100Hz. So in your example of 33fps-60fps, G-Sync would not apply because it's below the monitor minimum refresh. As long as we keep the shuttering above 100Hz, 50 per eye, I don't see how g-sync will cause any added flicker, even if it varies wildly between 100-144Hz. [quote="mdrejhon"]And you can't do continuously dynamic variable refresh with shutter glasses without objectionable flicker, since: - Shutter glasses require LightBoost to look good - G-SYNC's LightBoost is fixed-refresh-rate [url=http://www.blurbusters.com/confirmed-nvidia-g-sync-includes-a-strobe-backlight-upgrade/]according to John Carmack[/url] Therefore, since LightBoost can't be combined with G-SYNC, you are not going to get variable refresh rate shutter glasses unless you want poor non-LightBoost 3DVision1 crosstalk [i]COMBINED[/i] with erratic flickering in the shutter glasses. Good user experience is not possible. The user experience would be poor, that nVidia would obviously not include continuously dynamic variable-rate shutter operation with this round of G-SYNC release.[/quote]Pretty much still disagree. You seem to be requiring Lightboost with 3D Vision. Maybe other people can comment, since I use a projector and have limited experience with the monitors, but I know quite a few people here use non-Lightboost monitors with 3D Vision and are quite happy with the experience. I read your link to the Carmack tweet, and it doesn't really answer the question as to whether Lightboost can adapt to variable refresh rate. Pretty sure we can agree NVidia is looking at this, since they make both technologies. I still don't understand why you think it's going to lead to more flickering with shutter glasses to have variable frequency refresh. As long as we continue to stay above a minimum refresh rate of 100Hz, I don't see how it can add flicker. Please elaborate.
mdrejhon said:Imagine the game running 33fps->45fps->60fps->37fps->44fps, very rapidly, within a single 1/4-second period; modulating the refresh rate that quickly is going to create visible flicker.
This doesn't make sense to me. Why would varying the shuttering cause flicker? It's already flickering at 50Hz per eye, and NVidia is not going to let the refresh rate drop to 30fps in 3D Vision mode, precisely because that would mean obvious flicker at 15Hz per eye. My take on their announcement is that they won't allow 3D Vision to drop below 100Hz. So in your example of 33fps-60fps, G-Sync would not apply because it's below the monitor's minimum refresh.

As long as we keep the shuttering above 100Hz, 50 per eye, I don't see how g-sync will cause any added flicker, even if it varies wildly between 100-144Hz.
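
Putting that in numbers (just the division, under the assumed 100Hz floor):

# With a 100Hz floor, the per-eye shutter period varies only within a
# narrow band even when the refresh rate swings across the whole range.
for refresh_hz in (100, 120, 144):
    per_eye_hz = refresh_hz / 2
    print(f"{refresh_hz} Hz -> {per_eye_hz:.0f} Hz per eye, "
          f"{1000.0 / per_eye_hz:.1f} ms per shutter cycle")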

mdrejhon said:And you can't do continuously dynamic variable refresh with shutter glasses without objectionable flicker, since:
- Shutter glasses require LightBoost to look good
- G-SYNC's LightBoost is fixed-refresh-rate according to John Carmack

Therefore, since LightBoost can't be combined with G-SYNC, you are not going to get variable refresh rate shutter glasses unless you want poor non-LightBoost 3DVision1 crosstalk COMBINED with erratic flickering in the shutter glasses. A good user experience is not possible. The experience would be poor enough that nVidia would obviously not include continuously dynamic variable-rate shutter operation in this round of the G-SYNC release.
Pretty much still disagree. You seem to be requiring Lightboost with 3D Vision. Maybe other people can comment, since I use a projector and have limited experience with the monitors, but I know quite a few people here use non-Lightboost monitors with 3D Vision and are quite happy with the experience.

I read your link to the Carmack tweet, and it doesn't really answer the question as to whether Lightboost can adapt to variable refresh rate. Pretty sure we can agree NVidia is looking at this, since they make both technologies.

I still don't understand why you think it's going to lead to more flickering with shutter glasses to have variable frequency refresh. As long as we continue to stay above a minimum refresh rate of 100Hz, I don't see how it can add flicker. Please elaborate.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#23
Posted 10/25/2013 05:15 AM   
[quote="mdrejhon"]And you can't do continuously dynamic variable refresh with shutter glasses without objectionable flicker, since: - Shutter glasses require LightBoost to look good - G-SYNC's LightBoost is fixed-refresh-rate [url=http://www.blurbusters.com/confirmed-nvidia-g-sync-includes-a-strobe-backlight-upgrade/]according to John Carmack[/url][/quote]Also of note, on that neoGAF quote, you have misquoted AndyBNV. He's talking about your ToastyX as the unofficial implementation, [i]not [/i]Lightboost.
mdrejhon said:And you can't do continuously dynamic variable refresh with shutter glasses without objectionable flicker, since:
- Shutter glasses require LightBoost to look good
- G-SYNC's LightBoost is fixed-refresh-rate according to John Carmack
Also of note, on that neoGAF quote, you have misquoted AndyBNV. He's talking about your ToastyX utility as the unofficial implementation, not LightBoost.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#24
Posted 10/25/2013 05:22 AM   
Guys, I just asked Asus ROG when they will make higher resolution (1440p and 4K) monitors with 3D and G-Sync, and they hit me with this comment on Facebook:


ASUS REPUBLIC OF GAMERS PG278Q? G-Sync is not compatible with 3D Vision sorry.
Like · Reply · 45 minutes ago

I know that this monitor, the PG278Q, isn't a 3D monitor, but it is 1440p and has G-Sync integrated. But as for saying that G-Sync isn't compatible with 3D, I won't believe that unless they give me an article stating why.
I don't understand why they just lied to me.

#25
Posted 01/13/2014 02:47 AM   
I tried increasing and decreasing the refresh rate on my 3D monitor before, and it made the glasses act weird. They flickered fine for a few seconds, then would either blank out for a second or two or just stop flickering for a second or two. I think this is because they can't do more than a certain number of flickers in a set period of time, so they fell behind and had to resync again.
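
One plausible model of that behavior, and it is pure speculation about the glasses' firmware: the glasses lock onto an expected shutter period and blank out to resynchronize when the sync signal drifts outside a tolerance window. A toy Python version, with the tolerance made up:

TOLERANCE = 0.10  # hypothetical: +/-10% period drift before the glasses re-lock

def simulate(periods_ms, locked_ms):
    for p in periods_ms:
        if abs(p - locked_ms) / locked_ms > TOLERANCE:
            print(f"{p:5.1f} ms: outside lock window -> blank out and resync")
            locked_ms = p   # re-lock onto the new cadence
        else:
            print(f"{p:5.1f} ms: shutter fires normally")

# Sync periods while the refresh rate changes (8.3ms = 120Hz, 10ms = 100Hz):
simulate([8.3, 8.3, 10.0, 10.0, 8.3], locked_ms=8.3)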

#26
Posted 01/13/2014 04:00 AM   
[quote="WhiteSkyMage"]Guys I just asked Asus ROG, when they will make higher resolution (1440p and 4K) monitors with 3D and G-Sync and they poisoned me with this comment on Facebook: ASUS REPUBLIC OF GAMERS PG278Q? G-Sync is not compatible with 3D Vision sorry. Like · Reply · 45 minutes ago I know that this monitor PG278Q isn't 3D monitor but it is 1440p and is G-Sync integrated. But saying that G-Sync isn't compatible with 3D, i won't believe this ever unless they give me and article stating why. I dont understand why they just lied to me.[/quote] Specs never did show it as being compatible with G-Sync. You can use 3D Vision on compatible monitors but it'll be either/or. Do you want G-Sync? Or do you want 3D Vision? It's just not practical. Even if they released a 3D Vision 3 kit that could sync with the variable refresh of the monitor, glasses flickering at 60hz would give most people a major headache. It'd be bad publicity and a bunch of returned products. G-Sync only makes practical sense on passive screens.
WhiteSkyMage said:Guys, I just asked Asus ROG when they will make higher resolution (1440p and 4K) monitors with 3D and G-Sync, and they hit me with this comment on Facebook:


ASUS REPUBLIC OF GAMERS PG278Q? G-Sync is not compatible with 3D Vision sorry.
Like · Reply · 45 minutes ago

I know that this monitor, the PG278Q, isn't a 3D monitor, but it is 1440p and has G-Sync integrated. But as for saying that G-Sync isn't compatible with 3D, I won't believe that unless they give me an article stating why.
I don't understand why they just lied to me.


Specs never did show it as being compatible with G-Sync. You can use 3D Vision on compatible monitors but it'll be either/or. Do you want G-Sync? Or do you want 3D Vision? It's just not practical. Even if they released a 3D Vision 3 kit that could sync with the variable refresh of the monitor, glasses flickering at 60Hz would give most people a major headache. It'd be bad publicity and a bunch of returned products. G-Sync only makes practical sense on passive screens.

#27
Posted 01/13/2014 12:13 PM   
Not sure where this idea of GSync or 3D Vision as an either/or came from, but it sure seems wrong to me.

If you look at this slide directly from NVidia:

[NVIDIA slide: http://images.anandtech.com/doci/7436/GEFORCE-G-SYNC-Performance_Chart_575px.jpg]

It specifically calls out 3D Vision as being "Superior" with Gsync. Why would they add that if it had no effect, or was mutually exclusive? If it had no effect, why put it on the slide at all?

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#28
Posted 01/14/2014 04:53 AM   
[quote="bo3b"]It specifically calls out 3D Vision as being "Superior" with Gsync. Why would they add that if it had no effect, or was mutually exclusive? If it had no effect, why put it on the slide at all?[/quote] I saw that spec sheet too... but it sure looks like 3D is still fixed at 100hz or 120hz. Does not appear to be variable like the G-Sync spec of 30 - 144 hz. That being said, Id love to know how G-Sync then improves 3D quality. It would be fabulous to actually hear something from NVidia about how 3D actually works and fits in with G-Sync. (Surround too for that matter.)
bo3b said:It specifically calls out 3D Vision as being "Superior" with Gsync. Why would they add that if it had no effect, or was mutually exclusive? If it had no effect, why put it on the slide at all?


I saw that spec sheet too... but it sure looks like 3D is still fixed at 100Hz or 120Hz. It does not appear to be variable like the G-Sync spec of 30-144Hz. That being said, I'd love to know how G-Sync then improves 3D quality.

It would be fabulous to actually hear something from NVidia about how 3D actually works and fits in with G-Sync. (Surround too for that matter.)

3D Vision Surround | Driver 359.00 | Windows 7
GTX 980 SLI | i7 3770K @ 4.2 GHz | 16 GB RAM
3x ASUS VG248QE w/ G-SYNC

#29
Posted 01/14/2014 06:48 AM   