Locked was probably a poor choice of phrasing, because I know you can change the frequency when setting up 3D Vision. The point was that they operate at "fixed" frequencies; I'd be surprised if they could sync at variable rates. I suppose it depends on whether the emitter signal just tells the glasses to operate at (X)Hz and keeps them in sync, or whether it instructs every single shutter operation.
Then there's the question of how a variable refresh rate would appear when viewing stereoscopically.
The simplest way to design something like this is to have it simply respond to each sync signal as it arrives. The pyramid emitter also supports the VESA 3-pin stereo sync from other hardware, so I'm pretty sure it just sends a pulse whenever it receives one.
Similarly, I'd be super surprised if the glasses did anything except toggle the lenses R/L at every sync signal. Putting in timing circuits and trying to keep them running in exact sync with a monitor would be a poor design.
If you hold down the power switch, you can see that the lenses can lock at a specific flip and stop toggling. That would effectively be 0Hz.
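If it really is that dumb, the whole design fits in a few lines. Here's a toy model of the pulse-driven toggle I'm describing (purely illustrative Python, not anything from actual firmware):

```python
class ShutterGlasses:
    """Model of glasses that do nothing but toggle on each sync pulse.
    No internal clock: the emitter's pulse train *is* the timing,
    so variable refresh rates work with no extra logic."""

    def __init__(self):
        self.open_eye = "L"  # which lens is currently open

    def on_sync_pulse(self):
        # Flip lenses; note there is no frequency assumption anywhere.
        self.open_eye = "R" if self.open_eye == "L" else "L"

glasses = ShutterGlasses()
# Irregular pulse arrival times in ms, like a GSync-style variable rate.
pulse_times_ms = [0.0, 8.3, 16.6, 28.0, 33.1]
seen = []
for _ in pulse_times_ms:
    glasses.on_sync_pulse()
    seen.append(glasses.open_eye)
print(seen)  # alternates R/L no matter how the pulses are spaced
```

The point of the sketch: because nothing in it measures time, arbitrary sync rates fall out for free, which is why I'd expect the glasses themselves to be GSync-compatible.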
So my best guess is that the glasses can already handle arbitrary sync rates, and my reading indicates they're ready for GSync. The only constraint I see is whatever minimum frequency NVidia decides to allow. 100Hz (50 per eye) would not be a problem. I'd personally like to see it go as low as 85Hz (42.5 per eye), to really take advantage of GSync.
As far as variable refresh in the glasses goes, I really doubt we would be able to tell, because GSync enforces a strict minimum. If that minimum isn't noticeable or distracting, and the GSync-delayed frames still arrive at least as fast as the minimum, then how would you notice?
Some people can notice flickering at 85Hz; I'm not one of them. I couldn't see any problems at all.
So my take is that GSync would be great for 3D, especially given the frame-rate hit we take for high-quality 3D. GSync would help in games where you cannot sustain 120Hz as the minimum bar; any drop below 120Hz causes stuttering. With GSync, those temporary glitches would simply be delayed until the frame is ready, and then shown.
Other people don't agree with my understanding of how GSync works, but we'll see.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
There's a mention of LightBoost for the ROG Swift on this site. Maybe they'll announce it's 3D-capable later on, since 3D is a bad word right now at CES.
http://www.blurbusters.com/asus-pg278q-rog-monitor-1440p-120hz-gsync/
[quote="bo3b"]Similarly, I'd be super surprised if the glasses did anything except simply toggle the lenses R/L at every sync signal.[/quote]
With DLP-Link glasses, if you cover the light sensor on the glasses with your hand, robbing them of the sync signal, they stay in sync for one to three seconds. After that they just stop shuttering, as they determine there's no signal. This would seem to indicate that the sync signal is indeed there to ensure sync, and not simply a command to toggle the right and left lenses.
On the other hand, this might be unique to DLP-Link, and/or just an emergency procedure for when it's obvious to the glasses that the signal is lost.
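To make the distinction concrete, the behavior I'm seeing would look something like this free-run-with-timeout model (a Python sketch; the grace period and everything else here are my guesses, not from any DLP-Link spec):

```python
class DlpLinkGlasses:
    """Guessed model: keep shuttering at the last-seen period for a short
    grace window after the flash disappears, then stop."""

    GRACE_S = 2.0  # seconds of free-running after the last pulse (a guess)

    def __init__(self):
        self.last_pulse = None
        self.period = None       # last measured pulse spacing, in seconds
        self.shuttering = False

    def on_pulse(self, t):
        if self.last_pulse is not None:
            # Remember the spacing so the glasses could keep toggling
            # at this rate while the signal is gone.
            self.period = t - self.last_pulse
        self.last_pulse = t
        self.shuttering = True

    def is_shuttering(self, t):
        # Called continuously: stop once the grace window expires.
        if self.last_pulse is None or t - self.last_pulse > self.GRACE_S:
            self.shuttering = False
        return self.shuttering

glasses = DlpLinkGlasses()
glasses.on_pulse(0.0)
glasses.on_pulse(1 / 120)          # two flashes establish the period
print(glasses.is_shuttering(1.0))  # 1s after signal loss: still going
print(glasses.is_shuttering(3.0))  # past the grace window: stopped
```

A pure toggle-on-pulse design would stop shuttering the instant the pulses stop, which is exactly the difference between this and what bo3b describes.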
[quote="Airion"][quote="bo3b"]Similarly, I'd be super surprised if the glasses did anything except simply toggle the lenses R/L at every sync signal.[/quote]With DLP-link glasses, if you cover the light sensor on the glasses with your hand, robbing it of the sync signal, they stay in sync for one to three seconds. After that just stop shuttering as they determine you don't have a signal. This would seem to indicate that the sync signal is indeed to ensure sync, and not simply a command to toggle right and left lenses.
On the other hand, this might be unique to DLP-link, and/or just an emergency procedure when it's obvious to the glasses that the signal is lost.[/quote]Interesting that the DLP-Link glasses continue to run.
For 3D Vision 1 glasses, once sync is lost they stop shuttering instantly: cover the infrared input on the glasses and they drop out right away. The same effect is seen with 3D Vision 2.
In any case, I'm keenly interested in how GSync might work with 3D.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
[quote="Pirateguybrush"]Mine (3DV2) don't drop out right away, it takes about a second for it to die.[/quote]I was seeing that too, and sometimes I couldn't seem to get it to drop at all. Turn them away from the emitter, then cover the center with your finger. In my case it drops out immediately.
Could be some sort of try-to-stay-in-sync mechanism, but I would still submit it's a poor design: as it drifts out of sync you'll see more ghosting. DLP-Link might be doing this to avoid sync loss from people walking in front of the glasses, since the sync signal comes from the wall/screen.
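A quick back-of-envelope on how fast free-running drifts, assuming a typical ±100 ppm oscillator tolerance (my assumption, not from any datasheet):

```python
# Worst-case drift of a free-running shutter clock against a 120Hz signal.
tolerance = 100e-6             # +/-100 ppm crystal, an assumed figure
eye_frame_ms = 1000 / 120      # one shutter interval at 120Hz, ~8.33 ms
free_run_s = 3.0               # how long the glasses coast without sync
drift_ms = free_run_s * 1000 * tolerance   # accumulated timing error
fraction = drift_ms / eye_frame_ms
print(f"{drift_ms:.2f} ms drift, {fraction:.1%} of an eye-frame")
```

So a couple of seconds of coasting only costs a few percent of a frame in the worst case, which would explain why DLP-Link can get away with a short grace period; the real ghosting would show up if it kept free-running much longer than that.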
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
[quote="bo3b"]Could be some sort of try to stay sync mechanism, but I would still submit it's a poor design. As it drifts off sync you'll see more ghosting. DLP-Link might be doing this to avoid sync loss for people walking in front of the glasses, since sync signal comes from wall/screen.[/quote]
What you're saying about design makes a lot of sense. And it seems like this should be a simple question: how exactly do active shutter glasses work? Unfortunately, I've found that Google searches can't answer it.
I will say that DLP-Link sync is quite robust with quality glasses. They will detect the DLP-Link flash even if they're not directly aligned with the screen. If they lose sync because someone is walking in front of the screen, it's because that person is blocking almost all of the image; in other words, there's nothing to see even if the glasses were in sync.
Intel 5960x, Asus RVE, 16 Gb Ram
Gtx 980
Samsung 850 pro 1TB
Win 10 64