Will 3D Vision work together with G-Sync technology?
[b]Hi[/b], I salute you all from Romania!
I'm looking to upgrade my BenQ XL2410T monitor to something with 1440p resolution, or even 4K, with G-Sync technology.
Will G-Sync be compatible with Nvidia's 3D Vision?
Thank you!
Yep, at present Nvidia have said that G-Sync and 3D Vision cannot be run at the same time. Both will work on a specific monitor like the ROG, but not at the same time.
If they did both work at once, I'd buy the VG248QE today.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
Thank you guys for the answer.
Maybe they will fix that, or is G-Sync only available over DisplayPort, which cannot go above 60Hz?
So would I still be able to run Nvidia 3D, but ONLY over the DVI port, and without G-Sync enabled?
I'm intending to buy this monitor:
http://www.anandtech.com/show/8085/asus-rog-swift-pg278q-launched-1440p-144hz-panel-with-gsync
The big problem is that 3D Vision works by rapidly toggling the screen between the left and right eye images, while the whole point of G-SYNC is to stop updating the screen when nothing is changing. They're pretty much fundamentally incompatible at a technological level.
[quote="aplattner"]The big problem is that 3D Vision works by rapidly toggling the screen between the left and right eye images, while the whole point of G-SYNC is to stop updating the screen when nothing is changing. They're pretty much fundamentally incompatible at a technological level.[/quote]
I've heard this before, but I'm still not sure this is true.
I've personally used 3D Vision glasses at 100Hz, when I ran CRTs a long while back. That means that the glasses frequency is not fundamentally fixed at 120Hz.
The only real question then is whether the frequency can be dynamic or not. Again, I really don't see any reason why not. If the emitter gets a USB signal to swap eyes, is it a requirement that it be at some fixed frequency?
My thought is that if the glasses just respond to the emitter, and the emitter just responds to the driver, then I don't see why we couldn't get G-Sync from say 100-144Hz, and still maintain a decent refresh on the glasses.
The only reason I can see for limiting it is the fear of flicker, but not everyone is susceptible to shutter glass flicker, and I used Elsa Revelators at 60Hz (30 per eye) a long time ago, and had no problems. In my case, I can probably accept a 60Hz-144Hz G-Sync range with 3D Vision.
I would love to see more flexibility here instead of assuming we know what is best for everyone. Please allow (maybe opt-in only) us to experiment with G-Sync and 3D Vision.
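The per-eye arithmetic in the post above can be sketched numerically (purely an illustration of the reasoning, not of any actual Nvidia driver behavior): with frame-sequential shutter glasses, each eye sees half the panel's refresh rate, so a variable G-Sync rate would map directly to a variable per-eye rate.

```python
# Illustration of the per-eye refresh arithmetic discussed above.
# With active shutter glasses, the panel alternates left/right images,
# so each eye sees half the panel refresh rate.

def per_eye_hz(panel_hz: float) -> float:
    """Per-eye refresh rate for frame-sequential 3D at a given panel rate."""
    return panel_hz / 2.0

# Fixed-rate examples mentioned in this thread:
print(per_eye_hz(120))  # 3D Vision at 120Hz -> 60.0 per eye
print(per_eye_hz(100))  # CRT at 100Hz       -> 50.0 per eye
print(per_eye_hz(60))   # Elsa Revelators    -> 30.0 per eye

# Hypothetical variable-rate (G-Sync style) range: the per-eye rate
# would swing with the panel rate instead of staying fixed.
for hz in (60, 100, 144):
    print(f"panel {hz}Hz -> {per_eye_hz(hz)}Hz per eye")
```

This is just the flicker trade-off being argued: a 100-144Hz G-Sync range would keep each eye between 50Hz and 72Hz, while allowing the range down to 60Hz drops an eye to 30Hz, which some users tolerate and others don't.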
GAMING™:Ivy i7 3770K/4600Mhz/1,18V||Zalman Reserator Moded Watercooling GPU+CPU||Asus TITAN & Nvidia 3D Vision 2||Asrock OC Formula||16 Gb Gskill@2400Mhz CL9||2xSamsung 830 S-Ata 3 SSD+8TB||Creative X-Fi Titanium Fatal1ty Champion||Corsair TX850W||BenQ XL2410T 3D LED 120Hz||Antec 1200 case, all on Win 7 & Win 8 Licences.
Aaron Plattner
NVIDIA Linux Graphics
- Windows 7 64bits (SSD OCZ-Vertez2 128Gb)
- "ASUS P6X58D-E" motherboard
- "MSI GTX 660 TI"
- "Intel Xeon X5670" @4000MHz CPU (20.0[12-25]x200MHz)
- RAM 16 Gb DDR3 1600
- "Dell S2716DG" monitor (2560x1440 @144Hz)
- "Corsair Carbide 600C" case
- Labrador dog (cinnamon edition)