[quote name='Zloth' date='03 October 2010 - 02:04 PM' timestamp='1286143454' post='1125895']
Hmmm, is that even possible? If this is set up so the monitor tells the video card that the frame is ready, then the video card tells the USB port, then the port tells the IR box, then the IR box tells the glasses, what would you tweak? They are all going as fast as they can already. Or is this set up with some sort of common clock so the IR box and monitor know they are supposed to switch every 1/120 of a second starting at a certain time?
[/quote]
It's not that simple; NVidia already adjusts the timing on a per-monitor basis for "optimal" configuration. It's the primary reason they only support monitors with known EDIDs. It's probably more like: frame ready, a DPC call in the driver starts a timer, the DPC call in response to the timer sends the signal to the USB port, and so on. In fact it's possible that the code running in response to the timer signal is in user mode.
The issue is that even kernel-level drivers depend on being invoked by the OS in response to hardware signals, and that invocation time does vary, although in a well-behaved system it should be on the order of 20-30µs, which is irrelevant in this context.
The problem is that badly behaved drivers can take excessive time inside a DPC call, and this can delay the invocation of other drivers. Once that delay gets into the 200-300µs range it's likely very visible in 3D Vision. Those badly written drivers could be anything in the system, not just the video driver: chipset drivers, network drivers, sound cards, USB, etc.
Poor hardware can do the same, adding to latency, but I'd be surprised if the issues were really hardware related rather than driver related.
In terms of adding timing control, it would still depend on the latency in the system being constant, but that is probably true.
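To make those jitter numbers concrete, here's a minimal user-mode sketch (plain C, Win32) that arms a waitable timer for one 120Hz shutter interval and measures how late each wake-up actually arrives. It's only an analogy for what the driver's kernel-side DPC/timer path does - the real signalling happens below user mode - and every name in it is illustrative, nothing is taken from NVidia's driver.
[code]
/* Sketch: measure how late 1/120 s timer wake-ups arrive in user mode.
 * Illustrative only; link winmm.lib for timeBeginPeriod. */
#include <windows.h>
#include <stdio.h>
#pragma comment(lib, "winmm.lib")

int main(void)
{
    LARGE_INTEGER freq, t0, t1;
    QueryPerformanceFrequency(&freq);

    /* One shutter interval at 120 Hz, in 100 ns units; a negative value
     * means "relative due time" per the SetWaitableTimer convention. */
    LARGE_INTEGER due;
    due.QuadPart = -(10000000LL / 120);            /* ~8333 us */

    timeBeginPeriod(1);                            /* 1 ms timer granularity */
    HANDLE timer = CreateWaitableTimer(NULL, TRUE, NULL);
    double worst_us = 0.0;

    for (int i = 0; i < 1200; i++) {               /* ~10 s of samples */
        QueryPerformanceCounter(&t0);
        SetWaitableTimer(timer, &due, 0, NULL, NULL, FALSE);
        WaitForSingleObject(timer, INFINITE);      /* "timer fires, wake up" */
        QueryPerformanceCounter(&t1);

        double elapsed_us = (double)(t1.QuadPart - t0.QuadPart)
                            * 1e6 / (double)freq.QuadPart;
        double late_us = elapsed_us - 8333.3;      /* lateness vs. ideal */
        if (late_us > worst_us) worst_us = late_us;
    }
    printf("worst wake-up lateness over 10 s: %.0f us\n", worst_us);

    CloseHandle(timer);
    timeEndPeriod(1);
    return 0;
}
[/code]
On a quiet machine the worst lateness stays small and stable; if something in the system is hogging DPC time, spikes into the hundreds of microseconds show up here, which is exactly the range where shutter timing drift would become visible as ghosting.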
All nVidia need to do is a) give a damn, and b) give us some sync controls. They must read these threads; I don't get it. Do they not want people to enjoy 3D Vision? Do they want it to fail?
It'd be really easy to add a final stage to the 3D Vision setup process where you can fine tune sync for your system. I'm sure fine tuning sync would give the majority of 3D Vision users a better experience, and for some it would make a huge difference.
Well I asked for help from nvidia with this ghosting and this was part of the reply I got:
"Sorry to hear about the issue you are currently having with our 3D Vision, but I could use a little more information as I haven't seen any similar issues reported:"
Now if they haven't heard about the ghosting issues they need to pull their heads out of the sand.
What they should do is issue a beta driver with some really basic sync controls, just operated via hotkeys, so we can all see for ourselves whether it would make a difference.
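Just to illustrate how little UI that would need, here's a hypothetical sketch of the hotkey half: a plain Win32 message loop where Ctrl+Alt+Plus / Ctrl+Alt+Minus nudge a sync-offset value by 50µs per press. The offset variable is purely made up - there's no public NVidia API to feed it into, so the real adjustment would have to happen inside their driver - but it shows the user-facing part is trivial.
[code]
/* Hypothetical hotkey control for a glasses sync offset. The offset itself
 * goes nowhere -- a real version would re-arm the driver's shutter timer. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    long offset_us = 0;   /* imaginary glasses-switch delay, in microseconds */

    /* ids 1/2 are arbitrary; VK_OEM_PLUS / VK_OEM_MINUS are the +/- keys */
    RegisterHotKey(NULL, 1, MOD_CONTROL | MOD_ALT, VK_OEM_PLUS);
    RegisterHotKey(NULL, 2, MOD_CONTROL | MOD_ALT, VK_OEM_MINUS);

    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0) > 0) {
        if (msg.message == WM_HOTKEY) {
            offset_us += (msg.wParam == 1) ? 50 : -50;
            printf("sync offset: %+ld us\n", offset_us);
            /* a real driver would apply offset_us to its shutter timing here */
        }
    }
    return 0;
}
[/code]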
Gosh... I couldn't agree more. But look up my posts and you will get an idea of how old the plea for duty/delay controls actually is - they didn't react back then and I doubt they ever will. I rather believe they're taking the 3DTV Play route and passing the ball to the TV manufacturers.
It's a crying shame. All the limitations in the software, like the refresh-rate limitation in CRT mode, the mandatory EDID check etc., are completely arbitrary, and even more annoying since they were never present in the original Metabyte drivers. This is still the same core code after all those years! It might well be that these limitations were initially intended to maintain control over the quality of experience, but at this point they are rather counterproductive.
Having said this, these findings are [i]very[/i] interesting to say the least, since they prove that much can be done to improve the experience....
But what has a CRT got to do with a motherboard?