[quote]The beauty of 1080p checkerboard is that it actually looks much better than you would suppose. It's hard to tell the difference between 1080p checkerboard and true 1080p 3D. The human brain can handle the checkerboard pattern very well and adds the missing information per view.[/quote]
-can't you show it somehow instead?
Checkerboard was developed with diamond-shaped pixels in mind; I think square pixels must suck with checkerboard. And no upscaling anomalies, huh? The whole checkerboard process is upscaling, man. No amount of wishful thinking changes the fact that it's half res plus upscaling.
[quote name='tritosine' date='23 January 2011 - 01:44 PM' timestamp='1295786666' post='1182133']
-can't you show it somehow instead?
Checkerboard was developed with diamond-shaped pixels in mind; I think square pixels must suck with checkerboard. And no upscaling anomalies, huh? The whole checkerboard process is upscaling, man. No amount of wishful thinking changes the fact that it's half res plus upscaling.
Cybereality, can't you keep your FUD elsewhere?
[/quote]
Checkerboard has been considered superior to [b]upscaled 720p[/b] numerous times already. The quincunx sampling of checkerboard downsampling just works great; that's what people who actually have the hardware in their hands will tell you. Of course it is not as good as the real native thing, but it's really efficient at hiding the resolution loss.
No matter how much you love your 720p projector, you can't deny these facts:
-1080p is the benchmark for TVs and the inevitable next step for projectors, because it is the reference 2D resolution of Blu-ray. People have 1080p displays and will keep using them, even with the incomplete HDMI 1.4 implementation.
-Checkerboard 1080p contains more resolution per eye than full-resolution 720p (see the sketch below)
-Upscaling by an integer factor (x2 for checkerboard) is known to always give better quality than a non-integer factor (x1.5 for 720p->1080p)
Checkerboard is the best thing we can have at the moment with these HDMI 1.4 TVs, which don't support stereo 1080p60 input.
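To put numbers on the per-eye claim, here is a quick sketch (plain NumPy; the function name is mine, not any official implementation). It interleaves two views in the quincunx pattern and counts the samples each eye keeps: checkerboard 1080p retains 1,036,800 samples per eye versus 921,600 for full 720p.
[code]
import numpy as np

def checkerboard_pack(left, right):
    """Interleave two equal-size views into one quincunx-sampled frame.

    Even (row + column) positions keep left-eye pixels, odd positions
    keep right-eye pixels; the per-row alternation is what gives the
    diamond (quincunx) sampling grid."""
    h, w = left.shape
    rows = np.arange(h)[:, None]
    cols = np.arange(w)[None, :]
    left_mask = (rows + cols) % 2 == 0
    return np.where(left_mask, left, right)

left = np.random.rand(1080, 1920)
right = np.random.rand(1080, 1920)
packed = checkerboard_pack(left, right)

samples_per_eye = (1080 * 1920) // 2   # 1,036,800 samples per eye
full_720p = 720 * 1280                 # 921,600 samples per eye
print(samples_per_eye, full_720p)      # checkerboard 1080p keeps ~12% more
[/code]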
Passive 3D forever
110" DIY dual-projection system
2x Epson EH-TW3500 (1080p) + Linear Polarizers (SPAR)
XtremScreen Daylight 2.0
VNS Geobox501 signal converter
It's reasonable to ask the question: "why is 1280x720 so bad? DLP front projectors use this resolution and achieve very good image quality." It all comes down to what Nobsi and Blackshark are saying. 720p per se isn't the problem; it's that it's being non-integer scaled to 1920x1080. 720p on a 720p panel isn't bad at all, but 720p on a 1080p panel looks muddy and cloudy. If you 2x upscaled to 2560x1440, you would in effect have 1-to-1 pixel mapping. Using non-integral scaling, however, results in errors; it's a mathematical certainty.
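To make that concrete, here is a tiny sketch (my own illustration, nearest-neighbour scaling for simplicity): at an integer factor every source line covers the same number of output lines, while at 1.5x the coverage alternates between one and two lines, which is exactly the unevenness that smarter filters then have to blur over.
[code]
from collections import Counter

def coverage(out_pixels, scale):
    """Count how many output pixels each source pixel covers when
    upscaling by `scale` with nearest-neighbour mapping."""
    return Counter(int(i / scale) for i in range(out_pixels))

# 720 lines -> 1440 lines (2x): every source line covers exactly 2 lines.
print(set(coverage(1440, 2.0).values()))   # {2}  - uniform, clean
# 720 lines -> 1080 lines (1.5x): lines alternate 1- and 2-wide.
print(set(coverage(1080, 1.5).values()))   # {1, 2} - uneven line widths
[/code]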
Seems crazy to me that the TV makers are spending all this money on "apps" while they won't put in a $2.00 dual-link DVI input so that full-res/full-framerate 3D apps can run. The only hope I have of this changing is that the console makers will start pushing 3D gaming, in which case they will want to claim "full 1920x1080/60p resolution". This will require the needed addition of dual link, and we PC gamers will finally have what we want. Consoles are very popular and therefore have a lot of clout.
[quote name='BlackSharkfr' date='23 January 2011 - 12:15 PM' timestamp='1295802943' post='1182233']
Checkerboard is the best thing we can have at the moment with these HDMI 1.4 TVs, which don't support stereo 1080p60 input.
[/quote]
But only Samsung HDMI 1.4 TVs support checkerboard... If everyone supported it, I'd understand the outrage here. But since it's just one manufacturer (who could also take it away with their 2011 line of TVs), it's kind of a tough sell to Nvidia.
Andrew stated that they will support the 2011 Mitsubishi DLPs, so I guess there is hope that the same fix may work for Samsungs in checkerboard. The new Mitsubishis are going to be HDMI 1.4, so...
I think checkerboard is DLP-exclusive tech, and Sammy was such a smartass they put it onto flatscreens.
This whole situation is similar to the early-'90s CD -> HDCD stuff, when 16-bit 44.1kHz was suddenly inadequate once high-resolution AD converters started to appear.
HDCD's answer was to try a new variation on pulse code modulation.
That situation doesn't really hold here, however, because most PC games are console builds and hence don't have the equivalent of a "high resolution" source, and we don't have the means to play with video signals.
[quote name='disolitude' date='23 January 2011 - 10:56 AM' timestamp='1295805401' post='1182263']
But only Samsung HDMI 1.4 TVs support checkerboard... If everyone supported it, I'd understand the outrage here. But since it's just one manufacturer (who could also take it away with their 2011 line of TVs), it's kind of a tough sell to Nvidia.[/quote]
Just one manufacturer, but Samsung is the biggest: two million HDMI 1.4 HDTVs sold worldwide and counting. That's potentially two million people screwed by Nvidia's incomprehensible design blunder. Tough sell? Only if there were a downside to allowing superior image quality. This isn't a "to get A you must give up B" deal; it amounts to a bug fix, and Nvidia should not mind fixing a bug that severely affects so many people, especially when the fix is so easy to implement.
[quote]Andrew stated that they will support the 2011 Mitsubishi DLPs, so I guess there is hope that the same fix may work for Samsungs in checkerboard. The new Mitsubishis are going to be HDMI 1.4, so...
[/quote]
Hmmm... I wonder how Andrew reconciles that? He won't allow support for any mode that is not automatically selected, yet HDMI 1.4 automatically selects 720p frame packing. Does that mean the 2011 Mitsubishis will be forced to 720p, as is the case now?
If Samsung stops doing checkerboard on their 2011 TVs, then I'll be keeping my 2010 set till it wears out. I wonder when we'll see the 2011 user manuals on the Samsung site?
[quote name='roller11' date='23 January 2011 - 06:41 PM' timestamp='1295804479' post='1182254']
Seems crazy to me that the TV makers are spending all this money on "apps" while they won't put in a $2.00 dual-link DVI input so that full-res/full-framerate 3D apps can run. The only hope I have of this changing is that the console makers will start pushing 3D gaming, in which case they will want to claim "full 1920x1080/60p resolution". This will require the needed addition of dual link, and we PC gamers will finally have what we want. Consoles are very popular and therefore have a lot of clout.
[/quote]
It's not that simple.
Dual-link DVI and 1080p120 alone don't make 3D plug-and-play, as required by living-room consumer electronics.
TV manufacturers want to control the sync themselves. This means they want a proper way to mark when a stream is 3D, with the left/right stereo-pair information carried within the video stream itself, and they want it without submarine patent issues.
HDMI 1.4 provides this, but the current chips can only do it at 1080p24, as required by the minimum specification.
TV manufacturers have already clearly stated they do not want to use Nvidia 3D-Vision-like 120Hz transmission.
My crystal ball says TV manufacturers will wait until new HDMI 1.4 chips with full bandwidth are available rather than go back to dual-link DVI. If it takes too long, they will maybe (and I really put a big "maybe" here) add DisplayPort on some limited models in order to put pressure on HDMI chip manufacturers (or make HDMI licensing cheaper), but the primary method will remain HDMI 1.4. DisplayPort just doesn't have any traction among TV manufacturers at the moment, and that is unlikely to change quickly enough to provide products in time; and the entire home-cinema hardware ecosystem that already uses HDMI would have to be replaced, which puts even more pressure against the change.
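For reference, the bandwidth arithmetic behind that 1080p24 limit (my own back-of-the-envelope numbers; the raster totals are the standard CEA-861-style timings for HDMI frame packing): frame packing stacks both eyes into one tall frame, so 1080p24 3D needs the same 148.5MHz pixel clock as 2D 1080p60, while 1080p60 3D would need 297MHz, beyond what 2010-era receiver chips handle.
[code]
# Rough pixel-clock arithmetic for HDMI 1.4 frame packing.
def pixel_clock_mhz(h_total, v_total, fps):
    """Pixel clock implied by a full raster (active + blanking) at fps."""
    return h_total * v_total * fps / 1e6

# 1080p24 frame packed: both eyes stacked, total raster 2750 x 2250.
print(pixel_clock_mhz(2750, 2250, 24))   # 148.5 MHz - fine for current chips
# 1080p60 frame packed: total raster 2200 x 2250.
print(pixel_clock_mhz(2200, 2250, 60))   # 297.0 MHz - out of their reach
[/code]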
The significance of both 3D PC and 1080p is way overblown.
Sure, we would all like to see 3D PC as the fourth platform, but that's not the case, and IMHO it's not like it's going to be any more relevant in 2013 either. :))
[quote name='BlackSharkfr' date='23 January 2011 - 12:28 PM' timestamp='1295810888' post='1182315']
It's not that simple.
Dual-link DVI and 1080p120 alone don't make 3D plug-and-play, as required by living-room consumer electronics.
TV manufacturers want to control the sync themselves. This means they want a proper way to mark when a stream is 3D, with the left/right stereo-pair information carried within the video stream itself, and they want it without submarine patent issues.[/quote]
This is being done on cheap computer monitors, so it can be done on HDTVs cheaply and easily with the same dual-link DVI.
[quote]TV manufacturers have already clearly stated they do not want to use Nvidia 3D-Vision-like 120Hz transmission.[/quote]
This only means they can use dual-link DVI; they just lack the will.
[quote]My crystal ball says TV manufacturers will wait until new HDMI 1.4 chips with full bandwidth are available rather than go back to dual-link DVI.[/quote]
Dual-link HDMI (type A) is already in the specification. All that's needed is to provide a second pixel channel, using the three pins that are already there. No need to make the chips faster.
[quote]If it takes too long, they will maybe (and I really put a big "maybe" here) add DisplayPort on some limited models in order to put pressure on HDMI chip manufacturers (or make HDMI licensing cheaper), but the primary method will remain HDMI 1.4. DisplayPort just doesn't have any traction among TV manufacturers at the moment, and that is unlikely to change quickly enough to provide products in time; and the entire home-cinema hardware ecosystem that already uses HDMI would have to be replaced, which puts even more pressure against the change.[/quote]
No one's suggesting doing away with HDMI 1.4, just adding another dual-link input.
There's no technical hump to overcome here; the TV guys just need a reason to add a dual-link input. They lack that reason because everything works fine as is... well, except for 3D gaming, which apparently isn't a problem.
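On the bandwidth side, roller11 has a point about dual link: a quick check (my own arithmetic; the 120Hz raster totals are assumed, reduced-blanking style) shows why 1080p at 120Hz doesn't fit single-link DVI but fits comfortably in dual link.
[code]
SINGLE_LINK_MAX_MHZ = 165.0   # DVI single-link TMDS pixel-clock limit

def pixel_clock_mhz(h_total, v_total, fps):
    """Pixel clock implied by a full raster (active + blanking) at fps."""
    return h_total * v_total * fps / 1e6

# 1920x1080 @ 120 Hz with tight (reduced) blanking, roughly 2080 x 1111 total:
clk = pixel_clock_mhz(2080, 1111, 120)
print(round(clk, 1))                     # ~277.3 MHz
print(clk <= SINGLE_LINK_MAX_MHZ)        # False - single link can't do it
print(clk <= 2 * SINGLE_LINK_MAX_MHZ)    # True  - dual link can
[/code]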
[quote name='roller11' date='23 January 2011 - 10:04 PM' timestamp='1295816659' post='1182369']
This is being done on cheap computer monitors, so it can be done on HDTVs cheaply and easily with the same dual-link DVI.
This only means they can use dual-link DVI; they just lack the will.
Dual-link HDMI (type A) is already in the specification. All that's needed is to provide a second pixel channel, using the three pins that are already there. No need to make the chips faster. No one's suggesting doing away with HDMI 1.4, just adding another dual-link input.
There's no technical hump to overcome here; the TV guys just need a reason to add a dual-link input. They lack that reason because everything works fine as is... well, except for 3D gaming, which apparently isn't a problem.
[/quote]
The use of dual-link DVI for 3D Vision isn't enough to provide 3D transmission in a standard way. You also need a separate way to activate/deactivate the 3D mode and transmit the eye sync; on a TV that uses its own emitter, that means a second cable from the computer to the TV just for the sync. They don't want this, for political reasons.
I know there are unused pairs HDMI keeps for future features, but I have never heard of dual-link HDMI with a type A connector. I do know about the dual-link type B connector: a total failure which nobody used and which was ditched in favor of HDMI 1.3, which provides the same bandwidth as dual link through the same type A connector (HDMI 1.4 abandoned type B definitively).
But basically, since nobody uses dual-link HDMI chips, what you are suggesting is to create completely new HDMI chips with a completely new 3D transmission that goes against the HDMI policy and specification, just to avoid using the real official standard the industry agreed on.
It's not a lack of good will from the manufacturers: you are asking them to do exactly what they want to avoid.
[quote name='BlackSharkfr' date='23 January 2011 - 02:51 PM' timestamp='1295823091' post='1182423']
The use of dual-link DVI for 3D Vision isn't enough to provide 3D transmission in a standard way. You also need a separate way to activate/deactivate the 3D mode and transmit the eye sync; on a TV that uses its own emitter, that means a second cable from the computer to the TV just for the sync. They don't want this, for political reasons.
I know there are unused pairs HDMI keeps for future features, but I have never heard of dual-link HDMI with a type A connector. I do know about the dual-link type B connector: a total failure which nobody used and which was ditched in favor of HDMI 1.3, which provides the same bandwidth as dual link through the same type A connector (HDMI 1.4 abandoned type B definitively).
But basically, since nobody uses dual-link HDMI chips, what you are suggesting is to create completely new HDMI chips with a completely new 3D transmission that goes against the HDMI policy and specification, just to avoid using the real official standard the industry agreed on.
It's not a lack of good will from the manufacturers: you are asking them to do exactly what they want to avoid.
[/quote]
You're getting all fancy in your talking, but the simple fact is that far cheaper 120Hz monitors have more functionality in this respect than very high-end big-screen TVs. It's not hard to do, nor would it violate some sort of policy... why would adding a dual-link DVI input violate a policy? Big-screen TVs could operate just as 3D-Vision-ready branded monitors do today... if they had a 120Hz-mode dual-link input of some sort. It's really unfortunate that they don't seem to care about this scenario at all and are leaving it to the PC market. So gamers like ourselves are left unable to use the full potential of our gaming rigs.
[quote name='BlackSharkfr' date='23 January 2011 - 03:51 PM' timestamp='1295823091' post='1182423']
The use of dual-link DVI for 3D Vision isn't enough to provide 3D transmission in a standard way. You also need a separate way to activate/deactivate the 3D mode and transmit the eye sync; on a TV that uses its own emitter, that means a second cable from the computer to the TV just for the sync.[/quote]
How so? A separate cable isn't required now; the TV knows when V-blanking occurs, so it knows when to trigger the glasses. The sync signal is, in effect, embedded in V-blank.
[quote]What you are suggesting is to create completely new HDMI chips with a completely new 3D transmission that goes against the HDMI policy and specification, just to avoid using the real official standard the industry agreed on.[/quote]
So dual link is in the HDMI spec, and at the same time it goes against HDMI policy, and HDMI policy is to not have two channels of RGB data? Two channels that are already in the pinout, unused?
[quote]It's not a lack of good will from the manufacturers: you are asking them to do exactly what they want to avoid.[/quote]
Why would the HDMI people want to avoid implementing dual channel in their own standard?
Anyway, what's stopping the TV makers from implementing 3D the same way the monitor guys are, and on the same dual-link DVI connection? Or adding DisplayPort 1.2 in addition to HDMI 1.4?
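A toy illustration of the V-blank point (the loop and names are mine, purely illustrative, not any vendor's code): in frame-sequential 3D the eye assignment is simply the parity of the vertical-blank count, so once the 3D mode and eye order are agreed, no side channel is needed for sync.
[code]
def eye_for_frame(vblank_count, left_first=True):
    """Which eye the shutter glasses should open for this frame,
    derived purely from the running vertical-blank counter."""
    is_even = vblank_count % 2 == 0
    return "left" if is_even == left_first else "right"

for v in range(6):
    print(v, eye_for_frame(v))   # 0 left, 1 right, 2 left, ...
[/code]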
[quote name='roller11' date='23 January 2011 - 02:14 PM' timestamp='1295810094' post='1182305']
Hmmm... I wonder how Andrew reconciles that? He won't allow support for any mode that is not automatically selected, yet HDMI 1.4 automatically selects 720p frame packing. Does that mean the 2011 Mitsubishis will be forced to 720p, as is the case now?
[/quote]
My guess is that they will use the Mitsubishi HDMI EDID to recognize it's a Mitsubishi DLP and allow for checkerboard. They've done this in the past: a Mitsubishi DLP owner gets his TV automatically recognized by the 3D Vision software, while a Samsung DLP owner has to select the "Generic DLP" option.
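For what it's worth, the vendor check is straightforward: the first two bytes after the EDID header pack a three-letter PNP manufacturer ID. Here is a minimal decoder (the byte layout is the standard EDID one; the sample bytes below are Samsung's commonly reported code, so treat the specific IDs as an assumption and dump your own EDID to confirm).
[code]
def manufacturer_id(edid: bytes) -> str:
    """Decode the 3-letter PNP manufacturer ID from EDID bytes 8-9
    (big-endian word, three 5-bit letters, 1 = 'A')."""
    word = (edid[8] << 8) | edid[9]
    return "".join(
        chr(((word >> shift) & 0x1F) + ord("A") - 1)
        for shift in (10, 5, 0)
    )

# Hypothetical 128-byte EDID with 0x4C2D at bytes 8-9:
sample = bytes(8) + bytes([0x4C, 0x2D]) + bytes(118)
print(manufacturer_id(sample))   # SAM
[/code]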
I don't really care; I just want some proof, and you presented none.
That only depends on how much processing you throw at it; it's scaled either way.
DLP can be easily modified with dual DVI; not sure about TVs.