[quote name='cybereality' date='04 December 2011 - 12:22 PM' timestamp='1323030125' post='1337548']
Also, this question is asked every single day. Please take a second to read the forum, or search for 2 seconds and the answers will be revealed.
[/quote]
I figured it was, but I spent 10 minutes looking through the FAQ and did some searches with the words hertz, TV, and a few others, and came up with nothing. That was enough, and I got my answer this way. All the technical aspects aside, I can play 3D games without having to switch the hertz on my monitor with TriDef, so I'll probably use that.
Thanks all for the help.
[quote name='cybereality' date='04 December 2011 - 08:22 PM' timestamp='1323030125' post='1337548']
Wow! There's a ton of misinformation going on here.
First off, when people refer to 1080p60 in 3D, it means two full-resolution 1080p images (one left, one right) formatted in HDMI 1.4a frame packing (which is similar to over/under with some blank space in between). This does not refer to "legacy" formats like interlaced, checkerboard, side-by-side, etc. These are generally called "frame-compatible" formats and result in a reduction of resolution. I know a lot of people here seem to think these are full resolution, but they are not. For example, using interlaced 1080p will result in an actual resolution of 1920x540 per eye, clearly not full HD. The only way to get full-resolution 1080p is either with Nvidia's proprietary format over dual-link DVI, or with the HDMI 1.4a standard, though that is limited to 24Hz.
In addition, it is not the HDMI cable that limits this. The actual limitation is the HDMI chipsets that process the signal. The current ones do not provide enough bandwidth for a full-resolution 1080p60 signal. There are new versions of these chips that will support 1080p60, but they are not used in any HDTVs yet (maybe next year, I don't know). However, the HDMI 1.4a spec already allows for this, although it is an optional format. So this will be supported at some point, just not today.
Also, this question is asked every single day. Please take a second to read the forum, or search for two seconds and the answers will be revealed.
[/quote]
I was pretty sure checkerboard was 1920x540 (interlaced). People swear it's almost as good as 1080p, though I never understood why. Genuinely curious.
Can't wait for the new HDMI spec. I know that The Hobbit will be filmed in that spec.
Co-founder of helixmod.blog.com
If you like one of my Helix Mod patches and want to donate, you can send it to me through PayPal: eqzitara@yahoo.com
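For reference, the per-eye numbers behind this exchange, as a minimal Python sketch (the format list follows cybereality's post above; counting checkerboard as 1920x540 is a simplification, since it keeps the same number of pixels per eye but in a different pattern):
[code]
# Per-eye pixel counts for common HDMI 3D formats, assuming a 1920x1080 source.
FORMATS = {
    "frame packing (HDMI 1.4a)": (1920, 1080),  # two full 1080p images
    "side-by-side (half)":       (960, 1080),   # horizontal resolution halved
    "top/bottom (half)":         (1920, 540),   # vertical resolution halved
    "row interleaved":           (1920, 540),   # every other line per eye
    "checkerboard":              (1920, 540),   # same count, different sampling
}

FULL_HD = 1920 * 1080

for name, (w, h) in FORMATS.items():
    px = w * h
    print(f"{name:27s} {w}x{h} per eye = {px:>9,} px ({px / FULL_HD:.0%} of full HD)")
[/code]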
[quote name='eqzitara' date='04 December 2011 - 12:39 PM' timestamp='1323027545' post='1337530']
I was under the impression as well that there was 50% degradation.[/quote]
I was referring to human visual perception, not pixel count. That's why I was careful to word my comment as I did.
[quote]Isn't it essentially 1080i in 3D format?[/quote]
That analogy is accurate only in terms of pixel count. 3D interleaved is more like 1080i, but it doesn't look as good as CB (more visual aliasing) even though they have the same pixel count.
The reason you have the wrong impression is a few people who have an interest in trivializing their TV's inability to do native-resolution 3D gaming. These people have never seen CB mode, but to defend their buying decision they "close the gap" between their TV and a CB-capable TV with the standard half-truth trick. People who don't know any better sometimes buy into this. If 720p were "just as good" as CB, you wouldn't have the massive interest in CB shown in threads such as this one:
http://forums.nvidia.com/index.php?showtopic=200925
You also would not see 422,000-plus views in the "CB for all" thread.
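A small NumPy sketch of the two half-resolution sampling patterns being compared here (illustrative assumptions only): both keep exactly half the pixels per eye, but checkerboard spreads its samples across every row, which is the usual explanation for why it shows less aliasing than row interleave at the same pixel count:
[code]
import numpy as np

H, W = 1080, 1920
rows = np.arange(H)[:, None]
cols = np.arange(W)[None, :]

# Left-eye sampling masks for the two half-resolution patterns.
row_interleave = np.broadcast_to(rows % 2 == 0, (H, W))  # even lines only
checkerboard = (rows + cols) % 2 == 0                    # alternating squares

for name, mask in [("row interleave", row_interleave),
                   ("checkerboard", checkerboard)]:
    per_row = mask.sum(axis=1)
    print(f"{name:15s}: {mask.sum():,} px/eye ({mask.mean():.0%}); "
          f"samples per row: min {per_row.min():4d}, max {per_row.max():4d}")
[/code]
Row interleave leaves entire lines with zero samples for one eye, while checkerboard never does, on the same 50% pixel budget.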
The HDMI situation is a tad ridiculous, though. The HDMI standard (at least 1.3 and upwards) is enough to handle 1080p@120Hz, but the manufacturers of the chips didn't think anything higher than 1080p@60Hz should be required. Either it's stupid, or it's yet another annoying way of doing business...
[quote name='roller11' date='04 December 2011 - 08:30 PM' timestamp='1323027005' post='1337526']
Just for clarification, HDMI 1.4 has sufficient raw bandwidth for 1920x1080 @ 120Hz. The reason we can't get frame packing or frame-sequential at 2x 1920x1080 @ 60 frames per second is that the VESA committee doesn't spec it.[/quote]
Not true.
HDMI 1.4 is identical to single-link DVI.
[quote name='Likay' date='04 December 2011 - 09:58 PM' timestamp='1323032337' post='1337574']
The HDMI situation is a tad ridiculous, though. The HDMI standard (at least 1.3 and upwards) is enough to handle 1080p@120Hz, but the manufacturers of the chips didn't think anything higher than 1080p@60Hz should be required. Either it's stupid, or it's yet another annoying way of doing business...
[/quote]
Can you or cybereality please provide a source for that claim?
It just doesn't make any sense. From an electrical point of view, HDMI has always been the same as DVI-D (i.e., digital single-link DVI); that's why a simple adapter works between them.
The bandwidth of single-link DVI is barely enough to transfer 1920x1200 at 60Hz in 2D (2,304,000 pixels per frame); to achieve even that, you already have to use reduced blanking intervals. If you want higher resolutions, you have to use a dual-link DVI connection, which is practically two parallel DVI connections. It is simply impossible to transfer 1920x1080x2 for 3D at 60Hz (4,147,200 pixels per frame) over a single-link DVI connection or over HDMI.
So either you use different cables (HDMI cables and connectors only support one link, while DVI connectors have had two links from the beginning, even though single-link DVI cables don't wire them up), or you create a completely new electrical specification that can carry twice the data over a single link. But that would be a completely new specification, with completely new hardware on both sides of the connection.
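Grestorn's frame counts translate into link rates as follows; a back-of-the-envelope Python sketch, assuming the standard published timing totals (not figures from the thread) and the 165 MHz single-link TMDS ceiling:
[code]
SINGLE_LINK_MAX_MHZ = 165.0  # TMDS pixel-clock ceiling for SL-DVI / early HDMI

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (including blanking) x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

MODES = {
    # name: (h_total, v_total, Hz) -- totals include blanking intervals
    "1920x1200@60 2D (reduced blanking)": (2080, 1235, 60),
    "1920x1080@60 2D (CEA-861)":          (2200, 1125, 60),
    "1080p frame-packed 3D @24":          (2750, 2250, 24),
    "1080p frame-packed 3D @60":          (2200, 2250, 60),
}

for name, timing in MODES.items():
    clk = pixel_clock_mhz(*timing)
    verdict = "fits" if clk <= SINGLE_LINK_MAX_MHZ else "exceeds"
    print(f"{name:36s} {clk:6.1f} MHz -> {verdict} a single link")
[/code]
Note that frame-packed 1080p at 24Hz lands on the same 148.5 MHz pixel clock as ordinary 2D 1080p60, which is why existing hardware handles it, while frame packing at 60Hz needs roughly double the single-link ceiling.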
[quote name='D-Man11' date='04 December 2011 - 10:21 PM' timestamp='1323033697' post='1337583']
Use TriDef, it will give you 1920x1080p@60x1 and your TV will do the rest, vs. 1920x1080@24x2.
The new Acer 27-inch passive monitor, and I believe the 23-inch model as well, will be supported at 1920x1080p@60x1.
http://3dvision-blog.com/acer-hr274h-is-a-27-passive-3d-monitor-with-3d-vision-support/
[/quote]
Not true. Don't be misled!
You have 3D Vision with the Asus passive 3D monitor, but the resolution will still be halved in 3D. I guess they're using an interleaved 1080p image, which is perfectly fine, since the monitor itself uses passive polarization, i.e., the resolution it displays is halved anyway in 3D mode.
TriDef also can't magically overcome the limitations of HDMI.
[quote name='Grestorn' date='05 December 2011 - 02:10 AM' timestamp='1323069059' post='1337749']
Can you or cybereality please provide a source for that claim?
It just doesn't make any sense. From an electrical point of view, HDMI has always been the same as DVI-D (i.e., digital single-link DVI); that's why a simple adapter works between them.
The bandwidth of single-link DVI is barely enough to transfer 1920x1200 at 60Hz in 2D (2,304,000 pixels per frame); to achieve even that, you already have to use reduced blanking intervals. If you want higher resolutions, you have to use a dual-link DVI connection, which is practically two parallel DVI connections. It is simply impossible to transfer 1920x1080x2 for 3D at 60Hz (4,147,200 pixels per frame) over a single-link DVI connection or over HDMI.
So either you use different cables (HDMI cables and connectors only support one link, while DVI connectors have had two links from the beginning, even though single-link DVI cables don't wire them up), or you create a completely new electrical specification that can carry twice the data over a single link. But that would be a completely new specification, with completely new hardware on both sides of the connection.
[/quote]
I've also read HDMI is already capable of more; it's just being artificially limited right now so the HDMI SIG can create a new spec down the road and force everyone to replace their expensive HT hardware again. From everything I've seen, the signaling components in the hardware/TVs are the bottleneck right now, not the cable itself or the actual interconnect.
You can see here that HDMI 1.3 is already capable of 10.2 Gbps, which already exceeds DL-DVI and is enough for 2x1080p@60Hz stereo 3D. The expectation is that the HDMI money-printing cartel will again expand the spec for 2x1080p@60Hz and/or 4K x 2K Quad Full HD. http://www.hdmi.org/learningcenter/faq.aspx
"Q. What's new in the HDMI 1.3 Specification?
•Higher speed: Although all previous versions of HDMI have had more than enough bandwidth to support all current HDTV formats, including full, uncompressed 1080p signals, [b]HDMI 1.3 increases its single-link bandwidth to 340 MHz (10.2 Gbps)[/b] to support the demands of future HD display devices, such as higher resolutions, Deep Color and high frame rates. In addition, [b]built into the HDMI 1.3 specification is the technical foundation that will let future versions of HDMI reach significantly higher speeds[/b]."
Basically, the HDMI board is always going to take a [i]de minimis[/i] approach to the spec so they can keep selling new hardware down the road.
As for DVI, I actually think the single-link bandwidth limitation is on the DVI side only. If there were actually a standards group pushing it, I think they could update the spec to support DL-DVI bandwidth over a single DVI link and 4x DVI bandwidth over two physical DVI links. Maybe slightly higher-quality cabling would be needed, but mainly updated signaling components on the hardware side. But DVI has an uncertain future because it has no upgrade path and hardware makers are flocking to HDMI. DVI's only saving grace right now is that no one other than Dell and ATI bothers with DisplayPort, along with my suspicion that the HDMI board loves dragging its feet on new specs to ensure more sales down the road. Once HDMI bandwidth officially exceeds DL-DVI (1.4a maybe?), we may finally see the beginning of the end of DVI in the PC space.
-=HeliX=- Mod 3DV Game Fixes
My 3D Vision Games List Ratings
Intel Core i7 5930K @4.5GHz | Gigabyte X99 Gaming 5 | Win10 x64 Pro | Corsair H105
Nvidia GeForce Titan X SLI Hybrid | ROG Swift PG278Q 144Hz + 3D Vision/G-Sync | 32GB Adata DDR4 2666
Intel Samsung 950Pro SSD | Samsung EVO 4x1 RAID 0
Yamaha VX-677 A/V Receiver | Polk Audio RM6880 7.1 | LG Blu-Ray
Auzen X-Fi HT HD | Logitech G710/G502/G27 | Corsair Air 540 | EVGA P2-1200W
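A rough check of the 10.2 Gbps figure quoted above, as a Python sketch (the TMDS encoding parameters and the frame-packing timing are standard assumptions, not taken from the thread):
[code]
TMDS_CHANNELS = 3   # R, G, B data channels
BITS_PER_CHAR = 10  # TMDS encodes each 8-bit byte as a 10-bit character

def link_gbps(tmds_clock_mhz):
    # Raw link rate: clock x 3 channels x 10 bits per character
    return tmds_clock_mhz * TMDS_CHANNELS * BITS_PER_CHAR / 1e3

HDMI13_CLOCK_MHZ = 340.0                 # per the HDMI 1.3 FAQ quoted above
FP_1080P60_MHZ = 2200 * 2250 * 60 / 1e6  # frame-packed 1080p60 pixel clock

print(f"HDMI 1.3 ceiling      : {HDMI13_CLOCK_MHZ:.0f} MHz = "
      f"{link_gbps(HDMI13_CLOCK_MHZ):.1f} Gbps")
print(f"1080p60 frame packing : {FP_1080P60_MHZ:.0f} MHz = "
      f"{link_gbps(FP_1080P60_MHZ):.2f} Gbps")
verdict = "fits" if FP_1080P60_MHZ <= HDMI13_CLOCK_MHZ else "exceeds"
print(f"-> the 3D mode {verdict} the 1.3 link budget; per the posts above,")
print("   the practical bottleneck is the deployed receiver silicon.")
[/code]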
TriDef can do exactly what Nvidia is offering for the Acer; they have been for some time. For that very reason, I used TriDef more than Nvidia when I had a passive TV. It's a way better option than what 3DTV Play offered. Nvidia is just shortchanging the LG passive owners and should support them to the extent of their capabilities. Their best option is TriDef; LG is not going to pay for the support that Acer purchased. I personally recommend TriDef to anyone using a passive display. Of course, this will not be the case with owners of the Acer displays; I will tell them that Nvidia gives them all the support they need.
BTW, unless you've gamed on one, you cannot say how terrible they are. They are not that bad: what the passive TVs give up in a little lost resolution, they make up for with vibrant colors. Nevertheless, the LG had too much lag, clear ghosting in interleave mode, and the color bug in checkerboard. Not a great display. It has been suggested that the Toshiba passive does not experience the lag or the ghosting like the LG.
Also, LG uses an algorithm that further improves the illusion of full 1080p via post-processing of the input signal: http://www.flatpanelshd.com/news.php?subaction=showfull&id=1313938337
Myself, I recommend using a DLP projector. No lag, a HUGE screen, more immersive, and friends and family can easily enjoy it; the only shortfall is that 3D gaming is limited to 720p on a single projector. But 3D movies kick booty on screens from 90 inches and up.
@Grestorn: There was a discussion at mtbs3d about this but I can't find the thread for now. :confused:
A user named Dimitryko had it all straight: it's a situation where the chip manufacturers are making chips which do not comply with the original standards.
Why this is allowed at all goes over my head.
[quote name='Grestorn' date='05 December 2011 - 02:19 AM' timestamp='1323069570' post='1337752']
Not true. Don't be misled!
You have 3D Vision with the Asus passive 3D monitor, but the resolution will still be halved in 3D. I guess they're using an interleaved 1080p image, which is perfectly fine, since the monitor itself uses passive polarization, i.e., the resolution it displays is halved anyway in 3D mode.
TriDef also can't magically overcome the limitations of HDMI.
[/quote]
Sigh...
The resolution perceived by your brain is not halved. It is 1080p with some interlacing artifacts. It is not as good as real 1080p @ 60Hz, of course, but it looks very good if done right (TriDef). It is definitely not half resolution, despite many people's inability to grasp the concept.
The discussion is mainly about the HDMI limits. As many have said, the HDMI connection is not capable of anything more than a practical resolution/rate of 1080p@60Hz. If you want full frames, it is 1080L+1080R@30Hz (yet pulled down to 24Hz...). TV sets with their own shutters can accept 1080 interleaved (checkerboard, horizontal, vertical, etc., meaning half-resolution frames in one direction), half-SBS and more at 60, 30, or 24Hz, and still output 120Hz (60Hz per eye). There's no magic in this, since the TV processes the signal before it's presented.
What looks best, though, seems to differ from viewer to viewer.
Mb: Asus P5W DH Deluxe
Cpu: C2D E6600
Gb: Nvidia 7900GT + 8800GTX
3D:100" passive projector polarized setup + 22" IZ3D
Stereodrivers: Iz3d & Tridef ignition and nvidia old school.
The new Acer 27-inch passive monitor, and I believe the 23-inch model as well, will be supported at 1920x1080p@60x1.
http://3dvision-blog.com/acer-hr274h-is-a-27-passive-3d-monitor-with-3d-vision-support/
To quote Bloody:
Bloody // Nov 30, 2011 at 21:04
I still haven't seen these new monitors as they were just announced, but the interesting thing is that although they support HDMI 1.4, products like these two monitors that are "Optimized for GeForce" will not be limited to the 1080p 24Hz 3D mode only. When used for gaming in stereo 3D mode, the driver licensed from Nvidia will be outputting row-interleaved 1080p at 60Hz. The HDMI 1.4 limitations for stereo 3D will only apply when using consumer electronics devices…
[quote]Not true.
HDMI 1.4 is identical to single-link DVI.[/quote]
As usual, you demonstrate your appalling ignorance of all things video.