Will future resolutions be improved for 3D HDTV support?
Hello, I'm new to this forum and I've looked through it but didn't see what I was looking for. Apparently software is about to be released that will let you play 3D games from your PC while hooked up to a new HDMI 1.4 3D HDTV. I noticed that this setup seems limited in which resolutions you can choose: if you want to play at 1080p, you're limited to 24Hz (which I assume means 24fps to each eye), and if you want 50 or 60fps per eye you have to drop to 720p. I think that sucks! I'm used to playing PC games on my 1080p HDTV, and dropping to 720p makes games much less visually appealing.

I'm wondering what the limitation here is. If it's a software limitation, does NVIDIA plan on updating their software down the road to support 1080p at 60Hz in 3D mode while interfacing with a 3D HDTV? I have a GTX 480, and I realize that even with this card in SLI the hardware may not be able to sustain a good framerate in 3D at 1920x1080 with demanding games, but that's not the point. Surely future hardware will be able to handle it, so again I'm wondering whether this limitation comes from the 3D TVs, the software, or the graphics card hardware. Does anybody know?
It's an HDMI 1.4 limitation on paper, but the actual technical limitation is unclear. The wire itself (the HDMI 1.4 cable) is certainly capable of 120Hz, as there are some fringe examples of this, and SL-DVI is also capable of it. My guess is that they are limiting it to 60Hz for backward compatibility with components already in the channel, or because parts that can handle 120Hz signaling in HDTVs, AVRs, and Blu-ray players can't yet be obtained in mass quantities.

So right now, HDMI is limited to SL-DVI bandwidth, which is roughly 4-5Gbps (depending on whether you count 8b/10b encoding):

1920x1080 x 60Hz x 32bpp = ~4Gbps — current HDMI maps pin-for-pin to SL-DVI, so both support it without issue.
1920x1080 x 120Hz x 32bpp = ~8Gbps — requires DL-DVI for 120Hz.

Similarly, another DL-DVI-only format:

2560x1600 x 60Hz x 32bpp = ~8Gbps — requires DL-DVI; HDMI can't pass this format either. So again, it's probably an artificial limitation imposed by the HDMI standards board, because HDMI can handle 4Kx2K at 40-bit color at 24Hz, which is roughly the same bandwidth as 2560x1600@60Hz.

But HDMI claims it can handle up to 10Gbps over its current pin arrangement, so it should be able to handle 1080p @ 120Hz, yet it can't. I imagine once 4Kx2K panels become more mainstream and the increase in resolution forces them to move the spec forward, we'll see a revision that supports higher resolutions in 3D. It is plainly obvious, however, that the HDMI standards board serves its members, so they're going to keep changing the spec in marginal increments to get the end consumer to constantly upgrade their HT components. It's just a business decision; we may not agree with it, but that's the reality of it.
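If anyone wants to check the arithmetic above, here's a minimal sketch of the raw-bandwidth estimate (pixels x refresh x bits per pixel). It ignores blanking intervals and TMDS encoding overhead, so treat the output as the same ballpark figures quoted above, not exact link rates:

[code]
# Rough uncompressed video bandwidth: width x height x refresh x bits-per-pixel.
# Blanking intervals and 8b/10b overhead are ignored; ballpark figures only.

def raw_gbps(width, height, hz, bpp=32):
    return width * height * hz * bpp / 1e9

modes = [
    ("1920x1080 @ 60Hz",  1920, 1080, 60),   # fits SL-DVI / current HDMI
    ("1920x1080 @ 120Hz", 1920, 1080, 120),  # needs DL-DVI
    ("2560x1600 @ 60Hz",  2560, 1600, 60),   # needs DL-DVI
]

for name, w, h, hz in modes:
    print(f"{name}: ~{raw_gbps(w, h, hz):.1f} Gbps")

# 1920x1080 @ 60Hz:  ~4.0 Gbps
# 1920x1080 @ 120Hz: ~8.0 Gbps
# 2560x1600 @ 60Hz:  ~7.9 Gbps
[/code]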
Apparently it has to do with the HDMI receiver chips in the TVs themselves. They need to run at 297MHz to be able to accept 1080p @ 60 per eye... but none of them do. Most are still at 225MHz, which is what HDMI 1.3 TVs used.

So in theory the cable can support it... but the TVs don't have the horsepower to show it.
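That 297MHz figure is easy to sanity-check, since HDMI frame packing stacks the two eye views vertically, which roughly doubles the line count. Assuming the standard 1080p timing (2200 total pixels per line, 1125 total lines — my assumption, not something from this thread), a quick sketch:

[code]
# Sanity check of the 297MHz pixel clock needed for frame-packed 1080p60 3D.
# Assumes standard 1080p timing: 2200 total px/line, 1125 total lines.
H_TOTAL = 2200
V_TOTAL = 1125

clock_2d_60 = H_TOTAL * V_TOTAL * 60        # plain 1080p60
clock_3d_60 = H_TOTAL * (V_TOTAL * 2) * 60  # frame packing doubles the lines

print(clock_2d_60 / 1e6)  # 148.5 MHz -- comfortably within a 225MHz receiver
print(clock_3d_60 / 1e6)  # 297.0 MHz -- what 1080p @ 60 per eye would require
[/code]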
[quote name='chiz' post='1111660' date='Sep 1 2010, 06:36 PM']It's an HDMI 1.4 limitation on paper, but the actual technical limitation is unclear. The wire itself (the HDMI 1.4 cable) is certainly capable of 120Hz, as there are some fringe examples of this, and SL-DVI is also capable of it. [...][/quote]
That is awesome data. I've heard that 3D Blu-ray players require an HDMI cable rated for at least 10.2Gbps; is that true? What would the actual bandwidth requirement for 3D Blu-ray be, and does it equate to 1920 x 1080 x 24Hz x 24-bit color? (I'm trying to make sure we're on the same page.) Also, do you have a website I can reference for the bandwidth figures you stated above?

I work at Best Buy and hate selling Monster cables because I believe they're bullcrap, but I would like to prove it to my fellow associates, who think they know what they're talking about. You feel Monster is a scam, right? Obviously the really high ratings like 17.8Gbps shouldn't matter, but Monster also advertises an "open eye" signal for their cables, which supposedly makes them better, and I'm not sure that matters either. The signal the TV processes is simply a stream of 0s and 1s, correct? So what good does an "open eye" signal do, unless there are cheap HDMI cables that could cause a TV to perceive a 1 where it should have been a 0, or something? Does anything Monster does with its cables actually matter? I could be way off; if I am, please educate me.

I would also love to know the exact bandwidth requirement for standard 2D Blu-ray. I'm loving your previous response; it's some of the best data I've seen on these bandwidth figures. If you don't feel like explaining everything, I'd be more than happy with a link to somewhere that already has it spelled out.
[quote name='lotusvibe' post='1111692' date='Sep 1 2010, 07:49 PM']That is awesome data. I've heard that 3D Blu-ray players require an HDMI cable rated for at least 10.2Gbps; is that true? What would the actual bandwidth requirement for 3D Blu-ray be? [...][/quote]
Haha, well, there are certainly plenty of sources that would agree with you about Monster cables being a scam. While it's true that Monster cables are generally high quality, the dispute is usually over whether they're worth the price or required for functionality, and the answer is typically no. There are real benefits to better-made cables: heavier gauge reduces signal attenuation (important for long runs), and better interconnect quality means less chance of wear or artifacts from faulty contacts or wires. It's similar to the benefits of good speaker wire.

I wouldn't worry about requiring specific amounts of bandwidth. Keep in mind that all these bandwidth figures are just what a cable is "rated" for; a cable that isn't rated for that bandwidth may pass the signal just fine. For example, HDMI 1.3a cables are physically identical to HDMI 1.4 cables, so if you already have perfectly good, high-quality HDMI 1.3a cables, there's no need to rush out and buy HDMI 1.4(a) cables for 3D Blu-ray, even if Monster tells you the 1.3a cable isn't rated for 3D.
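As a rough sanity check on your 10.2Gbps question (my own back-of-the-envelope numbers, not an official figure): 3D Blu-ray is frame-packed 1080p24, so its raw payload is nowhere near a cable's maximum rating. A quick sketch, again ignoring blanking and encoding overhead:

[code]
# Rough payload of Blu-ray formats vs. the 10.2Gbps maximum HDMI rate.
# Ignores blanking and TMDS encoding overhead; ballpark only.

def raw_gbps(width, height, hz, bpp):
    return width * height * hz * bpp / 1e9

bluray_2d = raw_gbps(1920, 1080, 24, 24)      # 1080p24, 24-bit color
bluray_3d = raw_gbps(1920, 1080 * 2, 24, 24)  # frame-packed: two eye views per frame

print(f"2D Blu-ray: ~{bluray_2d:.2f} Gbps")   # ~1.19 Gbps
print(f"3D Blu-ray: ~{bluray_3d:.2f} Gbps")   # ~2.39 Gbps -- far below 10.2 Gbps
[/code]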
But yeah, unless there are actual defects in the cable, you shouldn't see any difference, since the signal is just a digital stream of 1s and 0s as you stated. A bad cable can result in pixel popping or off colors, though, so buying decent-quality cables is always a good idea. I generally go with and recommend Monoprice.com for good-quality, reliable, affordable cables. There you actually get what you pay for in cable quality and gauge, and you don't pay extra for a compression-molded logo on the connector.

As for references, I just look at the spec sheets (Wikipedia is a good start; for something like this it's pretty reliable because it's backed by white papers and spec sheets) and compare them to what's available in the actual marketplace. When a spec sheet or white paper says one thing but nothing in the marketplace takes advantage of it or demonstrates it's possible, that's when I start asking why. As you can see going from HDMI 1.3a to 1.4a, there shouldn't be any technical limitation preventing 1080p @ 120Hz in the wire and the spec itself, unless something else in the pipeline is holding things up. Disolitude's explanation makes sense if it's the signaling/DSP hardware in the HDTVs themselves that can't handle the bandwidth requirements, and I imagine that will change, though probably not just for 3D (more likely for 4Kx2K).

Sorry, I haven't seen any comprehensive guide. Most of this is just me fiddling with the different available resolutions in 2D and 3D, at 60Hz and 120Hz, and consistently running into the same bandwidth limits across the different formats (SL-DVI, HDMI, DL-DVI) compared to the advertised maximum bandwidth of HDMI, DisplayPort, etc.
[quote name='lotusvibe' post='1111692' date='Sep 1 2010, 07:49 PM']That is awesome data. i was wondering i've heard that 3d bluray players require an HDMI cable with atleast 10.2gbps, is that true? what would the gbps requirement be for 3d bluray and does that equate to 1920 x 1080 x 24hz x 24 bit color (i'm trying to make sure we're on the same page) Also do you have a website that i can reference for these gbps requirements that you have stated above. I work at best buy and hate selling monster cables cause i believe they are bullcrap but i would like to prove it to my fellow associates because they think they know what they are talking about. You feel monster is a scam right? obviously the really high gbps like 17.8 gbps shouldn't matter but Monster advertises an open eye signal for their cables which also makes them "better" but i'm not sure that matters. the signal the tv would process would simply be a stream of 0s and 1s correct? so what good does an "open eye signal" do unless they're are cheap HDMI cables that could cause a tv to perceive a 1 where it should have been a 0 or something. Does anything monster does with it's cables matter? I could be waaaay off if i am please educate me. I would also love to know the exact gbps requirement for standard 2d bluray. Please i'm loving your previous response some of the best data i've ever seen in reference to gbps. Also if you don't feel like explaining everything i'd be more than happy to accept a link to somewhere that already has it spelled out.
Haha well there's certainly plenty of sources that would agree with you about Monster cables being a scam. While its certainly true Monster cables are generally high quality, the points of dispute typically come into play whether or not they're worth it or required for functionality. Typically that answer is no. But there are definitely benefits to better quality cables, as higher gauge reduces signal attenuation (important for longer run cables) and better interconnect quality means less chance for wear or artifacts resulting from faulty contacts or wires. This is similar to the benefits of speaker wire when it comes to wiring.
I wouldn't worry about requiring certain amounts of bandwidth, keep in mind, all these ratings for bandwidth are just what a cable is "rated" for, but a cable that can pass the signal that isn't rated for that bandwidth may work just fine. For example, HDMI 1.3a cables are exactly the same as HDMI 1.4 cables, so if you already had perfectly good, high quality, HDMI 1.3a cables there's no need to rush out and buy HDMI 1.4(a) cables for Blu-Ray even if Monster tells you the 1.3a cable isn't rated for 3D etc.
But ya unless there's actual defects in the cable, you shouldn't see any difference as the signal is just a digital stream of 1s and 0s as you stated. A bad cable can result in pixel popping or off-colors though, so buying decent quality cables is always a good idea. Generally I go with and recommend Monoprice.com for good quality, reliable and affordable cables. Over there, you actually get what you pay for in actual cable quality/gauge and don't pay for more than that in the form of a compression mold logo on the connector.
As for the different references, I just look at the spec sheets (Wikipedia is a good start, for something like this its pretty reliable because its backed by white paper/spec sheets) and compare them to what is available in the actual marketplace. When I see a specsheet or white paper say one thing, but nothing in the market place taking advantage of it or demonstrating its possible, that's when I start to ask why. As you can see with HDMI 1.3a to 1.4a, there shouldn't be any technical limitation for not being able to support 1080p @ 120Hz for the wire and spec itself unless there's something else in the pipeline holding things up. Disolitude's explanation makes sense if its the signaling/DSP on the HDTV's themselves not being able to handle the bandwidth requirements, and I imagine that will change but probably not for just 3D (4Kx2K most likely).
Sorry I haven't really seen any comprehensive guide, most of this is just me fiddling with different available resolutions in both 2D and 3D, 60Hz and 120Hz and consistently seeing the same various bandwidth limits from different formats like SL-DVI, HDMI and DL-DVI compared to what's advertised as max bandwidth for HDMI, DP etc.
[quote name='lotusvibe' post='1111692' date='Sep 1 2010, 07:49 PM']That is awesome data. i was wondering i've heard that 3d bluray players require an HDMI cable with atleast 10.2gbps, is that true? what would the gbps requirement be for 3d bluray and does that equate to 1920 x 1080 x 24hz x 24 bit color (i'm trying to make sure we're on the same page) Also do you have a website that i can reference for these gbps requirements that you have stated above. I work at best buy and hate selling monster cables cause i believe they are bullcrap but i would like to prove it to my fellow associates because they think they know what they are talking about. You feel monster is a scam right? obviously the really high gbps like 17.8 gbps shouldn't matter but Monster advertises an open eye signal for their cables which also makes them "better" but i'm not sure that matters. the signal the tv would process would simply be a stream of 0s and 1s correct? so what good does an "open eye signal" do unless they're are cheap HDMI cables that could cause a tv to perceive a 1 where it should have been a 0 or something. Does anything monster does with it's cables matter? I could be waaaay off if i am please educate me. I would also love to know the exact gbps requirement for standard 2d bluray. Please i'm loving your previous response some of the best data i've ever seen in reference to gbps. Also if you don't feel like explaining everything i'd be more than happy to accept a link to somewhere that already has it spelled out.[/quote]
Haha well there's certainly plenty of sources that would agree with you about Monster cables being a scam. While its certainly true Monster cables are generally high quality, the points of dispute typically come into play whether or not they're worth it or required for functionality. Typically that answer is no. But there are definitely benefits to better quality cables, as higher gauge reduces signal attenuation (important for longer run cables) and better interconnect quality means less chance for wear or artifacts resulting from faulty contacts or wires. This is similar to the benefits of speaker wire when it comes to wiring.
I wouldn't worry about requiring certain amounts of bandwidth, keep in mind, all these ratings for bandwidth are just what a cable is "rated" for, but a cable that can pass the signal that isn't rated for that bandwidth may work just fine. For example, HDMI 1.3a cables are exactly the same as HDMI 1.4 cables, so if you already had perfectly good, high quality, HDMI 1.3a cables there's no need to rush out and buy HDMI 1.4(a) cables for Blu-Ray even if Monster tells you the 1.3a cable isn't rated for 3D etc.
But ya unless there's actual defects in the cable, you shouldn't see any difference as the signal is just a digital stream of 1s and 0s as you stated. A bad cable can result in pixel popping or off-colors though, so buying decent quality cables is always a good idea. Generally I go with and recommend Monoprice.com for good quality, reliable and affordable cables. Over there, you actually get what you pay for in actual cable quality/gauge and don't pay for more than that in the form of a compression mold logo on the connector.
As for the different references, I just look at the spec sheets (Wikipedia is a good start, for something like this its pretty reliable because its backed by white paper/spec sheets) and compare them to what is available in the actual marketplace. When I see a specsheet or white paper say one thing, but nothing in the market place taking advantage of it or demonstrating its possible, that's when I start to ask why. As you can see with HDMI 1.3a to 1.4a, there shouldn't be any technical limitation for not being able to support 1080p @ 120Hz for the wire and spec itself unless there's something else in the pipeline holding things up. Disolitude's explanation makes sense if its the signaling/DSP on the HDTV's themselves not being able to handle the bandwidth requirements, and I imagine that will change but probably not for just 3D (4Kx2K most likely).
Sorry I haven't really seen any comprehensive guide, most of this is just me fiddling with different available resolutions in both 2D and 3D, 60Hz and 120Hz and consistently seeing the same various bandwidth limits from different formats like SL-DVI, HDMI and DL-DVI compared to what's advertised as max bandwidth for HDMI, DP etc.
[quote name='lotusvibe' post='1111692' date='Sep 1 2010, 07:49 PM']That is awesome data. i was wondering i've heard that 3d bluray players require an HDMI cable with atleast 10.2gbps, is that true? what would the gbps requirement be for 3d bluray and does that equate to 1920 x 1080 x 24hz x 24 bit color (i'm trying to make sure we're on the same page) Also do you have a website that i can reference for these gbps requirements that you have stated above. I work at best buy and hate selling monster cables cause i believe they are bullcrap but i would like to prove it to my fellow associates because they think they know what they are talking about. You feel monster is a scam right? obviously the really high gbps like 17.8 gbps shouldn't matter but Monster advertises an open eye signal for their cables which also makes them "better" but i'm not sure that matters. the signal the tv would process would simply be a stream of 0s and 1s correct? so what good does an "open eye signal" do unless they're are cheap HDMI cables that could cause a tv to perceive a 1 where it should have been a 0 or something. Does anything monster does with it's cables matter? I could be waaaay off if i am please educate me. I would also love to know the exact gbps requirement for standard 2d bluray. Please i'm loving your previous response some of the best data i've ever seen in reference to gbps. Also if you don't feel like explaining everything i'd be more than happy to accept a link to somewhere that already has it spelled out.
Haha well there's certainly plenty of sources that would agree with you about Monster cables being a scam. While its certainly true Monster cables are generally high quality, the points of dispute typically come into play whether or not they're worth it or required for functionality. Typically that answer is no. But there are definitely benefits to better quality cables, as higher gauge reduces signal attenuation (important for longer run cables) and better interconnect quality means less chance for wear or artifacts resulting from faulty contacts or wires. This is similar to the benefits of speaker wire when it comes to wiring.
I wouldn't worry about requiring certain amounts of bandwidth, keep in mind, all these ratings for bandwidth are just what a cable is "rated" for, but a cable that can pass the signal that isn't rated for that bandwidth may work just fine. For example, HDMI 1.3a cables are exactly the same as HDMI 1.4 cables, so if you already had perfectly good, high quality, HDMI 1.3a cables there's no need to rush out and buy HDMI 1.4(a) cables for Blu-Ray even if Monster tells you the 1.3a cable isn't rated for 3D etc.
But ya unless there's actual defects in the cable, you shouldn't see any difference as the signal is just a digital stream of 1s and 0s as you stated. A bad cable can result in pixel popping or off-colors though, so buying decent quality cables is always a good idea. Generally I go with and recommend Monoprice.com for good quality, reliable and affordable cables. Over there, you actually get what you pay for in actual cable quality/gauge and don't pay for more than that in the form of a compression mold logo on the connector.
As for the different references, I just look at the spec sheets (Wikipedia is a good start, for something like this its pretty reliable because its backed by white paper/spec sheets) and compare them to what is available in the actual marketplace. When I see a specsheet or white paper say one thing, but nothing in the market place taking advantage of it or demonstrating its possible, that's when I start to ask why. As you can see with HDMI 1.3a to 1.4a, there shouldn't be any technical limitation for not being able to support 1080p @ 120Hz for the wire and spec itself unless there's something else in the pipeline holding things up. Disolitude's explanation makes sense if its the signaling/DSP on the HDTV's themselves not being able to handle the bandwidth requirements, and I imagine that will change but probably not for just 3D (4Kx2K most likely).
Sorry I haven't really seen any comprehensive guide, most of this is just me fiddling with different available resolutions in both 2D and 3D, 60Hz and 120Hz and consistently seeing the same various bandwidth limits from different formats like SL-DVI, HDMI and DL-DVI compared to what's advertised as max bandwidth for HDMI, DP etc.
So right now, HDMI is limited to SL-DVI bandwidth which is 4-5Gbps (depending if you count 8b/10b encoding or not):
1920x1080x60x32 = ~4Gbps current HDMI maps pin for pin to SL-DVI so both support it without issue.
1920x1080x120x32 = ~8Gbps requires DL-DVI for 120Hz
similarly, another DL-DVI limitation:
2560x1600x60x32 = ~8Gbps requires DL-DVI, HDMI can't pass this format either. So again, probably an artificial limitation imposed by HDMI standards board because HDMI can handle 4Kx2K 40-bit color at 24Hz, which is roughly the same bandwidth as 2560x1600@60Hz.
But HDMI claims it can handle up to 10Gbps over its current pin alignment, so it should be able to handle 1080p @ 120Hz, but can't. I imagine once 4Kx2K panels becomes more mainstream and the increase in resolution requires them to forward the spec, we'll see a revision that supports higher resolutions in 3D. It is plainly obvious however that the HDMI standards board serves their members, so they're going to constantly change the spec with marginal improvements in an effort to get the end-consumer to constantly upgrade their HT components. Its just business decision, we may not agree with it but that's just the reality of it.
So right now, HDMI is limited to SL-DVI bandwidth which is 4-5Gbps (depending if you count 8b/10b encoding or not):
1920x1080x60x32 = ~4Gbps current HDMI maps pin for pin to SL-DVI so both support it without issue.
1920x1080x120x32 = ~8Gbps requires DL-DVI for 120Hz
similarly, another DL-DVI limitation:
2560x1600x60x32 = ~8Gbps requires DL-DVI, HDMI can't pass this format either. So again, probably an artificial limitation imposed by HDMI standards board because HDMI can handle 4Kx2K 40-bit color at 24Hz, which is roughly the same bandwidth as 2560x1600@60Hz.
But HDMI claims it can handle up to 10Gbps over its current pin alignment, so it should be able to handle 1080p @ 120Hz, but can't. I imagine once 4Kx2K panels becomes more mainstream and the increase in resolution requires them to forward the spec, we'll see a revision that supports higher resolutions in 3D. It is plainly obvious however that the HDMI standards board serves their members, so they're going to constantly change the spec with marginal improvements in an effort to get the end-consumer to constantly upgrade their HT components. Its just business decision, we may not agree with it but that's just the reality of it.
-=HeliX=- Mod 3DV Game Fixes
My 3D Vision Games List Ratings
Intel Core i7 5930K @4.5GHz | Gigabyte X99 Gaming 5 | Win10 x64 Pro | Corsair H105
Nvidia GeForce Titan X SLI Hybrid | ROG Swift PG278Q 144Hz + 3D Vision/G-Sync | 32GB Adata DDR4 2666
Intel Samsung 950Pro SSD | Samsung EVO 4x1 RAID 0 |
Yamaha VX-677 A/V Receiver | Polk Audio RM6880 7.1 | LG Blu-Ray
Auzen X-Fi HT HD | Logitech G710/G502/G27 | Corsair Air 540 | EVGA P2-1200W
So right now, HDMI is limited to SL-DVI bandwidth which is 4-5Gbps (depending if you count 8b/10b encoding or not):
1920x1080x60x32 = ~4Gbps current HDMI maps pin for pin to SL-DVI so both support it without issue.
1920x1080x120x32 = ~8Gbps requires DL-DVI for 120Hz
similarly, another DL-DVI limitation:
2560x1600x60x32 = ~8Gbps requires DL-DVI, HDMI can't pass this format either. So again, probably an artificial limitation imposed by HDMI standards board because HDMI can handle 4Kx2K 40-bit color at 24Hz, which is roughly the same bandwidth as 2560x1600@60Hz.
But HDMI claims it can handle up to 10Gbps over its current pin alignment, so it should be able to handle 1080p @ 120Hz, but can't. I imagine once 4Kx2K panels becomes more mainstream and the increase in resolution requires them to forward the spec, we'll see a revision that supports higher resolutions in 3D. It is plainly obvious however that the HDMI standards board serves their members, so they're going to constantly change the spec with marginal improvements in an effort to get the end-consumer to constantly upgrade their HT components. Its just business decision, we may not agree with it but that's just the reality of it.
So right now, HDMI is limited to SL-DVI bandwidth which is 4-5Gbps (depending if you count 8b/10b encoding or not):
1920x1080x60x32 = ~4Gbps current HDMI maps pin for pin to SL-DVI so both support it without issue.
1920x1080x120x32 = ~8Gbps requires DL-DVI for 120Hz
similarly, another DL-DVI limitation:
2560x1600x60x32 = ~8Gbps requires DL-DVI, HDMI can't pass this format either. So again, probably an artificial limitation imposed by HDMI standards board because HDMI can handle 4Kx2K 40-bit color at 24Hz, which is roughly the same bandwidth as 2560x1600@60Hz.
But HDMI claims it can handle up to 10Gbps over its current pin alignment, so it should be able to handle 1080p @ 120Hz, but can't. I imagine once 4Kx2K panels becomes more mainstream and the increase in resolution requires them to forward the spec, we'll see a revision that supports higher resolutions in 3D. It is plainly obvious however that the HDMI standards board serves their members, so they're going to constantly change the spec with marginal improvements in an effort to get the end-consumer to constantly upgrade their HT components. Its just business decision, we may not agree with it but that's just the reality of it.
-=HeliX=- Mod 3DV Game Fixes
My 3D Vision Games List Ratings
Intel Core i7 5930K @4.5GHz | Gigabyte X99 Gaming 5 | Win10 x64 Pro | Corsair H105
Nvidia GeForce Titan X SLI Hybrid | ROG Swift PG278Q 144Hz + 3D Vision/G-Sync | 32GB Adata DDR4 2666
Intel Samsung 950Pro SSD | Samsung EVO 4x1 RAID 0 |
Yamaha VX-677 A/V Receiver | Polk Audio RM6880 7.1 | LG Blu-Ray
Auzen X-Fi HT HD | Logitech G710/G502/G27 | Corsair Air 540 | EVGA P2-1200W
So in theory cable is able to support it...but TVs don't have the horsepower to show it.
So in theory cable is able to support it...but TVs don't have the horsepower to show it.
So in theory cable is able to support it...but TVs don't have the horsepower to show it.
So in theory cable is able to support it...but TVs don't have the horsepower to show it.
So right now, HDMI is limited to SL-DVI bandwidth which is 4-5Gbps (depending if you count 8b/10b encoding or not):
1920x1080x60x32 = ~4Gbps current HDMI maps pin for pin to SL-DVI so both support it without issue.
1920x1080x120x32 = ~8Gbps requires DL-DVI for 120Hz
similarly, another DL-DVI limitation:
2560x1600x60x32 = ~8Gbps requires DL-DVI, HDMI can't pass this format either. So again, probably an artificial limitation imposed by HDMI standards board because HDMI can handle 4Kx2K 40-bit color at 24Hz, which is roughly the same bandwidth as 2560x1600@60Hz.
But HDMI claims it can handle up to 10Gbps over its current pin alignment, so it should be able to handle 1080p @ 120Hz, but can't. I imagine once 4Kx2K panels becomes more mainstream and the increase in resolution requires them to forward the spec, we'll see a revision that supports higher resolutions in 3D. It is plainly obvious however that the HDMI standards board serves their members, so they're going to constantly change the spec with marginal improvements in an effort to get the end-consumer to constantly upgrade their HT components. Its just business decision, we may not agree with it but that's just the reality of it.[/quote]
That is awesome data. i was wondering i've heard that 3d bluray players require an HDMI cable with atleast 10.2gbps, is that true? what would the gbps requirement be for 3d bluray and does that equate to 1920 x 1080 x 24hz x 24 bit color (i'm trying to make sure we're on the same page) Also do you have a website that i can reference for these gbps requirements that you have stated above. I work at best buy and hate selling monster cables cause i believe they are bullcrap but i would like to prove it to my fellow associates because they think they know what they are talking about. You feel monster is a scam right? obviously the really high gbps like 17.8 gbps shouldn't matter but Monster advertises an open eye signal for their cables which also makes them "better" but i'm not sure that matters. the signal the tv would process would simply be a stream of 0s and 1s correct? so what good does an "open eye signal" do unless they're are cheap HDMI cables that could cause a tv to perceive a 1 where it should have been a 0 or something. Does anything monster does with it's cables matter? I could be waaaay off if i am please educate me. I would also love to know the exact gbps requirement for standard 2d bluray. Please i'm loving your previous response some of the best data i've ever seen in reference to gbps. Also if you don't feel like explaining everything i'd be more than happy to accept a link to somewhere that already has it spelled out.
So right now, HDMI is limited to SL-DVI bandwidth which is 4-5Gbps (depending if you count 8b/10b encoding or not):
1920x1080x60x32 = ~4Gbps current HDMI maps pin for pin to SL-DVI so both support it without issue.
1920x1080x120x32 = ~8Gbps requires DL-DVI for 120Hz
similarly, another DL-DVI limitation:
2560x1600x60x32 = ~8Gbps requires DL-DVI, HDMI can't pass this format either. So again, probably an artificial limitation imposed by HDMI standards board because HDMI can handle 4Kx2K 40-bit color at 24Hz, which is roughly the same bandwidth as 2560x1600@60Hz.
But HDMI claims it can handle up to 10Gbps over its current pin alignment, so it should be able to handle 1080p @ 120Hz, but can't. I imagine once 4Kx2K panels becomes more mainstream and the increase in resolution requires them to forward the spec, we'll see a revision that supports higher resolutions in 3D. It is plainly obvious however that the HDMI standards board serves their members, so they're going to constantly change the spec with marginal improvements in an effort to get the end-consumer to constantly upgrade their HT components. Its just business decision, we may not agree with it but that's just the reality of it.
That is awesome data. i was wondering i've heard that 3d bluray players require an HDMI cable with atleast 10.2gbps, is that true? what would the gbps requirement be for 3d bluray and does that equate to 1920 x 1080 x 24hz x 24 bit color (i'm trying to make sure we're on the same page) Also do you have a website that i can reference for these gbps requirements that you have stated above. I work at best buy and hate selling monster cables cause i believe they are bullcrap but i would like to prove it to my fellow associates because they think they know what they are talking about. You feel monster is a scam right? obviously the really high gbps like 17.8 gbps shouldn't matter but Monster advertises an open eye signal for their cables which also makes them "better" but i'm not sure that matters. the signal the tv would process would simply be a stream of 0s and 1s correct? so what good does an "open eye signal" do unless they're are cheap HDMI cables that could cause a tv to perceive a 1 where it should have been a 0 or something. Does anything monster does with it's cables matter? I could be waaaay off if i am please educate me. I would also love to know the exact gbps requirement for standard 2d bluray. Please i'm loving your previous response some of the best data i've ever seen in reference to gbps. Also if you don't feel like explaining everything i'd be more than happy to accept a link to somewhere that already has it spelled out.
So right now, HDMI is limited to SL-DVI bandwidth which is 4-5Gbps (depending if you count 8b/10b encoding or not):
1920x1080x60x32 = ~4Gbps current HDMI maps pin for pin to SL-DVI so both support it without issue.
1920x1080x120x32 = ~8Gbps requires DL-DVI for 120Hz
similarly, another DL-DVI limitation:
2560x1600x60x32 = ~8Gbps requires DL-DVI, HDMI can't pass this format either. So again, probably an artificial limitation imposed by HDMI standards board because HDMI can handle 4Kx2K 40-bit color at 24Hz, which is roughly the same bandwidth as 2560x1600@60Hz.
But HDMI claims it can handle up to 10Gbps over its current pin alignment, so it should be able to handle 1080p @ 120Hz, but can't. I imagine once 4Kx2K panels becomes more mainstream and the increase in resolution requires them to forward the spec, we'll see a revision that supports higher resolutions in 3D. It is plainly obvious however that the HDMI standards board serves their members, so they're going to constantly change the spec with marginal improvements in an effort to get the end-consumer to constantly upgrade their HT components. Its just business decision, we may not agree with it but that's just the reality of it.[/quote]
Haha, well, there are certainly plenty of sources that would agree with you about Monster cables being a scam. While it's true that Monster cables are generally high quality, the dispute is usually over whether they're worth the price or required for functionality, and the answer is typically no. But there are real benefits to better-quality cables: thicker conductors reduce signal attenuation (important for longer cable runs), and better interconnect quality means less chance of wear or artifacts from faulty contacts or wires. The same logic applies to speaker wire.
I wouldn't worry about hitting specific bandwidth numbers. Keep in mind that all these figures are just what a cable is "rated" for; a cable that isn't rated for a given bandwidth may pass the signal just fine. For example, HDMI 1.3a cables are exactly the same as HDMI 1.4 cables, so if you already have perfectly good, high-quality HDMI 1.3a cables, there's no need to rush out and buy HDMI 1.4(a) cables for Blu-ray, even if Monster tells you the 1.3a cable isn't rated for 3D.
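To put a number on the 3D Blu-ray question: here's a rough back-of-the-envelope, assuming the HDMI 1.4a frame-packed 1080p24 layout (both eye views stacked in one frame with a 45-line gap, i.e. 1920 x 2205 active pixels), using the same payload formula as before:
[code]
# Rough payload for frame-packed 3D Blu-ray (1080p24 per eye), assuming
# the HDMI 1.4a frame-packing layout: two 1080-line views stacked with
# a 45-line active-space gap between them.
width = 1920
height = 1080 * 2 + 45    # 2205 active lines per packed frame
hz = 24                   # 24 frames per second, per eye
bpp = 24                  # 24-bit color
payload_gbps = width * height * hz * bpp / 1e9
print(f"3D Blu-ray payload: ~{payload_gbps:.1f} Gbps")   # ~2.4 Gbps
[/code]
Even allowing for blanking and TMDS overhead (frame-packed 1080p24 runs at the same 148.5 MHz pixel clock as plain 1080p60), that's nowhere near 10.2 Gbps. The 10.2 Gbps figure is just the maximum TMDS throughput of the HDMI 1.3+ spec, i.e. what a "High Speed" cable is certified to carry, not what 3D Blu-ray actually needs. And standard 2D Blu-ray at 1080p24 works out to about 1.2 Gbps by the same formula.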
But yeah, unless there are actual defects in the cable, you shouldn't see any difference, since the signal is just a digital stream of 1s and 0s, as you said. A bad cable can cause pixel popping (sparkles) or off-colors, though, so buying decent-quality cables is still a good idea. I generally go with and recommend Monoprice.com for good-quality, reliable, affordable cables. There you actually pay for cable quality and gauge, not for a compression-molded logo on the connector.
As for references: I just read the spec sheets (Wikipedia is a good start; for something like this it's pretty reliable because it's backed by white papers and spec sheets) and compare them against what's actually available in the marketplace. When a spec sheet or white paper says one thing, but nothing on the market takes advantage of it or demonstrates it's possible, that's when I start asking why. As you can see with HDMI 1.3a through 1.4a, there shouldn't be any technical limitation preventing 1080p @ 120Hz as far as the wire and the spec itself go, unless something else in the pipeline is holding things up. Disolitude's explanation makes sense if it's the signaling/DSP in the HDTVs themselves that can't handle the bandwidth requirements; I imagine that will change, though probably not just for 3D (more likely for 4Kx2K).
Sorry, I haven't seen any comprehensive guide. Most of this is just me fiddling with the available resolutions in both 2D and 3D, at 60Hz and 120Hz, and consistently running into the same bandwidth limits across the different formats (SL-DVI, HDMI, DL-DVI) compared to the advertised maximum bandwidth for HDMI, DP, etc.
-=HeliX=- Mod 3DV Game Fixes
My 3D Vision Games List Ratings
Intel Core i7 5930K @4.5GHz | Gigabyte X99 Gaming 5 | Win10 x64 Pro | Corsair H105
Nvidia GeForce Titan X SLI Hybrid | ROG Swift PG278Q 144Hz + 3D Vision/G-Sync | 32GB Adata DDR4 2666
Intel Samsung 950Pro SSD | Samsung EVO 4x1 RAID 0 |
Yamaha VX-677 A/V Receiver | Polk Audio RM6880 7.1 | LG Blu-Ray
Auzen X-Fi HT HD | Logitech G710/G502/G27 | Corsair Air 540 | EVGA P2-1200W