Help with 3d Vision FAQ and threads don't really give the type of answers I am used to
Is there anyone here, or even working for NVIDIA, who understands this technology? I am tired of reading crap that says none of the good TVs on the market work with 3D Vision because they don't accept 120Hz signals. Last time I checked, DVI was a standard, so unless they are using a different standard, that answer doesn't make sense. So what exactly is the difference between the DVI inputs on the monitors they certify (which are all crap) and the DVI inputs on any other 120Hz TV? Nothing on the page goes into specifics of how the system hooks up or works, either. Why is DVI even required? Since the computer is telling the TV exactly what to display, the glasses shouldn't require any feedback from the TV to function. I think this is a great idea, and from what I have read it works great. Right now I am heavily leaning toward this being a scam that forces people to buy only certain monitors/TVs, not for a technical reason but because it was purposely designed that way. So, can anyone give a technical reason as to how the DVI inputs on TVs are different... and why the "special DVI" they require (which isn't available on a single decent TV) even exists in the first place?
[quote name='Jarsyl' post='588439' date='Sep 15 2009, 06:37 PM']Is there anyone here, or even working for NVIDIA, who understands this technology? I am tired of reading crap that says none of the good TVs on the market work with 3D Vision because they don't accept 120Hz signals. Last time I checked, DVI was a standard, so unless they are using a different standard, that answer doesn't make sense. So what exactly is the difference between the DVI inputs on the monitors they certify (which are all crap) and the DVI inputs on any other 120Hz TV? Nothing on the page goes into specifics of how the system hooks up or works, either. Why is DVI even required? Since the computer is telling the TV exactly what to display, the glasses shouldn't require any feedback from the TV to function. I think this is a great idea, and from what I have read it works great. Right now I am heavily leaning toward this being a scam that forces people to buy only certain monitors/TVs, not for a technical reason but because it was purposely designed that way. So, can anyone give a technical reason as to how the DVI inputs on TVs are different... and why the "special DVI" they require (which isn't available on a single decent TV) even exists in the first place?[/quote]
I'm not sure what answer you are looking for, but I will try and answer.
Basically, most HDTVs on the market can only accept a 60Hz signal, which is the standard refresh rate for 720p and 1080p. Some HDTVs have a 120Hz smoothing mode that uses a software algorithm to create "in-between" frames for the look of very smooth 120Hz, but it is basically a trick: they aren't capable of accepting a 120Hz signal and displaying it.
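That "smoothing" can be sketched as a toy model: the set still receives only 60 real frames per second and synthesizes the in-between ones. Real TVs use motion-compensated interpolation, which is far more sophisticated; this naive midpoint blend is just for illustration.

```python
# Toy model of "120Hz smoothing": the TV still only receives 60
# real frames per second; the in-between frames are synthesized.
def interpolate(frame_a, frame_b):
    """Naive midpoint blend of two frames (flat lists of pixel values).
    Real sets use motion estimation, not simple averaging."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

def smooth_to_120hz(frames_60hz):
    """Double the frame count by inserting a blended frame
    between each pair of real input frames."""
    out = []
    for cur, nxt in zip(frames_60hz, frames_60hz[1:]):
        out.append(cur)
        out.append(interpolate(cur, nxt))
    out.append(frames_60hz[-1])
    return out

frames = [[0, 0], [10, 20], [20, 40]]  # three real 60Hz input frames
print(smooth_to_120hz(frames))  # 5 frames out, but no new scene information
```

However many frames come out, no new information enters the set, which is why this mode cannot carry two independent eye views for stereo.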
The monitors that are stereo compatible can natively accept and display a 120Hz signal, delivering 60Hz per eye when the shutter glasses are worn. Most TVs simply don't contain the technology to accept 120Hz signals. It isn't the DVI standard; it's the circuitry inside the TV, which needs to be able to accept and use the signal in a meaningful way.
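A back-of-the-envelope pixel-clock estimate shows why the link electronics matter as much as the panel. Single-link DVI tops out at a 165 MHz pixel clock; the blanking-overhead factors below are rough assumptions, not exact CVT timings.

```python
# Approximate pixel clock needed for a given mode, assuming ~10%
# horizontal and ~4% vertical blanking overhead (rough reduced-blanking
# figures; real timings vary by standard).
SINGLE_LINK_DVI_MHZ = 165.0  # max pixel clock for single-link DVI

def pixel_clock_mhz(width, height, refresh_hz,
                    h_blank=1.10, v_blank=1.04):
    return width * h_blank * height * v_blank * refresh_hz / 1e6

for mode in [(1680, 1050, 60), (1680, 1050, 120), (1920, 1080, 120)]:
    clk = pixel_clock_mhz(*mode)
    link = "single-link OK" if clk <= SINGLE_LINK_DVI_MHZ else "needs dual-link"
    print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz ~ {clk:.0f} MHz ({link})")
```

The same resolution that fits comfortably on single-link DVI at 60Hz blows past the single-link limit at 120Hz, which is consistent with the 3D Vision kits requiring dual-link DVI.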
As far as I can tell it really isn't a scam; it's just that most HDTVs don't have the capability to accept and display the 3D signal. I don't know if the DLP stuff is all 120Hz (I'm pretty sure not all of them are), but DLP has pretty much always worked with 3D stereo. I had a couple of DLP projectors years ago that I used with the old stereo drivers, and they worked great.
Conceivably it would be possible to display 3D stereo on a normal HDTV at 30Hz per eye, but from my observation that is not a pleasant experience, so I can understand why nVidia wouldn't allow it.
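Quick arithmetic on per-eye refresh with alternate-frame shutter glasses makes the point:

```python
# With alternate-frame shutter glasses each eye sees every other
# frame, so its effective refresh is half the display refresh.
def per_eye(display_hz):
    eye_hz = display_hz / 2
    period_ms = 1000.0 / eye_hz  # time between frames for one eye
    return eye_hz, period_ms

for hz in (120, 60):
    eye_hz, period_ms = per_eye(hz)
    print(f"{hz}Hz display -> {eye_hz:.0f}Hz per eye, "
          f"{period_ms:.1f} ms between frames for that eye")
```

At 120Hz each eye gets 60Hz (about 16.7 ms per frame), roughly at the flicker-fusion threshold; at 60Hz each eye drops to 30Hz (about 33.3 ms per frame), well below it, hence the visible flicker.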
I am sorry if that isn't the answer you are looking for. I'm pretty sure it isn't a scam (well, the 120hz HDTVs that cannot accept a 120hz signal kind of are I suppose).
The DLP 3D-ready HDTVs are reasonably inexpensive, and I have thought more than once about selling my current TV and picking one up.
The human race divides politically into those who want people to be controlled and those who have no such desire.
--Robert A. Heinlein
That is along the lines of what I was asking, and I appreciate the effort... but I am hoping for a more technical answer. My problem with that can be summed up in a few parts. First, if the TFT panel is capable of 120Hz and the input is capable of 120Hz, but only the middleman is the problem... that simply isn't logical. The DLP TVs came out around 2005. If they could process a 120Hz signal then, an LCD with a 120Hz refresh rate should be able to as well. Mostly what gets me is that taking a 60Hz signal and converting it to 120Hz or 240Hz requires post-processing by the TV, and that always takes more processing power than just passing the native signal through. So if DVI is capable of 120Hz, and the TFT panel can display 120Hz, and converting a signal is harder than displaying it natively... then how can you say those TVs aren't capable of it? At the very least it seems to me that they are "capable" of it; for some reason they just aren't doing it. I find it hard to believe that an older technology that is all but dead (most manufacturers stopped making DLP TVs because they were too cheap) can somehow do super fancy processing to keep the signal in 120Hz mode when other TVs can't.
[quote name='Jarsyl' post='588458' date='Sep 15 2009, 07:28 PM']That is along the lines of what I was asking, and I appreciate the effort... but I am hoping for a more technical answer. My problem with that can be summed up in a few parts. First, if the TFT panel is capable of 120Hz and the input is capable of 120Hz, but only the middleman is the problem... that simply isn't logical. The DLP TVs came out around 2005. If they could process a 120Hz signal then, an LCD with a 120Hz refresh rate should be able to as well. Mostly what gets me is that taking a 60Hz signal and converting it to 120Hz or 240Hz requires post-processing by the TV, and that always takes more processing power than just passing the native signal through. So if DVI is capable of 120Hz, and the TFT panel can display 120Hz, and converting a signal is harder than displaying it natively... then how can you say those TVs aren't capable of it? At the very least it seems to me that they are "capable" of it; for some reason they just aren't doing it. I find it hard to believe that an older technology that is all but dead (most manufacturers stopped making DLP TVs because they were too cheap) can somehow do super fancy processing to keep the signal in 120hz mode when other TVs can't.[/quote]
DLPs are still being made. In fact I really want this one [url="http://www.mitsubishi-tv.com/product/WD82737"]http://www.mitsubishi-tv.com/product/WD82737[/url] as it looks beautiful. :thumbup:
If I remember correctly, DLPs refresh in a different manner more suited to 3D, though damned if I can remember the exact difference. I do know that accepting a 120Hz signal requires twice the bandwidth in LCD screens, so I imagine it is just a cost problem with the input components that accept and decode the signal.
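The bandwidth doubling is easy to see from the raw bitrate alone (24-bit color assumed here, blanking intervals ignored):

```python
# Raw, uncompressed video bitrate: doubling the refresh rate doubles
# the bits per second the input electronics must accept and decode.
def raw_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

r60 = raw_gbps(1920, 1080, 60)
r120 = raw_gbps(1920, 1080, 120)
print(f"1080p60:  {r60:.2f} Gbit/s")
print(f"1080p120: {r120:.2f} Gbit/s ({r120 / r60:.0f}x)")
```

Roughly 3 Gbit/s becomes roughly 6 Gbit/s, so every stage between the connector and the panel has to be built for twice the data rate.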
This isn't new, however; it has always been an issue. I had an LCD projector way back when that I couldn't use for 3D, but the two cheaper DLP projectors I bought worked just great for 3D. So if it is a scam, it is not a new one.
I am an Electrical Engineering junior in college, but I don't have a lot of specific knowledge about LCD and DLP circuitry.
So because you don't understand how the technology works, it must be a scam? That really isn't the best way to get a proper response. You think that because LCDs are popular they must be the most advanced technology? In fact, in terms of the refresh rates necessary for stereo 3D, CRT was leaps and bounds better than LCD, not to mention color reproduction, but you don't see them selling CRTs anymore. Your logic is flawed and shows little understanding of the underlying technology.
Some 99% of LCD screens are capped at 60Hz. DLPs can also run at 60Hz, but there are many models that do 120Hz. Plasma has even higher rates, sometimes in the 480Hz range. So in terms of refresh, LCDs are the *worst*; they are just popular because they are thin and cheap. This glosses over the differences in how these panels operate (which I don't claim to fully understand), but I know enough about 3D to know what works and what doesn't. Basically, the monitor needs to be able to accept, process, and display a 120Hz signal. Some common HDTVs/monitors can do one of those things, but not all at the same time. To confuse matters, some companies advertise "120Hz" displays that are bogus interpolated 120Hz while the display actually runs at 60Hz.
And this has nothing to do with some "special" DVI cable; I'm not sure where you got that. It is understandable that you are confused, but you are going about this the wrong way. If you had bothered reading the forum here, you would notice this topic has come up a hundred times before, all with detailed explanations, but I guess that wasn't the answer you wanted to hear.
Well, that response was kind of rude... so I will just go ahead and call you out. I did read the forums, and all you did was give the same non-scientific answer that was given before, plus "basically, you are stupid." If a screen can interpolate and display an image at 120Hz, and the standard data input (DVI) can carry 120Hz, then it is just a matter of processing the signal. Are you telling me that a DLP TV with basically no features has the bandwidth and processing power to do this, but LCDs don't? That's BS; the only reason LCDs can't do it is that they simply decided not to build them that way. I mean, many of the new LCDs have interactive web access and Netflix streaming. That should require much more processing and even a miniature OS. So we can load OSes on LCD TVs, but we can't process a 120Hz signal, which was done back in 2005 on DLP TVs?
I think we should stop the flaming in this thread to start with, guys...
Apparently nobody here knows exactly how an LCD monitor is made, or understands as much as we would like to about why they don't use 120Hz.
I believe that until now there has been no need for 120Hz in LCD monitors, so the feature has not been added. I am pretty sure that running at 120Hz would put more load on the graphics card, so regular users of older computers (on-board graphics, etc.) would not want a monitor that runs at 120Hz.
I am guessing that now, since there is a market for the idea, many new LCDs will have 120Hz capabilities.
These are just my ideas, but I thought I would contribute my part...
Nick
Twitter: @Dr_Inkduff
<b>Processor:</b> Intel Core i7 920 D0 (4Ghz) <b>Motherboard:</b> ASUS P6T
[quote name='Dr Nick' post='588503' date='Sep 15 2009, 10:33 PM']I think we should stop the flaming in this thread to start with, guys...
Apparently nobody here knows exactly how an LCD monitor is made, or understands as much as we would like to about why they don't use 120Hz.
I believe that until now there has been no need for 120Hz in LCD monitors, so the feature has not been added. I am pretty sure that running at 120Hz would put more load on the graphics card, so regular users of older computers (on-board graphics, etc.) would not want a monitor that runs at 120Hz.
I am guessing that now, since there is a market for the idea, many new LCDs will have 120Hz capabilities.
These are just my ideas, but I thought I would contribute my part...
Nick[/quote]
Makes sense to me. I am eyeing those Mitsubishi DLPs though. They look pretty hawt.
The conditions for 3D vision are:
Able to display 120 truly separate images per second (120Hz).
Television displays flicker if they refresh at low frame rates, which is why many TVs in the past refreshed at twice the rate to create a smoother image. Some technologies also interpolate between images, essentially a form of anti-aliasing in time.
However, this is done inside the panel logic, which is separate from the input and display processors and also separate from the LCD's own characteristics. Most LCD screens are fundamentally blurry and switch slowly; they can hardly display 25 truly separate images. Many of them claim 200Hz, which means they can create 200 blurry images that amount to no more than 25 separate ones.
All the other high Hz values are a 'catalogue feature'. Television companies invent higher and higher numbers to convince consumers that 200Hz or 300Hz or 480Hz refresh rates are better, when in essence the difference is not visible to the eye.
Television marketing is basically full of crap and consumer confusion: 24Hz, 100Hz, 200Hz, 1080p, 1080i, HD Ready, Full HD - it's all massively confusing to consumers.
For 3D you need:
1. Display technology that can create truly separate images - plasma, CRT and DLP can do this, but LCD only in its very latest incarnations.
2. An input processor and display processor that can accept a high-bandwidth 120Hz stream and render it on the display technology (almost all TVs lack these, too).
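Those two conditions can be sketched as a simple capability check. The field names and example figures here are made up for illustration; real sets don't expose their limits this neatly.

```python
# Toy model of the two requirements: BOTH the input/display processing
# chain and the panel itself must handle 120Hz, or the set cannot do
# shutter-glasses 3D. Field names and figures are illustrative only.
from dataclasses import dataclass

@dataclass
class Display:
    name: str
    panel_max_hz: int   # what the panel can physically show
    input_max_hz: int   # what the input processor will accept

    def supports_3d(self, required_hz=120):
        return (self.panel_max_hz >= required_hz and
                self.input_max_hz >= required_hz)

sets = [
    Display("typical '120Hz' LCD TV", panel_max_hz=120, input_max_hz=60),
    Display("3D-ready DLP", panel_max_hz=120, input_max_hz=120),
    Display("certified 120Hz monitor", panel_max_hz=120, input_max_hz=120),
]
for d in sets:
    print(f"{d.name}: {'3D capable' if d.supports_3d() else 'not 3D capable'}")
```

The first entry captures the thread's recurring point: a panel that refreshes at 120Hz internally still fails the check if its input processor only accepts 60Hz.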
Just remember - for every technology there are cheap chips with the basic features and expensive chips with more features.
Until there is a mass market for 3D, no television will include a 120Hz input processor - it requires far more sophisticated and therefore more expensive chip technology.
Right now it is a fringe market; as such, only a few televisions have these very expensive chips in them, and they are marketed at a premium to us enthusiasts.
DLPs are still being made. In fact I really want this one [url="http://www.mitsubishi-tv.com/product/WD82737"]http://www.mitsubishi-tv.com/product/WD82737[/url] as it looks beautiful.
If I remember correctly, DLPs refresh in a manner better suited to 3D, though damned if I can remember the exact difference. I do know that accepting a 120Hz signal requires twice the bandwidth in an LCD screen, so I imagine it is just a matter of the cost of the input components that accept and decode the signal.
This isn't new, however; it has always been an issue. I had an LCD projector way back when that I couldn't use for 3D, but the two cheaper DLP projectors I bought worked just great for it. So if it is a scam, it is not a new one.
I am an Electrical Engineering junior in college, but I don't have a lot of specific knowledge about LCD and DLP circuitry.
Apparently nobody here knows exactly how an LCD monitor is made, or as much as we would like about why they don't use 120Hz.
I believe that until now there has been no need for 120Hz in LCD monitors, so the feature simply wasn't added. I am pretty sure that running at 120Hz puts more load on the graphics card, so regular users with older hardware (on-board graphics, etc.) wouldn't want a monitor that runs at 120Hz.
I am guessing that now that there is a market for the idea, many new LCDs will have 120Hz capability.
These are just my ideas, but I thought I would contribute my part...
Nick
Twitter: @Dr_Inkduff
<b>Processor:</b> Intel Core i7 920 D0 (4Ghz) <b>Motherboard:</b> ASUS P6T
<b>Memory:</b> 6GB DDR3 RAM (Kingston) <b>Graphics:</b> GTX 260 (216 cores, physX); EVGA GTX 480 SC
<b>OS:</b> Win7 Home Premium 64-bit / Vista Home Premium 64-bit
<b>Hard Disks:</b> 750GB + 500GB <b>Tower:</b> Antec 'Twelve Hundred' Gaming Tower
<b>Monitors:</b> 24" ACER GD245HQbd 120Hz 1920*1080 + 22" widescreen LCD 1680x1050
http://bit.ly/Bluesteel
[quote]Apparently nobody here knows exactly how an LCD monitor is made, or as much as we would like about why they don't use 120Hz.
I believe that until now there has been no need for 120Hz in LCD monitors, so the feature simply wasn't added. I am pretty sure that running at 120Hz puts more load on the graphics card, so regular users with older hardware (on-board graphics, etc.) wouldn't want a monitor that runs at 120Hz.
I am guessing that now that there is a market for the idea, many new LCDs will have 120Hz capability.
These are just my ideas, but I thought I would contribute my part...
Nick[/quote]
Makes sense to me. I am eyeing those Mitsubishi DLPs, though. They look pretty hawt.
Sadly, I'm still waiting for the 22" Sammy monitor to get cheaper lol... unless the CRT won't work... Such a difficult decision!
"Hz" means different things in different contexts.
The condition for 3D Vision is:
Being able to display 120 truly separate images per second (120Hz).
Television displays flicker if they refresh at low frame rates, which is why many TVs in the past refreshed at twice the input rate to create a smoother image. Some technologies also interpolate between images - essentially temporal anti-aliasing.
However, this is done inside the panel logic, which is separate from the input and display processors and from the characteristics of the LCD itself. Most LCD screens are fundamentally blurry and switch slowly - they can hardly display 25 truly separate images. Many of them claim 200Hz, which means they can create 200 blurry images that amount to no more than 25 separate ones.
All the other high Hz values are a 'catalogue feature'. Television companies invent ever higher numbers - 200Hz, 300Hz, 480Hz - to convince consumers their sets are better, when in essence the difference is not visible to the eye.
Television marketing is full of consumer confusion - 24Hz, 100Hz, 200Hz, 1080p, 1080i, HD Ready, Full HD - it's all massively confusing.
For 3D you need:
1. Display technology that can create truly separate images - Plasma, CRT and DLP can do this, but LCD only in its very latest incarnations.
2. An input processor and display processor that can accept a high-bandwidth 120Hz stream and render it on the display (almost all sets lack these, too).
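Those two conditions boil down to a simple AND, which a toy check makes explicit (hypothetical parameter names, just restating the list above):

```python
# A display is suitable for frame-sequential 3D only if BOTH its input
# electronics accept a true 120Hz signal AND its panel can render 120
# genuinely distinct images per second.

def supports_frame_sequential_3d(accepts_120hz_input, panel_distinct_fps):
    return accepts_120hz_input and panel_distinct_fps >= 120

# "120Hz" smoothing HDTV: 60Hz input, interpolated frames only
print(supports_frame_sequential_3d(False, 60))    # False
# certified 120Hz LCD monitor
print(supports_frame_sequential_3d(True, 120))    # True
```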
Just remember - for every technology there are cheap chips with the basic features and expensive chips with more.
Until there is a mass market for 3D, no television will include a 120Hz input processor - it requires far more sophisticated, and therefore more expensive, chips.
Right now it is a fringe market; as such, only a few televisions have these very expensive chips, and they are marketed at a premium to us enthusiasts.
Cheers, Jules