I understand that those who have the Optoma aren't happy, but this is exactly why I stayed away from it. Reading around this forum, it seems to be common knowledge that you should stay away from devices that aren't officially supported. I wanted to get this PJ because at the time the Acer wasn't available. Given that, on top of the Blu-ray 3D uncertainty around displays, I decided to just go cheap and get the cheapest certified ViewSonic PJ. I'm glad I did, and I was pleasantly surprised. I'll spend the money on a 3D display that is mainstream and 100% compatible with everything when the time comes; that's why I skipped this PJ (and the Acer) and went cheap (and certified).
[quote name='Chibi_Chaingun' post='1036394' date='Apr 8 2010, 04:12 PM']I understand that those who have the Optoma aren't happy, but this is exactly why I stayed away from it. Reading around this forum, it seems to be common knowledge that you should stay away from devices that aren't officially supported. I wanted to get this PJ because at the time the Acer wasn't available. Given that, on top of the Blu-ray 3D uncertainty around displays, I decided to just go cheap and get the cheapest certified ViewSonic PJ. I'm glad I did, and I was pleasantly surprised. I'll spend the money on a 3D display that is mainstream and 100% compatible with everything when the time comes; that's why I skipped this PJ (and the Acer) and went cheap (and certified).[/quote]
Yep, good decision on your part. But I still feel the industry as a whole is steering away from the "qualify it first" model in favor of a "match the specification" model.
Maybe once ATI and IZ3D get their act together, Nvidia will see how dumb they were when people start switching to ATI. IZ3D is already beta-testing a 120Hz projector driver that may work with the Optoma and other projectors Nvidia has orphaned.
If this is to be a debate on "standards" support, shouldn't a standard exist in the first place? Seems to me that the first iteration of the standard for 3D has [i]just[/i] been released.
Otherwise, you may as well settle yourself into the "collateral damage" role while wondering if you have a BetaMax or an HD setup.
I haven't invested anything in 3D yet, and don't intend to until a firm set of standards is established and industry support appears. Until then, I spend enough already on the bleeding edge of graphics-engine technology.
I've got enough dead, non-standard, no-longer-supported hardware around as it is...
"AIO": Intel Xeon E5-2690 v2 @ 103.2 MHz BCLK | ASUS X79-Deluxe | SwifTech Apogee Drive II Pump and Block | 120 mm + 240 mm Push-Pull | 64 GB G.Skill PC3-12800 @ 1924 MHz | NVIDIA RTX 2070 FE | LG 25UM56 UW Monitor | Plextor 1TB PX-1TM9PeY PCIe NVMe (Windows 10 Pro x64 1809) | Plextor 1TB PX-1TM9PeY PCIe NVMe (UserData) | 4x SanDisk 500 GB SSDs in Marvell SATA3 RAID0 (C:\Games) | 2x WD 250 GB SSDs and WD 3 TB RED HDD in Marvell HyperDuo RAID (Media) | 16 GB RAMDisk (Temp Files) | WD My Book Essentials 3 TB NAS (Archives) | LG BP50NB40 ODD | eVGA Supernova G+ 1000 W PSU | Cooler Master HAF-XB
"Gaming": Intel Xeon E5-1650 v2, Turbo 44x (5-6), 45x (3-4), 46x (1-2) | ASUS Rampage IV Extreme | SwifTech Apogee Drive II Pump and Block | 120 mm + 240 mm Push/Pull | 32 GB G.Skill PC3-12800 @ 1866 MHz | NVIDIA GTX 1080 FE | NVIDIA GTX 970 RE | Samsung U28E510 UHD | 2x PNY 480 GB SSDs in Intel SATA3 RAID0 (OS) | Plextor 1TB PX-1TM9PeY PCIe NVMe (Disk Games) | 4x PNY 240 GB SSDs in Intel SATA2 RAID0 (On-Line Games) | eVGA Supernova G+ 1000 W PSU | Cooler Master HAF-XB | Windows 10 Pro x64 1809
[quote name='Crash27' post='1036292' date='Apr 8 2010, 01:47 PM']OK, I will say it again:
"In their ads when I bought it, it said a 120Hz display was needed, and that is all." It was working as a 120Hz display.[/quote]
I for one support the notion that you totally made this up. I ran an ELP program for EVGA when 3D Vision first went mainstream, and I did the research. Every resource pointed directly back to that list of supported displays.
If there's a device that is technically capable of displaying 3D Vision, Nvidia shouldn't be locking it out, period. As other people have pointed out, there's soon going to be an ATI alternative, and the last thing Nvidia should be doing is screwing its own customers. At the moment, Nvidia's poor support for 3D Vision (IMO) is what's going to make up my mind about whether to buy an Nvidia or ATI card next time.
[quote name='jaafa']If this is to be a debate on "standards" support, shouldn't a standard exist in the first place? Seems to me that the first iteration of the standard for 3D has [i]just[/i] been released.
Otherwise, you may as well settle yourself into the "collateral damage" role while wondering if you have a BetaMax or an HD setup.
I haven't invested anything in 3D yet, and don't intend to until some firm set of standards is established and industry support appears. Until then, I spend enough already on the bleeding edge of the graphics engines technology.
I got enough dead, non-standard and no-longer-supported hardware around as it is...[/quote]
Very well said.
For the same reason as jaafa, I haven't invested in 3D tech.
Everything that's out now is a beta until something becomes the clear standard.
[quote name='msm903' post='1036337' date='Apr 8 2010, 02:58 PM']So let me get this straight: Optoma doesn't want to pay for certification, but Nvidia should still go ahead and officially provide support, then provide the manpower to implement that support, then provide customer and technical support to the end user who spent 200 plus the cost of the video card, whenever something goes wrong getting Optoma and 3D Vision to work together?
So they have no choice in how they want to market and sell their product/technology? Maybe they should just go ahead and give us the technology all for free.
Simple: if Nvidia doesn't list the projector/monitor/TV as supported for 3D Vision, don't buy one of those devices and expect it to work. Many companies provide licenses that other companies need to pay for to use their tech. It's business. Who cares if it can work with or without my tech? I worked hard to make my tech customer friendly; you will pay me to use it with my customers, or you will develop your own. That's all.[/quote]
No, they should put it back the way it was and stop blocking it just to screw us over.
I don't want them to add a thing, just take away the block. No money or research needed, just remove a few lines of text.
Information on Nvidia's site said any generic CRT; this works the same as a CRT, and it was working as I need it to. I don't care if they fix the ghosting or the upside-down glasses problem, but why go out of their way to piss people off?
I too purchased an 8800GTX for more 3D power and was screwed over, so I deserve everything I get because I should have been twice shy. But that doesn't mean I'm going to take it lying down.
[quote name='420Ryme' post='1036672' date='Apr 9 2010, 01:49 AM']I for one support the notion that you totally made this up. I ran an ELP program for EVGA when 3D Vision first went mainstream, and I did the research. Every resource pointed directly back to that list of supported displays.[/quote]
Which listed a 120Hz CRT mode, and this display was working as such; it's not a big leap to think they would leave it alone.
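As an aside on what "a 120Hz display" even means to the driver: a monitor or projector advertises its supported refresh rates in its EDID, including a "display range limits" descriptor (tag 0xFD). The sketch below is a simplified illustration of reading that maximum vertical rate, not Nvidia's actual whitelist check; real EDIDs have extension blocks and vendor quirks it ignores.

```python
# Simplified sketch: read the maximum vertical refresh rate a display
# advertises in its base EDID block. Illustrative only -- it ignores
# EDID extension blocks and vendor quirks.

def max_vertical_refresh(edid: bytes):
    """Return the max vertical rate (Hz) from the range-limits descriptor."""
    for off in (54, 72, 90, 108):            # the four 18-byte descriptors
        d = edid[off:off + 18]
        # A display descriptor starts 00 00 00 <tag>; tag 0xFD = range limits
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[6]                       # byte 6: max vertical rate, Hz
    return None

# Synthetic EDID block claiming a 56-120 Hz vertical range
edid = bytearray(128)
edid[54:61] = bytes([0x00, 0x00, 0x00, 0xFD, 0x00, 56, 120])
print(max_vertical_refresh(bytes(edid)))   # 120
```

If the driver only required "120Hz capable" in this sense, any display reporting it would pass, which is exactly the expectation people are arguing about here.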
[quote name='BlackSharkfr' post='1036310' date='Apr 8 2010, 02:13 PM']Nvidia NEVER said it would support any 120Hz display!
That was an assumption you made in your head, and one that many technology news reporters also made, but it was never on the product package or the official hardware requirements page.
You assumed that Nvidia would try to support every single piece of 3D hardware, and you got burned, just like all the previous 3D users who relied blindly on Nvidia when they bought a GeForce 8 GPU 3 years ago.
Welcome to the club![/quote]
Yup, kind of... only, it was working.
Where is the Black Shark avatar you sport over at that other site?
After all, it's touted as a [i]generic[/i] CRT mode, which basically means nothing more and nothing less than that the glasses shutter exactly in sync with the video output, whether or not an EDID is received (because it's running through a KVM switch, for example), VGA or DVI, period.
The same goes for [i]generic[/i] DLP: checkerboard output, VESA sync input, period.
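To make the checkerboard point concrete: the left and right views are interleaved pixel by pixel, with (x + y) parity selecting the eye. A minimal sketch (my own illustration, not any vendor's code):

```python
# Minimal sketch of checkerboard interleaving for "generic DLP" 3D output:
# pixel parity (x + y even/odd) selects the left- or right-eye view.

def checkerboard(left, right):
    """Interleave two equal-size frames (lists of rows) by pixel parity."""
    return [
        [left[y][x] if (x + y) % 2 == 0 else right[y][x]
         for x in range(len(left[0]))]
        for y in range(len(left))
    ]

L = [[0] * 4 for _ in range(4)]      # left eye: all 0 (black)
R = [[255] * 4 for _ in range(4)]    # right eye: all 255 (white)
out = checkerboard(L, R)
print(out[0])   # [0, 255, 0, 255]
print(out[1])   # [255, 0, 255, 0]
```

The display then separates the two views back out and presents them sequentially, with the sync signal telling the glasses which eye is currently showing.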
Disregarding 3D Discover, the other modes are optimized for dedicated displays from manufacturers that have joined the licensing program. These optimized profiles compensate for specific displays' frame lag and reduce the open-shutter time to give better separation and less ghosting (which, as we know, does not always work out as intended).
I agree that we should not see optimized profiles without cash moving from the manufacturer concerned to Nvidia, but for [i]generic[/i] modes (gotta love words in italics) this simply doesn't hold true, and by touting them as generic, Nvidia has already limited its liability on support (read: "it should work, but it's not guaranteed to work properly").
Even more, I would like to see the 120Hz limit fall, along with an always-active VESA input to override the sync, as a combined DLP/CRT mode that is independent of refresh rate and lets you freely select anaglyph/field-sequential/checkerboard.
But that simply doesn't belong here, unless we discuss the option of removing these limitations for paying customers, which I don't think is such a bad idea, much like the 3D Play programme. But then again, Nvidia would have to provide support, which wouldn't be the case if they just opened up the options.
This would definitely push sales of their (hardware) solution, which otherwise will become somewhat obsolete later this year when Bit Cauldron shows up on stage, 3D Play is released, and IZ3D gets their things set up.
So I could very well imagine they will do so once the competition offers something similar; let's hope.
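On the open-shutter-time point above, some rough back-of-the-envelope numbers (illustrative only; the open fraction is my assumption, not a value from any actual display profile):

```python
# Back-of-the-envelope shutter timing for field-sequential 3D.
# The open_fraction value is purely illustrative, not a vendor spec.

def shutter_timing(refresh_hz, open_fraction=0.7):
    """Per-eye timing for alternate-frame 3D at a given refresh rate."""
    frame_ms = 1000.0 / refresh_hz       # duration of one frame (one eye)
    per_eye_hz = refresh_hz / 2.0        # effective rate each eye sees
    open_ms = frame_ms * open_fraction   # shutter-open window per frame
    return per_eye_hz, frame_ms, open_ms

per_eye, frame_ms, open_ms = shutter_timing(120)
print(per_eye)              # 60.0 -> each eye effectively sees 60 Hz
print(round(frame_ms, 2))   # 8.33 ms per frame
print(round(open_ms, 2))    # 5.83 ms of that is shutter-open time
```

Shrinking the open fraction cuts ghosting from panel response lag at the cost of brightness, which is exactly the trade-off the optimized per-display profiles are tuning.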
eVGA Z68 SLI | Intel Core i5-3570K @ 4.5 GHz | Corsair Hydro Series H100i
16GB G.Skill Sniper Series DDR3 | MSI GTX 970 Gaming 4G SLIed | OCZ ZX 1000W
OCZ Agility3 120 GB SSD + Samsung 850 Pro 256 GB SSD
Samsung UN55D6000 + Samsung T240
Win10 Pro x64 / WEI - 8.2 - 8.2 - 8.9 - 8.9 - 7.9
3DMark Fire Strike: 15648
F@H Team: 142900
Yep, I totally agree with all of the above.