[quote name='bullripper' post='1092717' date='Jul 24 2010, 06:46 AM']I'd relax and just keep using the .inf; at the end of the day, if it had worked without the .inf then I'm pretty sure the results would have been the same as the workaround anyway, wouldn't they?
I know we should be supported, and I'm just as pissed off too.[/quote]
If it works with the .inf, why can't it be supported?
I thought the lack of support for Optoma projectors was due to Optoma not paying for 'official compatibility'.
[quote name='Mund' post='1092855' date='Jul 24 2010, 04:04 PM']If it works with the .inf, why can't it be supported?
I thought the lack of support for Optoma projectors was due to Optoma not paying for 'official compatibility'[/quote]
Well, that was pure assumption; no one actually knew for sure if that was the case.
I'm just pissed that they "support" the projectors when they actually don't. They also didn't mention anything in their driver release about the fact that only certain models actually support it.
What gets me angry is that we know it comes down to a simple EDID check that looks at the product ID, manufacturer ID, etc.
That is all it is doing; Nvidia hasn't written some new security protocol for vetting displays. They are simply using a protocol that has been around since the early '90s and was never a security-focused check.
The EDID also tells the Nvidia driver which resolutions and refresh rates 3D can run at, which is why we get the red "out of stereo mode" message (or whatever it says) when running at a different resolution we know the projector can do.
e.g. I have run 3D at 1280x768 and 1280x800, both at 120Hz, but only when I use the generic CRT method. If the driver thinks it's an authorized screen, as the H5360 spoof makes it believe, then it will only do 1280x720.
I have found a hybrid of the optoma monitor.inf and the H5360 one if anyone is interested. It's pretty cool: they have set 1280x720 to allow 72Hz and 96Hz (3x and 4x multiples of 24Hz) for smoother playback with Blu-ray content.
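For anyone curious how trivial the check bullripper is describing really is: the vendor and product IDs sit in the first 12 bytes of the EDID base block, and decoding them is a few lines. This is a minimal sketch; the sample bytes below are illustrative (an "ACR"/Acer vendor code with a made-up product code), not dumped from any real projector.

```python
# Decode the vendor/product fields from an EDID base block.
# Bytes 8-9 hold a big-endian packed 3-letter manufacturer ID
# (three 5-bit values, 1='A'); bytes 10-11 are a little-endian product code.

def decode_edid_ids(edid: bytes):
    packed = (edid[8] << 8) | edid[9]           # big-endian 16-bit value
    letters = [(packed >> shift) & 0x1F for shift in (10, 5, 0)]
    manufacturer = "".join(chr(ord('A') - 1 + v) for v in letters)
    product = edid[10] | (edid[11] << 8)        # little-endian product code
    return manufacturer, product

# Standard 8-byte EDID header, then vendor 0x0472 ("ACR") and product 0x1234.
sample = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00,
                0x04, 0x72, 0x34, 0x12])
print(decode_edid_ids(sample))  # ('ACR', 4660)
```

Spoofing a monitor .inf effectively just changes which of these IDs Windows reports, which is all the driver appears to match against.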
[quote name='bullripper' post='1093125' date='Jul 25 2010, 05:10 AM']What gets me angry is that we know it comes down to a simple EDID check that looks at the product ID, manufacturer ID, etc.
That is all it is doing; Nvidia hasn't written some new security protocol for vetting displays. They are simply using a protocol that has been around since the early '90s and was never a security-focused check.
The EDID also tells the Nvidia driver which resolutions and refresh rates 3D can run at, which is why we get the red "out of stereo mode" message (or whatever it says) when running at a different resolution we know the projector can do.
e.g. I have run 3D at 1280x768 and 1280x800, both at 120Hz, but only when I use the generic CRT method. If the driver thinks it's an authorized screen, as the H5360 spoof makes it believe, then it will only do 1280x720.
I have found a hybrid of the optoma monitor.inf and the H5360 one if anyone is interested. It's pretty cool: they have set 1280x720 to allow 72Hz and 96Hz (3x and 4x multiples of 24Hz) for smoother playback with Blu-ray content.[/quote]
I'd really like to get hold of that .inf, bullripper.
EDIT: I found it [url="http://www.avsforum.com/avs-vb/showthread.php?t=1243032"]here[/url].
How do you know that resolution is not fake (i.e. resampled inside the projector)? If it's not resampled, that means 96Hz shuttering should work; I'd like to see that.
EDIT: OK, I asked this in the original thread, see you there.
[quote name='raptor007dxn' post='1092060' date='Jul 22 2010, 06:20 PM']You know, this sort of info is helpful to know before I go out and buy a $600 projector...[/quote]
You mean like when it was never officially supported in the first place, and people still bought it and then wondered why it didn't work? I can understand if you bought it only after they said it was supported, but that is not the case for most users here.
[quote name='Chibi_Chaingun' post='1093423' date='Jul 25 2010, 10:31 PM']You mean like when it was never officially supported in the first place, and people still bought it and then wondered why it didn't work? I can understand if you bought it only after they said it was supported, but that is not the case for most users here.[/quote]
You mean like what they are doing at the moment? They have it on their page that the projectors are supported NOW, so people will go out and buy projectors on that assumption. However, they are not supported. Big problem.
[quote name='Chibi_Chaingun' post='1093423' date='Jul 25 2010, 10:31 PM']You mean like when it was never officially supported in the first place, and people still bought it and then wondered why it didn't work? I can understand if you bought it only after they said it was supported, but that is not the case for most users here.[/quote]
A fair point.
I just wonder what needs to be changed (just firmware?) and why. If it all comes down to the .inf, why complicate things for current owners when we know a hack can get the projector working?
Also, current stock is not going to disappear on the 1st of August. Anyone buying will have to ask beforehand, 'When was this projector made?' I can see many people getting incorrect replies, or none at all, to that one.
[quote name='bullripper' post='1093125' date='Jul 25 2010, 05:10 AM']What gets me angry is that we know it comes down to a simple EDID check that looks at the product ID, manufacturer ID, etc.
That is all it is doing; Nvidia hasn't written some new security protocol for vetting displays. They are simply using a protocol that has been around since the early '90s and was never a security-focused check.
The EDID also tells the Nvidia driver which resolutions and refresh rates 3D can run at, which is why we get the red "out of stereo mode" message (or whatever it says) when running at a different resolution we know the projector can do.
e.g. I have run 3D at 1280x768 and 1280x800, both at 120Hz, but only when I use the generic CRT method. If the driver thinks it's an authorized screen, as the H5360 spoof makes it believe, then it will only do 1280x720.
I have found a hybrid of the optoma monitor.inf and the H5360 one if anyone is interested. It's pretty cool: they have set 1280x720 to allow 72Hz and 96Hz (3x and 4x multiples of 24Hz) for smoother playback with Blu-ray content.[/quote]
Where can I get this .inf file? I'm currently just using the Acer .inf file.
[quote name='dally28' post='1095954' date='Jul 30 2010, 11:30 AM']Where can I get this .inf file? I'm currently just using the Acer .inf file.
Thanks... :-)[/quote]
Sorry to Mourt and Dally28, I have been slack getting back to your PMs.
The forum wouldn't let me upload an .inf file when you asked about the hybrid .inf.
Mourt posted the link a couple of posts above, Dally28, if you want it; I should have just posted the AVS Forum link myself originally.
I don't know if it is any good anyhow, as when I did a couple of quick tests at 72Hz and 96Hz it didn't look smoother to me; maybe that was just my testing.
Tell me if you guys think it makes 1080p24 content any better.
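For what it's worth, the reason 72Hz and 96Hz are interesting for 1080p24 is pure arithmetic: 24 divides evenly into both, so each film frame can be shown a whole number of times, whereas 60Hz forces the uneven 3:2 pulldown cadence. A quick sketch of that check (the function name is mine, purely illustrative):

```python
# For 24 fps film content, a display rate gives an even cadence only when
# it is an exact multiple of 24 -- each frame is then repeated a whole
# number of times. Otherwise frames get uneven repeats (pulldown judder).

def whole_frame_repeats(display_hz: int, content_fps: int = 24):
    """Return how many refreshes each frame gets, or None if uneven."""
    repeats, remainder = divmod(display_hz, content_fps)
    return repeats if remainder == 0 else None

for hz in (60, 72, 96, 120):
    r = whole_frame_repeats(hz)
    if r is None:
        print(f"{hz} Hz: uneven cadence for 24 fps (pulldown judder)")
    else:
        print(f"{hz} Hz: each 24 fps frame shown exactly {r}x")
```

So in theory 72Hz (3x) and 96Hz (4x) should both be judder-free; whether the projector's panel actually keeps up is a separate question, which may explain why the quick tests above didn't look smoother.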
[quote name='bullripper' post='1096817' date='Jul 31 2010, 04:23 PM']Sorry to Mourt and Dally28, I have been slack getting back to your PMs.
The forum wouldn't let me upload an .inf file when you asked about the hybrid .inf.
Mourt posted the link a couple of posts above, Dally28, if you want it; I should have just posted the AVS Forum link myself originally.
I don't know if it is any good anyhow, as when I did a couple of quick tests at 72Hz and 96Hz it didn't look smoother to me; maybe that was just my testing.
Tell me if you guys think it makes 1080p24 content any better.[/quote]
Thanks, Bullripper. I have my Optoma working with the Acer_H5360.inf driver; however, I don't have audio over HDMI anymore. Is anyone else having this issue?
It doesn't surprise me that it has been removed; looks like we will be using the Acer H5360 monitor driver forever more :)
As I said a couple of posts back, I think the end result, had it been supported and just worked, would have been the same as what we are already doing with the spoofed .inf, as it is now known.
Perhaps it might have given us 1280x768@120Hz and 1280x800@120Hz with an official driver, as the Optoma can do these resolutions, but that is all I can see that might have changed.
I think we should look at moving to another officially supported projector monitor driver that does these two resolutions.
Previously, when using Mourt's optoma.inf, I could get those two resolutions without any issues.
i.e. there was no red text saying I'm running at a non-stereo resolution.
[url="http://www.nvidia.com/object/3d-vision-requirements.html"]http://www.nvidia.com/object/3d-vision-requirements.html[/url] (I just checked and it still lists the Optomas :) unless I'm looking at cached data somehow.)
The funny thing is that the requirements page says 1280x800 for the HD66 but 1280x720 for the HD67. I thought they were the same projector, and am fairly sure they are, so it's possibly just an error in that spec.