Custom high refresh rate resolutions removed in game config!
Hi, I just got my 3d vision kit, and I'm trying to set it up with a CRT monitor. I am having some very frustrating problems. Basically, I'll just lay it out like this:
1. Game has the resolution I need in the in-game display config, say 1280x1024, and works fine.
2. I add a custom resolution in Nvidia's control panel of 1280x1024, but I set the refresh rate to something high like 110 or 120hz, which is necessary for 3d vision
3. I test the resolution/refresh rate in windows, and it works fine
4. I go back into the game and now the resolution is no longer available on the display config screen. It has been removed!
Every time I add a new custom resolution to change the refresh rate to something high, the resolution won't show up anymore in the game. I've confirmed this happens in a whole set of games, such as Oblivion and Dragon Age, so is the underlying problem caused by something that isn't game-specific, like DirectX? (Crysis seems to work, that is, it lets me choose the custom resolutions.)
This is extremely frustrating and I'm at my wit's end as to how to make these games stop removing my resolutions!
Please help.
[quote name='danahata77' date='07 February 2011 - 09:35 AM' timestamp='1297100118' post='1190164']
Hi, I just got my 3d vision kit, and I'm trying to set it up with a CRT monitor. I am having some very frustrating problems. Basically, I'll just lay it out like this:
1. Game has the resolution I need in the in-game display config, say 1280x1024, and works fine.
2. I add a custom resolution in Nvidia's control panel of 1280x1024, but I set the refresh rate to something high like 110 or 120hz, which is necessary for 3d vision
3. I test the resolution/refresh rate in windows, and it works fine
4. I go back into the game and now the resolution is no longer available on the display config screen. It has been removed!
Every time I add a new custom resolution to change the refresh rate to something high, the resolution won't show up anymore in the game. I've confirmed this happens in a whole set of games, such as Oblivion and Dragon Age, so is the underlying problem caused by something that isn't game-specific, like DirectX? (Crysis seems to work, that is, it lets me choose the custom resolutions.)
This is extremely frustrating and I'm at my wit's end as to how to make these games stop removing my resolutions!
Please help.
[/quote]
The game is probably filtering out resolutions, either because they are enumerated last or because it filters based on the claimed frequency.
All the game does is enumerate modes from the adapter; if you see the res in Windows, the game will see it when the modes are enumerated (assuming NVidia isn't doing something very funky).
A lot of games try to filter the list to limit the set they have to display; it's why a lot of people can't select 1920x1080@24fps in 3DTV Play, for example.
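To make that concrete, here is a minimal sketch of the kind of enumeration being described, using the Direct3D 9 API. It is not taken from any actual game; the 85 Hz cutoff is purely a hypothetical illustration of a filter that would hide a custom high-refresh mode.
[code]
// Sketch: enumerate display modes the way a D3D9 game would, then apply a
// naive refresh-rate filter of the kind that could hide custom modes.
// Build (MSVC): cl enum_modes.cpp d3d9.lib
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("Direct3DCreate9 failed\n"); return 1; }

    const D3DFORMAT fmt = D3DFMT_X8R8G8B8;
    const UINT count = d3d->GetAdapterModeCount(D3DADAPTER_DEFAULT, fmt);

    for (UINT i = 0; i < count; ++i)
    {
        D3DDISPLAYMODE mode;
        if (FAILED(d3d->EnumAdapterModes(D3DADAPTER_DEFAULT, fmt, i, &mode)))
            continue;

        // Hypothetical filter: keep only "typical" refresh rates. A game doing
        // something like this would never list a custom 1280x1024 @ 110 Hz mode.
        const bool kept = (mode.RefreshRate <= 85);

        std::printf("%ux%u @ %u Hz%s\n", mode.Width, mode.Height,
                    mode.RefreshRate, kept ? "" : "   <-- would be filtered out");
    }

    d3d->Release();
    return 0;
}
[/code]
If the custom mode shows up in that raw list but not in the game's menu, it's the game's own filter, not the driver, that is hiding it.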
[quote name='ERP' date='07 February 2011 - 06:52 PM' timestamp='1297104779' post='1190217']
The game is probably filtering out resolutions, either because they are enumerated last or because it filters based on the claimed frequency.
All the game does is enumerate modes from the adapter; if you see the res in Windows, the game will see it when the modes are enumerated (assuming NVidia isn't doing something very funky).
A lot of games try to filter the list to limit the set they have to display; it's why a lot of people can't select 1920x1080@24fps in 3DTV Play, for example.
[/quote]
Well, I figured out how to get the games to "see" a custom resolution: by adding new custom "resolutions" with typical refresh rates like 85 Hz. So apparently DirectX or something doesn't like high refresh rates. As of now I have two custom "resolutions", 1400x1050@110 Hz and 1400x1050@85 Hz, and at least the games can "see" the 1400x1050 option. Of course I still have a problem, which is how to get the game to load the higher refresh rate. The Windows desktop has no problem with 110 Hz (or 100 Hz). Is there a way to lock the refresh rate so that the game can't change it? The 3D Vision option of "Apply this refresh rate to all games" doesn't work!
With 3D Vision, it forces the refresh rate to the one selected in the test, regardless of what you select in game.
It's really obvious in games like WoW that list every enumerated resolution and refresh rate: you set the refresh rate to, say, 60, and it still ends up at 110 or whatever you selected in the test box.
[quote name='ERP' date='07 February 2011 - 08:42 PM' timestamp='1297111362' post='1190285']
With 3D Vision, it forces the refresh rate to the one selected in the test, regardless of what you select in game.
It's really obvious in games like WoW that list every enumerated resolution and refresh rate: you set the refresh rate to, say, 60, and it still ends up at 110 or whatever you selected in the test box.
[/quote]
I'm using the OSD of the CRT monitor itself to determine what refresh rate is currently set. It does not get automatically set to 110 Hz, but rather 85 Hz. I tried software called PowerStrip and its refresh-rate locking feature, but it resulted in screwed-up output with a lot of flickering.
OK, I'd like to report some success. I was able to lock the refresh rate to 110 Hz (or whatever other refresh rate I desired) by setting the ForceRefreshRate registry key to 6e (110 in hexadecimal). Not very cool that I have to go to such lengths to achieve what should work out of the box, but I'm just glad to end this 24-hour quest for a solution. I wonder if the fact that I'm using a CRT has something to do with it. Anyway, here are the details of that registry hack:
[quote]DX9 games you can force in Vista, but not DX10 ones.
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\DirectDraw reg_Dword 'ForceRefreshRate'
X64 systems also need to add:
HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\DirectDraw with the same registry key
The value you pick needs to be a hex value, so if you wanted 75 Hz forced you would enter 4B as the value for the registry key.
If you don't want to use a program (or it doesn't work), then use this registry entry.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\DirectDraw]
"ForceRefreshRate"=dword:00000055
Note: dword value of 55 = 85 Hz
Change dword value to 4b for 75 Hz
Change dword value to 3c for 60 Hz
Change dword value to 64 for 100 Hz
[/quote]
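For 110 Hz specifically, 110 decimal is 6E in hex, so a complete .reg file would look something like the sketch below. The paths and value name come straight from the quote above; the Wow6432Node key is only needed on 64-bit Windows, per that same note.
[code]
Windows Registry Editor Version 5.00

; ForceRefreshRate = 0x6E = 110 decimal
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\DirectDraw]
"ForceRefreshRate"=dword:0000006e

; Same key under Wow6432Node for 64-bit Windows
[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\DirectDraw]
"ForceRefreshRate"=dword:0000006e
[/code]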
[quote name='danahata77' date='07 February 2011 - 04:09 PM' timestamp='1297112957' post='1190300']
I'm using the OSD of the CRT monitor itself to determine what refresh rate is currently set. It does not get automatically set to 110 Hz, but rather 85 Hz. I tried software called PowerStrip and its refresh-rate locking feature, but it resulted in screwed-up output with a lot of flickering.
[/quote]
I'm on 280.26 and had the same problem. With my old drivers the games seemed to default to the highest refresh rate available. With 280.26, games kept running 1600x1000x85 instead of my custom 110 Hz. I tried "apply res to all games" in the 3D settings panel and the games still loaded at 85 Hz. In the game I loaded a different resolution (1680x1050), thinking that would work since the only frequency shown in the nVidia drivers is my custom 1680x1050x106. Instead I got some super low refresh like 60 Hz that the glasses wouldn't even sync to. I went back to 1600x1000 in game and voilà: 1600x1000x110! Apparently you have to select the resolution in game after you set the default in the 3D driver. Fixed both Witcher and TF2 for me without any registry haxoring.
Dang. It does not save between loads. Every time I load the game I have to swap resolutions to get the 3D refresh rate to load. Anyone know how to make it stick?