I was hoping someone knew of some games that could serve as examples of fixing this problem by one of the following (a rough sketch of the first two approaches follows the list):
1) editing an .ini or some other game config file,
2) editing one or more of the game's registry entries,
3) opening a console in the game and changing the value of some cvars or something, or
4) using some other trick that I haven't thought of.
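To make options 1) and 2) concrete, here is a rough sketch of what I mean (Python; the "SkyboxDepth" setting, the paths, and the registry key are completely made up for illustration, not taken from any real game):

[code]
# Rough sketch only -- "SkyboxDepth" and the paths below are made-up
# placeholders, not settings from any particular game.
import configparser

def patch_game_ini(ini_path, section, key, value):
    """Change one value in a game's .ini-style config file (approach 1).

    value must be passed as a string, e.g. "10000".
    """
    cfg = configparser.ConfigParser()
    cfg.read(ini_path)
    if not cfg.has_section(section):
        cfg.add_section(section)
    cfg.set(section, key, value)
    with open(ini_path, "w") as f:
        cfg.write(f)

def patch_game_registry(subkey, name, value):
    """Change one DWORD value under HKEY_CURRENT_USER (approach 2, Windows only)."""
    import winreg  # standard library, Windows only
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, subkey, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)

# Hypothetical example calls:
# patch_game_ini(r"C:\Games\SomeGame\settings.ini", "Render", "SkyboxDepth", "10000")
# patch_game_registry(r"Software\SomeStudio\SomeGame", "SkyboxDepth", 10000)
[/code]

Option 3) is usually just typing a cvar name and a new value into the in-game console, so it doesn't really need a script.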
I think this specific problem is a natural consequence of games not being made entirely with 3D in mind. To save resources, the background/sky is usually just a 2D image. To render that correctly in 3D you have to place it at a fictive "infinite" depth (a proper skybox), or at least, if it stays 2D, put it at the correct depth. Once game devs embrace 3D I'm pretty sure this won't happen in future games, but for now I think we're stuck with it. :(
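Just to spell out the geometry behind the "correct depth" part (this is my own back-of-the-envelope reasoning, not something from either driver's documentation): with a parallel/off-axis stereo setup, the on-screen parallax of a point at distance Z is roughly p(Z) = e * (1 - C/Z), where e is the eye separation and C is the convergence (screen-plane) distance. At Z = C the parallax is zero, and as Z goes to infinity it approaches e, the maximum behind-the-screen separation. So a sky image that is actually drawn at a small Z, just past the level geometry, only gets a small parallax and reads as a nearby backdrop instead of a far-away sky.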
Wondering about differences in the rendering process between nvidia and iZ3D:
I think you're right, and that's too bad. My guess is that as the game's scene data goes through the rendering pipeline, the skybox image data loses its label/identity as skybox data, so by the time it reaches the hardware, nothing can recognize it as a skybox anymore. If it could, they could add controls for the depth of the skybox. This is where I think iZ3D has a little advantage: my general impression is that their driver inserts itself into the processing stream earlier than nvidia's does, though I could be wrong about that. What makes me wonder is that nvidia's driver often doesn't work well with post-processing effects like motion blur, bloom, warp effects, etc. During those effects the screen temporarily goes 2D, while the same effects work fine under iZ3D's driver. Obviously they're doing something differently. Just imagine if iZ3D and nvidia got together to make an uber-driver. :lol:
On second thought, that might not work well.
I just remembered that in Star Wars Battlefront, nvidia's driver has good sky depth and iZ3D's has bad sky depth, but I'll have to recheck that sometime. It doesn't sound right.
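Going back to the "controls to control the depth of the skybox" idea: purely as a thought experiment (this is not how either driver is actually implemented, and the is_skybox flag is exactly the label I suspect gets lost), a driver-side override could look something like this:

[code]
# Conceptual sketch only -- NOT how the nvidia or iZ3D drivers actually work;
# the function, its parameters, and the is_skybox flag are all hypothetical.

def stereo_shift(depth, eye_sep, convergence, is_skybox=False, far_plane=1.0e6):
    """Horizontal per-eye shift for a point at the given depth."""
    if is_skybox:
        # Hypothetical "skybox depth" override: clamp the sky to the far
        # plane so it ends up at (near-)infinite depth instead of pasted
        # just behind the visible geometry.
        depth = far_plane
    # Parallel/off-axis stereo: parallax is zero at the convergence plane
    # and approaches eye_sep as depth goes to infinity.
    return eye_sep * (1.0 - convergence / max(depth, 1e-6))

# A sky image drawn at depth 50 (just past the level geometry) vs. the override:
print(stereo_shift(50.0, eye_sep=0.06, convergence=30.0))                  # small shift
print(stereo_shift(50.0, eye_sep=0.06, convergence=30.0, is_skybox=True))  # ~full eye separation
[/code]

The point being: if the skybox label doesn't survive down to wherever the separation is applied, the driver has nothing to key an override like this on, which is probably why we only get global separation/convergence controls instead.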
Thanks for replying anyway though.
Mb: Asus P5W DH Deluxe
Cpu: C2D E6600
Gpu: Nvidia 7900GT + 8800GTX
3D: 100" passive polarized projector setup + 22" iZ3D
Stereo drivers: iZ3D, TriDef Ignition, and nvidia old school.