State of development
Is the current software the extent of what we'll see from this tech? There are many situations where things feel off: my right eye can see a piece of terrain that it would not be able to see from the same position in real life. This sort of effect forces a cross-eyed view and breaks the illusion entirely.

The other issue I've noticed is that when elements enter negative depth (toward the viewer, off the screen), they extend out too rapidly and again cause the cross-eyed view. I've managed to find a balanced depth ratio that lets close objects extend out roughly as they should, but it almost entirely negates the inward, positive-depth effect. It would be nice if negative depths were treated as if they could move past my range of view, rather than being displayed in a position that would normally be impossible to see. It's frustrating to see elements of a wall in the middle of my right eye's view that should be at the very outer limits of my left eye's peripheral range, or even completely outside my field of view.

One last pain point cropped up, though I have doubts it can be corrected. When you stand against a slightly angled wall on your left, the camera that renders the left eye shifts left for that frame, but it shifts into the wall, causing the texture to be drawn incorrectly on the object. Two images that should fuse into one between your eyes end up as different images composited on top of each other. It's painful to look at, let alone attempt to focus on.

To come back to my original question: is there any chance the drivers/software that generate the 3D effect are still being worked on, with corrections for any of the concerns above?
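For anyone curious why out-of-screen objects blow up so quickly, here is a minimal sketch of the standard stereo parallax math (this is illustrative textbook geometry, not NVIDIA's actual driver code; the separation and convergence values are made-up examples):

```python
def screen_parallax(depth, eye_sep=0.03, convergence=10.0):
    """Horizontal offset between the left- and right-eye images for a
    point at the given scene depth.

    eye_sep:     interaxial separation, in screen-width units (assumed value)
    convergence: distance of the zero-parallax (screen) plane (assumed value)

    Positive result = behind the screen; negative = in front of it.
    """
    return eye_sep * (1.0 - convergence / depth)

# Behind the screen, parallax saturates: it can never exceed eye_sep,
# no matter how far away the object is.
for d in (10.0, 20.0, 100.0):
    print(d, round(screen_parallax(d), 4))

# In front of the screen (depth < convergence), parallax is unbounded
# and diverges as depth approaches zero -- which is why near objects
# "extend too rapidly" and force a cross-eyed view.
for d in (5.0, 2.0, 1.0, 0.5):
    print(d, round(screen_parallax(d), 4))
```

This asymmetry is also why tuning one knob hurts the other: lowering separation to tame the in-front divergence shrinks the already-capped behind-screen parallax at the same time.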

#1
Posted 01/26/2009 12:09 AM   
[quote name='oversaucy' post='496591' date='Jan 25 2009, 05:09 PM']To come back to my original question, any chance the drivers/software that handle generating the 3D effects are still being worked on with corrections to any of the above concerns?[/quote]

None of these games had stereoscopic support built into them. When you watch the test setup with the NVIDIA letters and eye, it shows how, in the future, software will be able to implement this kind of support in games to come.

#2
Posted 01/26/2009 12:37 AM   