Quick poll: What 3D settings do you prefer?
Depending on how much greater than 6.5cm the separation is, and how far away the viewer is, it might require only a minute amount of divergence. Perhaps it's just not enough to cause you discomfort.
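
To put a rough number on "minute", here's a quick Python sketch of the simple symmetric geometry. The 6.5cm eye separation, the 8cm on-screen separation and the 2.5m viewing distance are only example values I've picked for illustration, not anything measured:

[code]
import math

def divergence_deg(separation_cm, eye_cm=6.5, view_dist_cm=250.0):
    # Outward rotation each eye needs (degrees) to fuse a point drawn with an
    # on-screen separation larger than the eye separation. Symmetric viewing:
    # each eye handles half of the excess separation over the viewing distance.
    excess = max(separation_cm - eye_cm, 0.0)
    return math.degrees(math.atan((excess / 2.0) / view_dist_cm))

# e.g. 8 cm of on-screen separation viewed from 2.5 m away
print(round(divergence_deg(8.0), 3), "degrees per eye")   # ~0.17 degrees
[/code]

Even 1.5cm of extra separation at that distance only asks each eye to rotate outwards by well under a quarter of a degree, which could easily go unnoticed.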

#16
Posted 02/15/2013 04:24 AM   
So annoying, forum just jacked up and lost my comment, here goes again!

Hey Airion, thanks for your reply. I see where you're coming from, and I have thought the same, but there is a disparity between this logic and the maths that I have not completely ironed out. Admittedly I've only done the maths in my head, not on paper, but your method of working out infinite depth does not hold up to scrutiny (to me right now anyway; I could be wrong ;-).

There is a great article found here:
http://www.dur.ac.uk/n.s.holliman/Presentations/EI4297A-07Protocols.pdf

It has some great findings about 3D with regard to the user, but right on the front page you can find this simple conclusion:

"The key variable that must be determined for any stereoscopic image creation is the camera separation, as this directly affects the amount of depth a viewer perceives in the final image. A number of approaches have been tried, including exact modeling of the user’s eye separation. However, depending on the scene content, this may capture very large or small image
disparities (the difference between corresponding scene points in the left and right image, which the brain interprets as depth) that can produce too much or too little perceived depth on the target display. Using exact eye spacing is only reasonable for the orthoscopic case, where the object size and depth matches the target display size and comfortable depth range. In practice this
is rarely the case and the camera separation is often determined by trial and error, which is tedious and can easily result in an image suited only to the creator’s binocular vision."

Anyway, all I am positive about is that you must work out for yourself, via trial and error with the settings, what the maximum perceived depth on screen is. Setting 100% depth in the Nvidia panel and measuring your eye separation help towards that, but they are not the be-all and end-all.

*This message leaves out the comparison between a screen at the edge of the universe that takes up the same amount of visual space as the screen on your monitor, because to be honest, I think the maths gets more complicated than first thought and probably doesn't help very much at all!
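
To illustrate the paper's point about scene content, here is a very simplified parallel-camera model in Python. This is my own sketch, not the paper's equations, and the 90-degree FOV, 2m convergence distance and 60cm display width are made-up numbers:

[code]
import math

def screen_disparity_cm(Z_m, cam_sep_m, conv_dist_m, h_fov_deg, screen_width_cm):
    # On-screen disparity (cm) of a point at depth Z_m metres, for parallel cameras
    # separated by cam_sep_m whose zero-disparity plane is set to conv_dist_m by
    # image shift. Positive = uncrossed disparity, i.e. behind the screen.
    half_width = math.tan(math.radians(h_fov_deg) / 2.0)
    frac = cam_sep_m / (2.0 * half_width) * (1.0 / conv_dist_m - 1.0 / Z_m)
    return frac * screen_width_cm          # fraction of image width -> physical cm

# Cameras at exact eye spacing (6.5 cm), 90 degree FOV, converged at 2 m,
# shown on a 60 cm wide display:
for Z in (2.5, 10.0, 1e6):                 # metres; 1e6 stands in for "infinity"
    print(Z, round(screen_disparity_cm(Z, 0.065, 2.0, 90.0, 60.0), 2), "cm")
[/code]

With those particular numbers the far plane only reaches about 1cm of disparity (too little depth on this display), while a much larger screen or a scene with very close objects scales the disparity right up, which is the "too much or too little" problem the quote describes.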

OS: Win 8 CPU: I7 4770k 3.5GHz GPU: GTX 780ti

#17
Posted 02/15/2013 10:56 AM   
Thanks for the link, Foreverseeking, it's a good read. However I think you've made a simple mistake: you've mistaken camera separation for physical left/right separation on the display.

Take for example when people make 3D photographs of large landscapes such as the Grand Canyon. Often the cameras are separated by many meters (otherwise they won't capture any disparity). This doesn't mean the left and right images are separated by meters when viewed on a display.

Most of the paper is talking about the limits of comfort rather than the theoretical limits. For those of us used to viewing 3D, comfort isn't really the issue, as our eyes are used to 3D images. There is one part that talks about the limits though (middle of page 4, emphasis mine):

"An equation to calculate camera separation is derived which brings points infinitely far from the viewer to the furthest fusible distance of a display. The maximum depth plane, and hence the furthest fusible distance, is taken to be a point giving a screen disparity of the modeled eye separation (smaller than the true viewer eye separation) when viewed by the user."

...which I think is what I'm saying. I'd just go a little further and say that the screen disparity for objects at infinity can be equal to the true viewer eye separation, because comfort isn't an issue for us.
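
To make that concrete, here's the viewer-side geometry as a small Python sketch (two straight rays and similar triangles; the 6.5cm eye separation and 60cm viewing distance are just example values):

[code]
def perceived_depth_cm(disparity_cm, eye_cm=6.5, view_dist_cm=60.0):
    # Distance from the viewer at which the left/right screen points fuse.
    # Similar triangles on two straight rays, symmetric viewing assumed.
    # Disparity equal to the eye separation means parallel rays (infinity);
    # anything larger would force the eyes to diverge, so return None for both.
    if disparity_cm >= eye_cm:
        return None
    return eye_cm * view_dist_cm / (eye_cm - disparity_cm)

for d in (0.0, 3.0, 6.0, 6.5, 7.0):        # cm of on-screen disparity
    print(d, perceived_depth_cm(d))        # 60 (at the screen), ~111, 780, None, None
[/code]

Zero disparity lands on the screen plane, the perceived distance blows up as the disparity approaches the eye separation, and at exactly the eye separation the rays are parallel, which is the infinity case I'm describing.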

#18
Posted 02/15/2013 12:11 PM   
I was trawling the net to see if anyone has had this argument before; seems like you have, Airion ;-)

https://forums.geforce.com/default/topic/519243/3d-vision/maximum-depth-hack-go-beyond-100-/4/


As I said, I understand everything you're logically saying, but it doesn't compute with the maths, which means I have a flaw in my logic or my maths... or perhaps even both!

Your eyes do not see in a straight line; your focal point is an arc. Therefore, as distance increases, you can focus on a point wider than your ocular separation without any divergence of your eyes.

Going back to my deleted universe monitor experiment, I feel that in a stripped-down form it could actually be useful as a thought experiment.

Ok, so we have a monitor at the edge of the known universe. It takes up the exact same area of our vision as a 27-inch monitor 2 feet away from us. Both monitors are 3D active displays with infinite resolution (no pixel depth distortion).

We set up both screens to give a similar user experience. If we have a black cube in 3D, at infinite depth, on the normal monitor, then we have:

left eye: black cube left of centre
right eye: black cube right of centre.

On the edge-of-the-universe monitor, you could leave the object where it is; it will still seem very, very far away, but it would never be at 90 degrees, as the edge of the known universe is not infinite.

Therefore, by putting the left and right images of the black cube on the universe screen at the same point relative to the normal monitor we have:

left eye: black cube left of centre
right eye: black cube right of centre

That distance will be millions or billions of light years across, not 6.5cm, and it will equate to 90 degrees in your perceived vision. Let's not forget, your eyes do not focus along a straight, one-dimensional line; the focal point is a three-dimensional cone. Surely that affects the outcome?

But this is where my logic and maths separate, and I'm sure it's because we are faking 3D here. The real processor of 3D is the brain, taking 2D images with no retinal focus differences and turning them into 3D. That is incredibly different from how we normally perceive depth, and is why we can create the illusion of depth and convergence on a 2D panel. Likewise, increasing the ocular distance for a distant object on a 2D screen does not have the same effect as in real 3D, as there is a different set of visual cues at work.

But whether I am wrong or right, or whether you have it bang on the head, I'm pretty sure that we can all agree (hopefully) that the best way to figure out your preferred depth and convergence is to have a good old fiddle with the advanced controls until you go, "yeah baby!"

Thoughts?

OS: Win 8 CPU: I7 4770k 3.5GHz GPU: GTX 780ti

#19
Posted 02/15/2013 01:17 PM   
[quote="foreverseeking"]I was trawling the net to see if anyone has had this argument before, seems like you have Airion ;-) https://forums.geforce.com/default/topic/519243/3d-vision/maximum-depth-hack-go-beyond-100-/4/ [/quote] Thanks for finding that! I was looking for this link which I previously posted in that thread: [url]http://www.siggraph.org/publications/newsletter/volume/stereoscopic-3d-film-and-animationgetting-it-right[/url] This more explicitly deals with the 6.5cm limit, which is 2.5 inches as the article discusses. Significantly, it deals with the typical movie theater environment where the viewer is sitting much further away from a much larger screen than even I have with a projector at home. The author is a 20 year veteran in 3D production. [quote="foreverseeking"]Therefore, by putting the left and right images of the black cube on the universe screen at the same point relative to the normal monitor we have: left eye: black cube left of centre right eye: black cube right of centre that distance will be millions/ billions of light years across, not 6.5cm. That distance will equate to 90 degrees to your perceived vision. Lets not forget, your eyes do not focus in a straight, 1 dimensional line. It's focal point is a 3 dimensional cone. Surely that affects the outcome?[/quote] I think here you're still thinking in terms of what a single camera sees. Specifically, the rightmost part that a right camera (or eye) can see compared to what the leftmost part that a left camera (or eye) can see. Typically 3D cameras look straight ahead the whole time, and then the two images that are captured are adjusted in order to be displayed on a 3D display satisfactorily. Our eyes converge, unlike 3D cameras. Both eyes point exactly to the tiny thing we want to look at. Peripheral vision doesn't matter. When we look at the moon for example. the right eye doesn't look at the right edge of the moon and the left eye the left edge of the moon. Both eyes look at either the right edge together, or the left edge together.
foreverseeking said:I was trawling the net to see if anyone has had this argument before; seems like you have, Airion ;-)
https://forums.geforce.com/default/topic/519243/3d-vision/maximum-depth-hack-go-beyond-100-/4/


Thanks for finding that! I was looking for this link which I previously posted in that thread:

http://www.siggraph.org/publications/newsletter/volume/stereoscopic-3d-film-and-animationgetting-it-right

This more explicitly deals with the 6.5cm limit, which is 2.5 inches as the article discusses. Significantly, it deals with the typical movie theater environment where the viewer is sitting much further away from a much larger screen than even I have with a projector at home. The author is a 20 year veteran in 3D production.

foreverseeking said:Therefore, by putting the left and right images of the black cube on the universe screen at the same point relative to the normal monitor we have:

left eye: black cube left of centre
right eye: black cube right of centre

That distance will be millions or billions of light years across, not 6.5cm, and it will equate to 90 degrees in your perceived vision. Let's not forget, your eyes do not focus along a straight, one-dimensional line; the focal point is a three-dimensional cone. Surely that affects the outcome?


I think here you're still thinking in terms of what a single camera sees. Specifically, the rightmost part that a right camera (or eye) can see compared to what the leftmost part that a left camera (or eye) can see. Typically 3D cameras look straight ahead the whole time, and then the two images that are captured are adjusted in order to be displayed on a 3D display satisfactorily. Our eyes converge, unlike 3D cameras. Both eyes point exactly to the tiny thing we want to look at. Peripheral vision doesn't matter.

When we look at the moon, for example, the right eye doesn't look at the right edge of the moon and the left eye the left edge of the moon. Both eyes look at either the right edge together, or the left edge together.

#20
Posted 02/15/2013 01:49 PM   
But then, and again I accept there is every chance you are correct, what about this example?

Look out the window. Now hold your fingers in front of your eyes, now move them away and continue that focus point. Your eyes will be focusing on the horizon, with a distance of many hundreds of metres across. By your understanding, our eyes would now be diverging, but that clearly isn't the case.

and also

Airion said:When we look at the moon, for example, the right eye doesn't look at the right edge of the moon and the left eye the left edge of the moon. Both eyes look at either the right edge together, or the left edge together.


Doesn't that negate your argument about depth at infinity? Infinite depth requires our eyes to focus at 90 degrees. If your eyes are meeting at a point (converging), that is not infinity, and so we would need to increase the separation if we wanted infinity.

Another example: using two sticks at ocular distance, held next to the eyes they will be in the centre of your vision. The further they are moved away, the more they will retreat into the right of the left eye and the left of the right eye.

I think we have both said everything, and I'm still not convinced either way. I think there are truths in what we both say, but I still accept I might be completely wrong! Does anyone else have any input on this?

OS: Win 8 CPU: I7 4770k 3.5GHz GPU: GTX 780ti

#21
Posted 02/15/2013 02:01 PM   
[quote="foreverseeking"]Look out the window. Now hold your fingers in front of your eyes, now move them away and continue that focus point. Your eyes will be focusing on the horizon, with a distance of many hundreds of metres across. By your understanding, our eyes would now be diverging, but that clearly isn't the case.[/quote] As a rule, our eyes [i]never[/i] diverge in real life even a little tiny bit, right? Honestly I don't follow your experiment here. I tried it a few times but I don't know what you mean. Maybe you're confusing stereovision with perspective. I think you've got this exactly backwards, for example: [quote="foreverseeking"]Another example, using two sticks, at ocular distance and next to the eye it will be in the centre of your vision. The further they are moved away the more they will retreat into the right of the left eye and the left of the right eye.[/quote] Test it with one eye at a time rather than both open. Put your left finger over your left eye. Close your left eye and open your right. Your finger will appear to the far left with your right eye open. Move your finger straight, further away, and it appears to move more to the center as viewed through the same right eye. This is perspective at work. [quote="foreverseeking"]Does not that negate your argument about infinity at depth? Infinity depth requires are eyes to focus at 90 degrees. If your eyes are meeting at a point (converging) that is not infinity, and so therefore we would need to increase the separation if you wanted infinity.[/quote] Our eyes are 6.5cm apart. The moon (either edge) is 381,000,000,000cm away from us. Mathematically the difference between the two isn't literally infinite, but it practically is. Do the math on that, and I'm sure it will be 90 degrees within any reasonable decimal point. Certainly given the biological limits of our visual acuity, we get nothing out of the 6.5cm between our eyes when looking that far away. Still, the moon is close enough and big enough that our eyes have to move, together, locked in parallel, to look at one edge or another. If the two edges were indistinguishable, it would appear as a star, a single point of light.
foreverseeking said:Look out the window. Now hold your fingers in front of your eyes, now move them away and continue that focus point. Your eyes will be focusing on the horizon, with a distance of many hundreds of metres across. By your understanding, our eyes would now be diverging, but that clearly isn't the case.


As a rule, our eyes never diverge in real life even a little tiny bit, right? Honestly I don't follow your experiment here. I tried it a few times but I don't know what you mean. Maybe you're confusing stereovision with perspective.

I think you've got this exactly backwards, for example:

foreverseeking said:Another example: using two sticks at ocular distance, held next to the eyes they will be in the centre of your vision. The further they are moved away, the more they will retreat into the right of the left eye and the left of the right eye.


Test it with one eye at a time rather than both open. Put your left finger over your left eye. Close your left eye and open your right. Your finger will appear to the far left with your right eye open. Move your finger straight, further away, and it appears to move more to the center as viewed through the same right eye. This is perspective at work.

foreverseeking said:Doesn't that negate your argument about depth at infinity? Infinite depth requires our eyes to focus at 90 degrees. If your eyes are meeting at a point (converging), that is not infinity, and so we would need to increase the separation if we wanted infinity.


Our eyes are 6.5cm apart. The moon (either edge) is about 38,100,000,000cm away from us. Mathematically the difference between the two isn't literally infinite, but it practically is. Do the math on that, and I'm sure it will be 90 degrees within any reasonable decimal point. Certainly given the biological limits of our visual acuity, we get nothing out of the 6.5cm between our eyes when looking that far away.

Still, the moon is close enough and big enough that our eyes have to move, together, locked in parallel, to look at one edge or another. If the two edges were indistinguishable, it would appear as a star, a single point of light.
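
For anyone who wants the actual number, here's the sum (6.5cm eyes and the roughly 381,000km figure above; Python just to save typing):

[code]
import math

EYE_CM  = 6.5
MOON_CM = 3.81e10                                   # ~381,000 km

conv_rad = 2 * math.atan((EYE_CM / 2) / MOON_CM)    # total convergence angle
print(math.degrees(conv_rad) * 3600, "arcseconds")  # ~3.5e-5 arcseconds
print(90 - math.degrees(conv_rad) / 2, "degrees per eye")  # 89.999999995...
[/code]

About 35 millionths of an arcsecond of convergence, far below anything human stereo vision can resolve, so for all practical purposes the eyes are parallel.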

#22
Posted 02/15/2013 02:37 PM   
Has anyone got a projector?

I thought the separation was still 6.5cm or whatever at 100%. I have only seen one movie in 3D, Christmas Carol, and I only recall a slight blurring of separation - probably within 6.5cm.


Even on my 23" monitor, if I am further away from the screen, the depth seems greater.

Lord, grant me the serenity to accept the things I cannot change, the courage to change the things I can, and the wisdom to know the difference.
-------------------
Vitals: Windows 7 64bit, i5 2500 @ 4.4ghz, SLI GTX670, 8GB, Viewsonic VX2268WM

Handy Driver Discussion
Helix Mod - community fixes
Bo3b's Shaderhacker School - How to fix 3D in games
3dsolutionsgaming.com - videos, reviews and 3D fixes

#23
Posted 02/15/2013 03:46 PM   
OK, fair enough. I've sat down with a pad and pen, and even taking the focal point to be a 1-degree cone (I couldn't find exactly what it is), the maths does stack up against my original thoughts. I should have just done that in the first place, but meh, this was far more fun! Also, what has come to light in the maths is the conclusion that for any object your eyes converge on (i.e. everything), the further back you are from the screen, the further away it will appear, due to the increase of the viewing angle (e.g. something on screen close up may be 45 degrees to each eye, but from further away it is 50 degrees).
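
For the record, here's the little model I ended up scribbling, translated into Python (same two-ray idea as Airion's, with 6.5cm eyes and 3cm of on-screen disparity as purely illustrative numbers):

[code]
import math

EYE_CM = 6.5

def eye_angle_and_depth(view_dist_cm, disparity_cm):
    # For a symmetric pair of on-screen points, return the inward rotation of
    # each eye from straight ahead (degrees) and the distance at which the two
    # rays fuse (cm). Larger viewing distance -> smaller rotation -> more depth.
    inward_deg = math.degrees(math.atan(((EYE_CM - disparity_cm) / 2.0) / view_dist_cm))
    fused_cm = EYE_CM * view_dist_cm / (EYE_CM - disparity_cm) if disparity_cm < EYE_CM else float("inf")
    return inward_deg, fused_cm

for V in (60, 120, 240):                    # sit further and further back (cm)
    print(V, eye_angle_and_depth(V, 3.0))
[/code]

Doubling the viewing distance roughly halves the inward rotation (so the angle each eye makes with the line between your eyes creeps towards 90 degrees) and doubles the apparent distance, while a disparity of exactly 6.5cm stays at infinity wherever you sit.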

So all in all it just goes to show how little I know ;-)

Sure seems weird that people are finding values much higher than 6.5cm better for simulating huge distances when viewing on large screens, but I'm willing to let this one rest.

Sorry for completely and utterly hijacking this thread andysonofbob, you may have it back now!

OS: Win 8 CPU: I7 4770k 3.5GHz GPU: GTX 780ti

#24
Posted 02/15/2013 04:09 PM   
[quote="andysonofbob"]Has anyone got a projector?I thought the seperation was still 6.5cm or whatever at 100%. I have only seen one movie in 3D Christmas Carol and I only recall a slight blurring of separation - probably within 6.5cm.[/quote] I think this will explain a lot better than I can: [url]http://www.siggraph.org/publications/newsletter/volume/stereoscopic-3d-film-and-animationgetting-it-right[/url] Basically, with movies you can adjust neither depth nor convergence, only the size of the screen and your distance from it. When you get them on 3D Blu-ray, the depth and convergence are baked in and designed for a theater sized screen and viewing distance. The best you can do is hope to replicate that. With Nvidia, you can adjust both depth and convergence for games, so that's a huge plus. But with a projector your PC has no way of knowing how large you screen is. Move your projector forward, the image gets smaller. Move it back, the image gets larger. Zoom in, zoom out, etc, this changes the size and physical separation of objects, but there's no way the Nvidia control panel would know. That's why you can't just set it at 100%. You need to break out a tape measure. You might need more depth, you might (probably) need less depth to calibrate it properly.
andysonofbob said:Has anyone got a projector? I thought the separation was still 6.5cm or whatever at 100%. I have only seen one movie in 3D, Christmas Carol, and I only recall a slight blurring of separation - probably within 6.5cm.


I think this will explain a lot better than I can: http://www.siggraph.org/publications/newsletter/volume/stereoscopic-3d-film-and-animationgetting-it-right

Basically, with movies you can adjust neither depth nor convergence, only the size of the screen and your distance from it. When you get them on 3D Blu-ray, the depth and convergence are baked in and designed for a theater sized screen and viewing distance. The best you can do is hope to replicate that.

With Nvidia, you can adjust both depth and convergence for games, so that's a huge plus. But with a projector your PC has no way of knowing how large your screen is. Move your projector forward, the image gets smaller. Move it back, the image gets larger. Zoom in, zoom out, etc.; this changes the size and physical separation of objects, but there's no way the Nvidia control panel would know. That's why you can't just set it at 100%. You need to break out a tape measure. You might need more depth, you might (probably) need less depth to calibrate it properly.
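
A back-of-the-envelope illustration of that last point. Big assumption on my part: I'm treating the driver's separation setting as ending up as some fixed fraction of the image width (the real mapping isn't something the control panel tells you), and the 2% figure and screen widths below are invented:

[code]
def on_screen_separation_cm(fraction_of_width, screen_width_cm):
    # Physical left/right separation produced by the same relative setting
    # on displays of different physical size.
    return fraction_of_width * screen_width_cm

for width_cm in (50, 110, 250):    # roughly a 23" monitor, a ~50" TV, a projected image
    print(width_cm, on_screen_separation_cm(0.02, width_cm), "cm")
[/code]

The same 2% is 1cm on the monitor but 5cm on the projected image, so the only way to know what you're actually getting is to measure the separation of a distant object on the screen itself.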

#25
Posted 02/15/2013 04:13 PM   
[quote="foreverseeking"] Also, what has come to light in the maths, is the conclusion that for any object your eyes converge on (aka everything) the further back you are away from the screen, the further away they will appear due to increase of the viewing angle (e.g. something on screen close up may be 45 degrees to each eye, but further away is 50 degrees.)[/quote] Exactly, the further back you are the more pop out seems to pop out, and the deeper depth appears to be. [i]Except[/i] objects at infinity, which should appear at infinity no matter where you are. This is recommended for movies, as they're calibrated to seen at a distance. At home this comes at the price of field of view, which I feel is as important for immersion, so I ignore the recommendations when watching movies and sit as close as I can! As for why people enjoy games with separation beyond 6.5 cm, I figure it's because it expands depth in the foreground/middleground where most viewing is done. I've tried this myself in Assassin's Creed Brotherhood, and it was pretty impressive jumping from one building to another with the distance exaggerated. However, the infinity point seemed to be reached too quickly, as if objects which should appear closer were forced into the distance. I really don't know how the brain deals with divergence, as it's totally unnatural. At least the focus/accommodation (convergence) mismatch we get with stereoscopic 3D can be found in the natural world with reflections on water or in mirrors. Maybe our brains can process extra depth with divergence, but probably not. Most likely those people playing with greater than 6.5cm separation simply aren't diverging their eyes enough for it to make much of a difference. Indeed comfort is what matters most at the end of the day.
foreverseeking said: Also, what has come to light in the maths is the conclusion that for any object your eyes converge on (i.e. everything), the further back you are from the screen, the further away it will appear, due to the increase of the viewing angle (e.g. something on screen close up may be 45 degrees to each eye, but from further away it is 50 degrees).


Exactly, the further back you are, the more pop-out seems to pop out and the deeper depth appears to be. Except objects at infinity, which should appear at infinity no matter where you are. This is recommended for movies, as they're calibrated to be seen at a distance. At home this comes at the price of field of view, which I feel is just as important for immersion, so I ignore the recommendations when watching movies and sit as close as I can!

As for why people enjoy games with separation beyond 6.5cm, I figure it's because it expands depth in the foreground/middleground where most viewing is done. I've tried this myself in Assassin's Creed Brotherhood, and it was pretty impressive jumping from one building to another with the distance exaggerated. However, the infinity point seemed to be reached too quickly, as if objects which should appear closer were forced into the distance. I really don't know how the brain deals with divergence, as it's totally unnatural. At least the focus (accommodation) and convergence mismatch we get with stereoscopic 3D can be found in the natural world with reflections on water or in mirrors. Maybe our brains can process extra depth with divergence, but probably not. Most likely those people playing with greater than 6.5cm separation simply aren't diverging their eyes enough for it to make much of a difference.

Indeed comfort is what matters most at the end of the day.

#26
Posted 02/15/2013 04:37 PM   