Dilemma: Many VR headset owners view wrong scaling as a hardware issue.
I keep running into comments from VR headset users saying their virtual worlds are the wrong scale. Adjusting IPD has helped a few people on the PSVR reddit, but across all headset brands, some people claim they've adjusted their IPD properly and still associate the wrong scale with a particular headset or game, rather than with convergence being off.

What I wonder is: do they lock convergence like it's done on Nvidia, or can convergence somehow be adjusted properly? Is IPD adjusted accurately on the Vive and Rift? Any ideas what's going on?


I know some of you guys have the headsets yourselves and I was hoping you could shed some light on the subject, as I was hoping to make a post on the Oculus forums about making a convergence adjustment available in the software to account for eye-distance changes, in-game FOV setting changes, wearing glasses or not, eye recess depth, the amount of fatty tissue where the headset mounts on the face, etc. But I don't even know how much of a problem this is since I don't know much about the headsets; I'm just concerned about people getting a good experience, and I know some of you have more experience adjusting depth/convergence than almost anyone else on the planet.

Here's an example: I just watched UKRifter do a comparison of the Rift and Vive, and he didn't like the Vive for Elite Dangerous because the planets looked small, so he preferred the Rift. That could be an IPD problem as well as a convergence problem, but like many people, his assumption seemed to be that this was a permanent hardware issue. It's at around 28:00 if you care to watch: https://www.youtube.com/watch?v=agrc2OmAEYg

46" Samsung ES7500 3DTV (checkerboard, high FOV as desktop monitor, highly recommend!) - Metro 2033 3D PNG screens - Metro LL filter realism mod - Flugan's Deus Ex:HR Depth changers - Nvidia tech support online form - Nvidia support: 1-800-797-6530

#1
Posted 12/01/2016 03:15 AM   
It's a software issue alone, and I believe there are a few reasons for it.

1. People have different IPD. The IPD adjustment makes it so that the lenses (and screens) are centred on the irises, but there is no adjustment for the separation the eyes will see. This means people with eyes closer together will see a large separation (high depth) and people with eyes far apart will see a small separation (low depth). This means that the devs who are even aware of this will, by default, make their game have low depth so that no one gets sick, at the cost of less 3D effect and depth. It's quite absurd, as it would have been easy to tie the IPD in to an automatic separation adjustment.

2. If you wear glasses for short-sightedness (most of us glasses wearers do), there are 2 big problems with perception:

a. There is a binocular effect, which makes things appear i. far closer than they are, and ii. flat (no real 3D effect - just like looking through binoculars) which is identical to reducing 3D Depth in real life.

b. There is an increase in FOV with glasses on (try looking at the perceived size of objects with glasses on vs glasses off i.e. edges of a screen).

Both these things combined with Rift/Vive lenses greatly reduce the 3D effect. Basically, if you have a wide distance between your eyes and you wear strong glasses for nearsightedness, you are screwed if you don't have a depth adjustment.

I am sure there are other shenanigans too, which others will know about. For example, a lot of us set our 3D Vision depth to above 100% and higher than the distance between our eyes, because it gives a better 3D effect. Strictly speaking, this should not be allowed and would be considered a taboo, but it works great in practice.

Oculus and HTC do need to allow adjustment of separation, but they likely will not listen. To give an example: people have been asking for an adjustment for super-sampling for a long time so they don't have to use the debug tool - not even a response from them. Unfortunately, when companies get big, they also get arrogant and stop listening to their userbase.

It's easily possible as even 3rd parties such as VorpX (which even works through the Oculus software) allow you to adjust separation (and convergence to an extent), in games that don't even natively support VR.

If you do broach the topic, please post a link and I'll support it.

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#2
Posted 12/01/2016 05:46 AM   
@RAGEdemon

I have myopia and astigmatism (to a moderate degree. I'm not one of those who can't see anything without glasses. Lol, I use the 3D glasses 1000 times more than my normal glasses), and I absolutely don't need my glasses when I play in 3D Vision. Even if I'm focusing on something that looks far away, if my face is at <~60cm from the monitor I see everything perfectly. With VR the screens are a few cm away from your eyes, so I assume that no one with myopia needs glasses.

When I bought the 3D Vision 2 kit my first thought was "I hope I won't need to wear glasses on top of glasses". Luckily I didn't :). Projectors would be annoying for me, however.

CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: Gainward Phoenix 1080 GLH
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com

#3
Posted 12/01/2016 06:48 AM   
I too have very mild (according to standard classification) myopia @ -2.00 and very mild astigmatism. I use a projector for which I need my glasses. I have no issue wearing 2 sets of glasses :)

When it comes to VR, the virtual screen is projected at infinity, so one still requires glasses even at my 'very mild' levels. I have found this to be the case with both my old Rift DK2 and new CV1. The Vive will likely be the same. Without my glasses, it's unplayable. Coincidentally, I have an eye test booked for Monday, the results from which I shall be using to order custom lenses for my Rift CV1 - glasses in the Rift are not comfortable at all.

Some VR games have good depth. Others, noticeably not enough depth for my taste, such as Obduction.

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#4
Posted 12/01/2016 07:13 AM   
I do not have a VR headset but these comments remind me a lot of the early 3D Vision reviews.
The issue was that lots of people new to 3D were not aware of the importance of the convergence setting, and since Nvidia kept it hidden under the advanced settings, and did not teach players about it, a lot of reviewers complained that nothing ever came out of the screen and blamed the hardware for it.
It looks like the VR companies are making the same mistake.

Passive 3D forever
110" DIY dual-projection system
2x Epson EH-TW3500 (1080p) + Linear Polarizers (SPAR)
XtremScreen Daylight 2.0
VNS Geobox501 signal converter

#5
Posted 12/01/2016 09:42 AM   

RAGEdemon said:1. People have different IPD. The IPD adjustment makes it so that the lenses (and screens) are centred to the iris, but there is not an adjustment for the separation the eye will see. This means people with eyes closer together will see a large separation (High depth) and people with eyes far apart will see a small separation (low depth). This means that the devs who are even aware of this, will by default make their game have low depth so that no-one gets sick, at the cost of less 3D effect and depth. It's quite absurd as it would have been easy to tie in the IPD to automatic separation adjustment.

Oculus and HTC do need to allow adjustment of separation but they likely will not listen. Giving an example - people have been asking for an an adjustment for super-sampling for a long time so they don't have to use the debug tool - not even a response from them. Unfortunately, when companies get big, they also get arrogant and stop listening to their userbase.


A separation adjustment that overrides the physical IPD shouldn't be necessary. Separation should be equal to the IPD. The game engine should read the IPD provided by the HMD and use that as the separation value when calculating the camera matrices. I'm a Unity developer, and as far as I know the eye offsets are updated at runtime to react to any dynamic IPD changes when using SteamVR. There should also never be a reason to adjust convergence.

Now that being said, an additional separation value could be useful if your IPD is outside of the available range for the HMD. The Vive has an IPD range of 60.4mm - 74mm; my IPD is actually 54mm (yes, I'm a freak of nature apparently), so I imagine that in my case the scaling is going to be a bit off, but it shouldn't be a big deal. I'm way more concerned about things being out of focus because my eyes are viewing the inside edge of the lenses instead of looking directly through the sweet spot. I've also got astigmatism; it's fairly mild, -1.5, but it exacerbates the issue. Using glasses increases the distance to the lenses, which in turn reduces the FOV and the size of the sweet spot, so I'm forced to wear contacts.

I've had this issue with all the HMDs I've used: Rift DK1, DK2, and the Vive. The DK1 and DK2 were completely useless since they were locked at 64mm. I got rid of my DK1 as soon as possible and had to resort to modding my DK2 by cutting up the headset, shifting the lenses closer together, and then hacking the Oculus runtime to change its separation values (I was on a mission). I was pretty thrilled when I found out that the Vive had a physical IPD adjustment, and extremely disappointed when I got it and found out that 60.4mm was the lowest it supported; I almost returned the damn thing. I still love my Vive though, and I'm hoping that they'll improve the optics in its next iteration. I'll take better optics over an increase in resolution any day of the week.
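
To make the idea concrete - this is not the actual Unity/SteamVR API, just a sketch of the underlying math with a hypothetical helper name - an engine could rebuild its per-eye camera transforms each frame from whatever IPD the headset reports:

import numpy as np

def eye_view_matrices(head_pose, ipd_m):
    # Hypothetical helper for illustration: a real runtime (OpenVR / Oculus SDK)
    # hands the application per-eye transforms directly.
    # head_pose: 4x4 head-to-world matrix; ipd_m: reported IPD in metres.
    views = []
    for half in (-ipd_m / 2.0, +ipd_m / 2.0):     # left eye, right eye
        eye_to_head = np.eye(4)
        eye_to_head[0, 3] = half                  # shift along the head's local X axis
        eye_pose = head_pose @ eye_to_head        # eye-to-world transform
        views.append(np.linalg.inv(eye_pose))     # view matrix = inverse of camera pose
    return views

# Example: identity head pose, 63.5mm IPD read back from the headset this frame.
left_view, right_view = eye_view_matrices(np.eye(4), 0.0635)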

Like my work? You can send a donation via Paypal to sgs.rules@gmail.com

Windows 7 Pro 64x - Nvidia Driver 398.82 - EVGA 980Ti SC - Optoma HD26 with Edid override - 3D Vision 2 - i7-8700K CPU at 5.0Ghz - ASROCK Z370 Ext 4 Motherboard - 32 GB RAM Corsair Vengeance - 512 GB Samsung SSD 850 Pro - Creative Sound Blaster Z

#6
Posted 12/01/2016 07:10 PM   
[quote="Libertine"]I keep running into comments from VR headsets users saying their virtual worlds are the wrong scale. Adjusting IPD has helped a few people on the PSVR reddit, but in the case of all headset brands, sometimes people claim they've adjusted their IPD properly and seem to associate wrong scale with a particular headset or game, rather than convergence being off. What i wonder is, do they lock convergence like its done on Nvidia? Or can convergence somehow be adjusted properly? Is IPD adjusted accurately on the Vive and Rift? Any ideas of whats going on?[/quote] The idea for VR is actually completely different. What we think of as separation and convergence are [i]both [/i]'locked' on VR. Because their goal is to give a 1:1 representation of the image. The goal of VR is to give you immersion and scale, and a life-like feel. So, no adjustments can be made by the end-user. Games can play with scale a different way though, by varying scale, but not convergence. In the game Chronos, there are a couple of terrific sections where you go through a mirror to a different location, and your size changes. In one you are giant, the other, small enough to fit in a bookcase. Both seem really natural and logical and don't change the strain on your eyes like convergence can. The reason I bring this up, is that it is possible to get a toyification effect, it's just not under your control. As far as changing the IPD to change scale or depth- no, that's not how it works. If people are saying that changed the scale in game, they are confused. With 3D Vision, we can and do change the 'scale' with separation and convergence for things like extra depth, or toyification. But that idea does not translate to VR headsets. The only use for IPD adjustments on VR headsets is to give you a clearer image. Your eyes need to be in the sweet spot of the lenses, otherwise you get a blurry image. [quote="Libertine"]Heres an example, I just watched UKRifter do a comparison of the Rift and Vive, he didn't like the Vive for Elite Dangerous because the planets looked small [i][b]and so he preferred the Rift better[/b][/i]. Which could be an IPD problem as well as a convergence problem, but like many people, his assumption seemed to be that this was a permanent hardware issue. Its here around 28:00 is you care to watch.[/quote] He's wrong. It doesn't have anything to do with IPD. That specific problem for Elite Dangerous was because the Vive build was broken or created wrong. The Rift build worked correctly, but they made a mistake for the Steam/Vive build. The resolution was also broken on Vive and gave an inferior experience. I'm not sure that is fixed or not at this point.
Libertine said:I keep running into comments from VR headset users saying their virtual worlds are the wrong scale. Adjusting IPD has helped a few people on the PSVR reddit, but across all headset brands, some people claim they've adjusted their IPD properly and still associate the wrong scale with a particular headset or game, rather than with convergence being off.

What I wonder is: do they lock convergence like it's done on Nvidia, or can convergence somehow be adjusted properly? Is IPD adjusted accurately on the Vive and Rift? Any ideas what's going on?

The idea for VR is actually completely different. What we think of as separation and convergence are both 'locked' in VR, because the goal is to give a 1:1 representation of the image. The goal of VR is to give you immersion and scale, and a life-like feel. So, no adjustments can be made by the end-user.

Games can play with scale a different way though - by varying the world scale, not convergence. In the game Chronos, there are a couple of terrific sections where you go through a mirror to a different location, and your size changes. In one you are giant; in the other, small enough to fit in a bookcase. Both seem really natural and logical and don't change the strain on your eyes like convergence can. The reason I bring this up is that it is possible to get a toyification effect, it's just not under your control.


As far as changing the IPD to change scale or depth - no, that's not how it works. If people are saying that it changed the scale in game, they are confused. With 3D Vision, we can and do change the 'scale' with separation and convergence for things like extra depth or toyification. But that idea does not translate to VR headsets.

The only use for IPD adjustments on VR headsets is to give you a clearer image. Your eyes need to be in the sweet spot of the lenses, otherwise you get a blurry image.


Libertine said:Here's an example: I just watched UKRifter do a comparison of the Rift and Vive, and he didn't like the Vive for Elite Dangerous because the planets looked small, so he preferred the Rift. That could be an IPD problem as well as a convergence problem, but like many people, his assumption seemed to be that this was a permanent hardware issue. It's at around 28:00 if you care to watch.

He's wrong. It doesn't have anything to do with IPD. That specific problem for Elite Dangerous was because the Vive build was broken or created wrong. The Rift build worked correctly, but they made a mistake for the Steam/Vive build. The resolution was also broken on the Vive and gave an inferior experience. I'm not sure whether that has been fixed at this point.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#7
Posted 12/02/2016 12:01 AM   
[quote="masterotaku"]I have myopia and astigmatism (to a moderate degree. I'm not one of those who can't see anything without glasses. Lol, I use the 3D glasses 1000 times more than my normal glasses), and I absolutely don't need my glasses when I play in 3D Vision. Even if I'm focusing on something that looks far away, if my face is at <~60cm from the monitor I see everything perfectly. With VR the screens are a few cm away from your eyes, so I assume that no one with myopia needs glasses. When I bought the 3D Vision 2 kit my first thought was "I hope I won't need to wear glasses on top of glasses". Luckily I didn't :). Projectors would be annoying for me, however.[/quote] That works because you vision is good enough to see to 60 cm without too much trouble. The added stereo focus gives you a visual quality boost too, so even if it's a bit fuzzier than normal, you get a good experience. For VR, I see people mention this a lot, that the lenses are very close to your eyes. That doesn't matter at all, because they are lenses. The focal point is at distance. There is some argument about what it actually is. For Rift I think it's about 2M, for Vive, I think it might be set at infinity. The way to think about it is that the lenses move the screen to 2M or infinity away, not that they are up close. The fact that the screen is so close means the lenses have to be super strong however. Strong enough that if you leave them in the sun, they'll set your screen on fire.
masterotaku said:I have myopia and astigmatism (to a moderate degree. I'm not one of those who can't see anything without glasses. Lol, I use the 3D glasses 1000 times more than my normal glasses), and I absolutely don't need my glasses when I play in 3D Vision. Even if I'm focusing on something that looks far away, if my face is at <~60cm from the monitor I see everything perfectly. With VR the screens are a few cm away from your eyes, so I assume that no one with myopia needs glasses.

When I bought the 3D Vision 2 kit my first thought was "I hope I won't need to wear glasses on top of glasses". Luckily I didn't :). Projectors would be annoying for me, however.

That works because your vision is good enough to see to 60cm without too much trouble. The added stereo focus gives you a visual quality boost too, so even if it's a bit fuzzier than normal, you get a good experience.

For VR, I see people mention this a lot, that the lenses are very close to your eyes. That doesn't matter at all, because they are lenses. The focal point is at a distance. There is some argument about what it actually is. For the Rift I think it's about 2m; for the Vive, I think it might be set at infinity.

The way to think about it is that the lenses move the screen to 2M or infinity away, not that they are up close. The fact that the screen is so close means the lenses have to be super strong however. Strong enough that if you leave them in the sun, they'll set your screen on fire.
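
For the curious, a quick thin-lens calculation shows why a screen a few centimetres from your eye can appear metres away. The focal length and screen distance below are made-up illustrative numbers, not actual Rift/Vive specs:

def virtual_image_distance(focal_mm, screen_mm):
    # Thin-lens approximation: 1/f = 1/d_obj - 1/d_img, with the virtual image
    # on the same side as the screen. A screen just inside the focal length
    # produces a virtual image far away; at the focal length it goes to infinity.
    if screen_mm >= focal_mm:
        return float('inf')   # at (or beyond) the focal length - beyond it the
                              # image would be real, which is not the headset case
    return screen_mm * focal_mm / (focal_mm - screen_mm)

print(virtual_image_distance(40.0, 39.0))   # ~1560mm: the screen "appears" ~1.6m away
print(virtual_image_distance(40.0, 40.0))   # inf: relaxed eyes, focused at infinity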

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#8
Posted 12/02/2016 12:04 AM   
[quote="RAGEdemon"]It's a software issue alone, and I believe there are a few reasons for it. 1. People have different IPD. The IPD adjustment makes it so that the lenses (and screens) are centred to the iris, but there is not an adjustment for the separation the eye will see. This means people with eyes closer together will see a large separation (High depth) and people with eyes far apart will see a small separation (low depth). This means that the devs who are even aware of this, will by default make their game have low depth so that no-one gets sick, at the cost of less 3D effect and depth. It's quite absurd as it would have been easy to tie in the IPD to automatic separation adjustment.[/quote] No, this is not correct. Everyone sees the same depth in VR, no matter their IPD. The software is generating the same image regardless of the hardware IPD adjustment. In the SDK it is possible to do dynamic IPD adjustments, but the focus of that is to clarify the image by altering the distortion shader, not changing the perceived separation. As a general idea, they want every user to see the same 1:1 image, so all the software and hardware are setup to do that. I don't think that anyone sees a different scale in VR, and there is no difference as to depth. Infinity/max separation is always the same. [quote="RAGEdemon"]2. If you wear classes, for short sighted people (most of us glasses wearers), there are 2 big problems with perception: a. There is a binocular effect, which makes things appear i. far closer than they are, and ii. flat (no real 3D effect - just like looking through binoculars) which is identical to reducing 3D Depth in real life. b. There is an increase in FOV with glasses on (try looking at the perceived size of objects with glasses on vs glasses off i.e. edges of a screen). Both these things combined with Rift/Vive lenses greatly reduce the 3D effect. Basically, if you have a wide distance between your eyes and you wear high power glasses for short nearsightedness, you are screwed if you don't have a depth adjustment.[/quote] There is definitely a binocular effect because of wearing glasses. I'm a glasses/contacts wearer as well. -4.0/-3.75. This is the effect that you might remember after getting new glasses, where everything is distorted and wrong for a few days- until your brain adapts, and right-sizes everything. Unless you are really looking for it once you've adapted, you don't even think about the effect. You can bring this back by wearing your glasses backwards. You can get used to that too. In VR, optically, it's identical to real life. That's the primary engineering goal for them. That's why the headsets are at infinity, or far enough way for your eyes to be relaxed, and so that the disconnect of 3D images from focus point is comfortable. When we crank up the 3D on a close monitor, it's a strain on our eyes that we get used, or build up muscle strength. This is much easier for projector use, or VR. But, as far as reducing the 3D effect? I don't think that happens. The 3D that I see is the same that anyone else sees. The goal is to have it 'life-size', and people seeing the Tuscany house demo don't see the house as smaller or bigger based on IPD or glasses. They can report a change if the height is wrong. If you are a 6 foot human, and the height is set to 5 feet, it will be weird, you will feel wrong and scale will feel wrong. It's important to set the height of the in-game camera/eye. 
[quote="RAGEdemon"]I am sure there are other shenanigans too, which others will know about. For example, a lot of us set our 3D Vision depth to above 100% and higher than the distance between our eyes, because it gives a better 3D effect. Strictly speaking, this should not be allowed and would be considered a taboo, but it works great in practice. Oculus and HTC do need to allow adjustment of separation but they likely will not listen. Giving an example - people have been asking for an an adjustment for super-sampling for a long time so they don't have to use the debug tool - not even a response from them. Unfortunately, when companies get big, they also get arrogant and stop listening to their userbase. It's easily possible as even 3rd parties such as VorpX (which even works through the Oculus software) allow you to adjust separation (and convergence to an extent), in games that don't even natively support VR. If you do broach the topic, please post a link and I'll support it.[/quote] I don't think they will ever allow adjustment to separation or convergence. They go to great lengths to keep it consistent and 1:1. That's pretty much the entire point of VR. I'd agree it would be fun as an enthusiast to be able to do hyper or hypo stereo in VR, but they aren't going to add it as a feature, because it's not the 'reality' part of 'virtual reality'. VorpX can allow you do this, and in fact it's the opposite for VorpX. It's pretty much [i]always [/i]wrong, and getting to 1:1 is a trial. Bizarre distortion, wrong scale, you name it. Very flexible, but not a great experience.
RAGEdemon said:It's a software issue alone, and I believe there are a few reasons for it.

1. People have different IPD. The IPD adjustment makes it so that the lenses (and screens) are centred to the iris, but there is not an adjustment for the separation the eye will see. This means people with eyes closer together will see a large separation (High depth) and people with eyes far apart will see a small separation (low depth). This means that the devs who are even aware of this, will by default make their game have low depth so that no-one gets sick, at the cost of less 3D effect and depth. It's quite absurd as it would have been easy to tie in the IPD to automatic separation adjustment.

No, this is not correct. Everyone sees the same depth in VR, no matter their IPD.

The software is generating the same image regardless of the hardware IPD adjustment. In the SDK it is possible to do dynamic IPD adjustments, but the focus of that is to clarify the image by altering the distortion shader, not changing the perceived separation.

As a general idea, they want every user to see the same 1:1 image, so all the software and hardware are set up to do that. I don't think that anyone sees a different scale in VR, and there is no difference as to depth. Infinity/max separation is always the same.


RAGEdemon said:2. If you wear glasses for short-sightedness (most of us glasses wearers do), there are 2 big problems with perception:

a. There is a binocular effect, which makes things appear i. far closer than they are, and ii. flat (no real 3D effect - just like looking through binoculars) which is identical to reducing 3D Depth in real life.

b. There is an increase in FOV with glasses on (try looking at the perceived size of objects with glasses on vs glasses off i.e. edges of a screen).

Both these things combined with Rift/Vive lenses greatly reduce the 3D effect. Basically, if you have a wide distance between your eyes and you wear high power glasses for short nearsightedness, you are screwed if you don't have a depth adjustment.

There is definitely a binocular effect because of wearing glasses. I'm a glasses/contacts wearer as well. -4.0/-3.75. This is the effect that you might remember after getting new glasses, where everything is distorted and wrong for a few days- until your brain adapts, and right-sizes everything. Unless you are really looking for it once you've adapted, you don't even think about the effect. You can bring this back by wearing your glasses backwards. You can get used to that too.

In VR, optically, it's identical to real life. That's the primary engineering goal for them. That's why the headsets are focused at infinity, or far enough away for your eyes to be relaxed, and so that the disconnect of 3D images from the focus point is comfortable. When we crank up the 3D on a close monitor, it's a strain on our eyes that we get used to, or build up muscle strength for. This is much easier for projector use, or VR.


But, as far as reducing the 3D effect? I don't think that happens. The 3D that I see is the same that anyone else sees. The goal is to have it 'life-size', and people seeing the Tuscany house demo don't see the house as smaller or bigger based on IPD or glasses.

They can report a change if the height is wrong. If you are a 6 foot human, and the height is set to 5 feet, it will be weird, you will feel wrong and scale will feel wrong. It's important to set the height of the in-game camera/eye.


RAGEdemon said:I am sure there are other shenanigans too, which others will know about. For example, a lot of us set our 3D Vision depth to above 100% and higher than the distance between our eyes, because it gives a better 3D effect. Strictly speaking, this should not be allowed and would be considered a taboo, but it works great in practice.

Oculus and HTC do need to allow adjustment of separation but they likely will not listen. Giving an example - people have been asking for an an adjustment for super-sampling for a long time so they don't have to use the debug tool - not even a response from them. Unfortunately, when companies get big, they also get arrogant and stop listening to their userbase.

It's easily possible as even 3rd parties such as VorpX (which even works through the Oculus software) allow you to adjust separation (and convergence to an extent), in games that don't even natively support VR.

If you do broach the topic, please post a link and I'll support it.

I don't think they will ever allow adjustment to separation or convergence. They go to great lengths to keep it consistent and 1:1. That's pretty much the entire point of VR.

I'd agree it would be fun as an enthusiast to be able to do hyper or hypo stereo in VR, but they aren't going to add it as a feature, because it's not the 'reality' part of 'virtual reality'.

VorpX can allow you to do this, and in fact it's the opposite for VorpX. It's pretty much always wrong, and getting to 1:1 is a trial. Bizarre distortion, wrong scale, you name it. Very flexible, but not a great experience.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#9
Posted 12/02/2016 12:22 AM   
[quote="bo3b"]Strong enough that if you leave them in the sun, they'll set your screen on fire.[/quote] And set the enemy vessels on Fire ?!?! ^_^ Some things never change ;)) ^_^
bo3b said:Strong enough that if you leave them in the sun, they'll set your screen on fire.

And set the enemy vessels on Fire ?!?! ^_^ Some things never change ;)) ^_^

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

#10
Posted 12/02/2016 12:25 AM   
Hi bo3b, as always thank you for your insight mate; no pun intended.

1. Regarding the application IPD not being dynamically adjusted in reference to the hardware set IPD, please see the following 2 discussions on the matter. The IPD is always reported as 64mm no matter the hardware IPD setting. Unfortunately it is a 'feature' according to Oculus.

https://www.reddit.com/r/oculus/comments/4m8n7q/cv1_lens_space_adjust_is_not_the_ipd_setting/
https://forums.oculus.com/developer/discussion/33209

cybereality has come back in August and has this to say:
I'm really sorry for the delay. I don't think I initially understood how serious this issue was and didn't do a proper investigation. I can confirm that it is happening for me too. I have a bug report filed already, and will be sure to stay on top of the engineering team until this is resolved. Thanks.

Hopefully, it'll be fixed soon, if not already - comex's tool is finally reporting the correct values now on my end.

2. Regarding the binocular effect - it is absolutely correct that one gets used to the warped image that glasses lenses cause; the mind will fix it over a few days/weeks. However and unfortunately, the eyes/brain cannot get used to the binocular effect that glasses cause. If one wears glasses but wants to see things in VR as though they were not wearing glasses, i.e. perceive no binocular effect, then an overriding separation adjustment would be required to increase depth to compensate (for advanced users).
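
For a rough sense of the size of that effect, the standard spectacle-magnification approximation (power factor only, ignoring lens shape; the prescriptions and vertex distance here are example numbers, not anyone's actual setup) gives:

def spectacle_magnification(power_dioptres, vertex_mm):
    # Power factor of a thin spectacle lens worn at a given vertex distance.
    # Values < 1 mean the retinal image shrinks (minification), as with myopic
    # (negative) prescriptions; contacts sit at ~0 vertex distance, so their
    # effect on perceived size is negligible.
    d = vertex_mm / 1000.0
    return 1.0 / (1.0 - d * power_dioptres)

print(spectacle_magnification(-4.0, 12.0))   # ~0.95: roughly 5% minification
print(spectacle_magnification(-2.0, 12.0))   # ~0.98: milder, but still present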

Another problem not often talked about is the chromatic aberration introduced by glasses lenses, which the Oculus Runtime cannot account for during warping. Perhaps your sub-par VR experience could be partially attributed to these factors? Do you have a better experience with contacts?

[Image: chromatic aberration example - http://www.opticsreviewer.com/image-files/chromatic-aberration.jpg]

There is always Lasik for us, I suppose... ;-)

I agree that Oculus has little chance of providing this for its users, which is a shame.

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#11
Posted 12/02/2016 05:16 AM   
[quote="RAGEdemon"]Hi bo3b, as always thank you for your insight mate; no pun intended. 1. Regarding the application IPD not being dynamically adjusted in reference to the hardware set IPD, please see the following 2 discussions on the matter. The IPD is always reported as 64mm no matter the hardware IPD setting. Unfortunately it is a 'feature' according to Oculus. https://www.reddit.com/r/oculus/comments/4m8n7q/cv1_lens_space_adjust_is_not_the_ipd_setting/ https://forums.oculus.com/developer/discussion/33209 cybereality has come back in August and has this to say: [color="green"]I'm really sorry for the delay. I don't think I initially understood how serious this issue was and didn't do a proper investigation. I can confirm that it is happening for me too. I have a bug report filed already, and will be sure to stay on top of the engineering team until this is resolved. Thanks.[/color] Hopefully, it'll be fixed soon, if not already - comex's tool is finally reporting the correct values now on my end.[/quote] Oh right! That IPD bug. I thought they'd fixed that, but damn Oculus never pays attention to any bug reports and never communicates with anyone outside of their pre-selected developers. You are of course right, that if the IPD is being used wrong by the game, sent in as 64 always, with no respect to the actual IPD, that different users will get different scaling. At infinity, if a person has a >64 IPD they'd see less depth than desired, just like turning down the separation. If a person has <64, they'd see hyperstereo, more separation than desired, and probably eyestrain as their eyes diverge. I wish Oculus wasn't so arrogant. It's not helping them to act like they hold all the cards and don't have to pay attention to anyone else. [quote="RAGEdemon"]2. Regarding the binocular effect - it is absolutely correct that one gets used to a warped image that glasses lenses cause - the mind will fix it over a few days/weeks. However and unfortunately, the eyes/brain cannot get used to the binocular effect that glasses cause. If one wears glasses but wants to see things in VR as though they were not wearing glasses i.e. perceive no binoculor affect, then an overriding separation adjustment would be required to increase depth to compensate (for advanced users). Another problem not oft talked about is the chromatic aberrations introduced by glasses lenses which the Oculus Runtime cannot account for during warping. Perhaps your sub-par VR experience could be partially attributed to these factors somewhat? Do you have a better experience with contacts? [img]http://www.opticsreviewer.com/image-files/chromatic-aberration.jpg[/img] There is always Lasik for us, I suppose... ;-) I agree that Oculus has little chance of providing this for its users, which is a shame.[/quote] Less sure about the binocular effect. I have three different scenarios that I use my Rift. Contact lenses, regular glasses (one pair full prescription, one pair detuned by 0.5), and the VRLensLab inserts. I haven't noticed any difference between the three viewing modes. My full strength glasses don't fit well and are uncomfortable. The VRLensLab lens are quite good and give me the same experience. With contacts, it's much simpler, but also no noticeable difference. Since you mention it, I'll specifically look for differences.
RAGEdemon said:Hi bo3b, as always thank you for your insight mate; no pun intended.

1. Regarding the application IPD not being dynamically adjusted in reference to the hardware set IPD, please see the following 2 discussions on the matter. The IPD is always reported as 64mm no matter the hardware IPD setting. Unfortunately it is a 'feature' according to Oculus.


https://www.reddit.com/r/oculus/comments/4m8n7q/cv1_lens_space_adjust_is_not_the_ipd_setting/

https://forums.oculus.com/developer/discussion/33209


cybereality has come back in August and has this to say:
I'm really sorry for the delay. I don't think I initially understood how serious this issue was and didn't do a proper investigation. I can confirm that it is happening for me too. I have a bug report filed already, and will be sure to stay on top of the engineering team until this is resolved. Thanks.

Hopefully, it'll be fixed soon, if not already - comex's tool is finally reporting the correct values now on my end.

Oh right! That IPD bug. I thought they'd fixed that, but damn Oculus never pays attention to any bug reports and never communicates with anyone outside of their pre-selected developers.

You are of course right that if the IPD is being used wrong by the game - sent in as 64 always, with no respect to the actual IPD - different users will get different scaling. At infinity, if a person has a >64 IPD they'd see less depth than desired, just like turning down the separation. If a person has <64, they'd see hyperstereo, more separation than desired, and probably eyestrain as their eyes diverge.
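
As a rough back-of-the-envelope - assuming, to first order, that perceived scale simply tracks the ratio of the viewer's real IPD to the camera separation the game actually rendered with; an approximation for illustration, not anything from the Oculus SDK:

def perceived_scale(actual_ipd_mm, rendered_separation_mm):
    # First-order estimate: rendering with a wider baseline than the viewer's
    # eyes (hyperstereo) makes the world look proportionally smaller; a narrower
    # baseline (hypostereo) makes it look larger and flatter.
    return actual_ipd_mm / rendered_separation_mm

print(perceived_scale(54.0, 64.0))   # ~0.84: world looks ~16% too small
print(perceived_scale(72.0, 64.0))   # ~1.13: world looks ~13% too large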

I wish Oculus wasn't so arrogant. It's not helping them to act like they hold all the cards and don't have to pay attention to anyone else.

RAGEdemon said:2. Regarding the binocular effect - it is absolutely correct that one gets used to the warped image that glasses lenses cause; the mind will fix it over a few days/weeks. However and unfortunately, the eyes/brain cannot get used to the binocular effect that glasses cause. If one wears glasses but wants to see things in VR as though they were not wearing glasses, i.e. perceive no binocular effect, then an overriding separation adjustment would be required to increase depth to compensate (for advanced users).

Another problem not oft talked about is the chromatic aberrations introduced by glasses lenses which the Oculus Runtime cannot account for during warping. Perhaps your sub-par VR experience could be partially attributed to these factors somewhat? Do you have a better experience with contacts?

[Image: chromatic aberration example]

There is always Lasik for us, I suppose... ;-)

I agree that Oculus has little chance of providing this for its users, which is a shame.

Less sure about the binocular effect. I have three different scenarios in which I use my Rift: contact lenses, regular glasses (one pair full prescription, one pair detuned by 0.5), and the VRLensLab inserts.

I haven't noticed any difference between the three viewing modes. My full-strength glasses don't fit well and are uncomfortable. The VRLensLab lenses are quite good and give me the same experience. With contacts, it's much simpler, but also no noticeable difference.

Since you mention it, I'll specifically look for differences.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#12
Posted 12/02/2016 09:48 AM   
You are of course right, that if the IPD is being used wrong by the game, sent in as 64 always, with no respect to the actual IPD, that different users will get different scaling. At infinity, if a person has a >64 IPD they'd see less depth than desired, just like turning down the separation. If a person has <64, they'd see hyperstereo, more separation than desired, and probably eyestrain as their eyes diverge.
I don't think that is quite right - it certainly will have an effect on the perceived scale, but getting it wrong - even vastly wrong - should not cause their eyes to diverge.

"Separation" as we know it comes from the *off-center* part of the projection matrix - this is what causes our eyes to diverge if it is set too high. For VR, the 3D effect is not created in this manner as separation comes from physically separating the screens (wheras in 3D Vision we only have one screen and have to simulate separating it into two). VR *does* use an off-center projection matrix, but it is only to account for the alignment difference between the lenses and center of the screens, and is otherwise unrelated to the IPD.

"Convergence" as we know it is mathematically equivelent to moving the camera in space, which is how VR achieves 3D, whereas we use it for pop-out and toyification (in the case of 3D Vision that is multiplied by separation to achieve this equivalence, but that is just maths and doesn't change the concept). This controls the percieved scale of the world and for realistic results in VR should be tied to the IPD, but getting it wrong will only change the percieved scale, and should not cause the eyes to diverge. Setting it way too high does require the eyes to go cross-eyed to focus on something close as I'm sure we are all quite familiar with, but that is the opposite of diverging... and no different to trying to focus on something very small and very close in real life (just that in a virtual world there is no nose stopping things from getting *really* close).

You can think of it this way - no matter how far apart your eyes are, the light rays coming from an object infinitely far away are still parallel, so under no circumstances will your eyes need to diverge, and the same applies to VR. The reason 3D Vision can cause the eyes to diverge is that it simulates infinity by drawing it at whatever pixel disparity corresponds to the IPD (how many pixels that is depends on the monitor size and your IPD), whereas in VR that answer is always 0 pixels.
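A quick back-of-the-envelope version of that pixel argument (all numbers illustrative, assuming roughly a 27" 1080p panel):

[code]
# All numbers illustrative, assuming roughly a 27" 1080p monitor.
screen_width_px = 1920
screen_width_mm = 600.0
mm_per_px = screen_width_mm / screen_width_px   # ~0.31 mm per pixel

viewer_ipd_mm = 64.0

# 3D Vision draws an infinitely distant object at some fixed on-screen
# disparity; if that disparity is wider than the viewer's IPD, the sight
# lines have to diverge behind the screen.
disparity_at_infinity_px = 220   # whatever the separation setting happens to produce
disparity_at_infinity_mm = disparity_at_infinity_px * mm_per_px

print("disparity at infinity: %.1f mm vs IPD: %.1f mm"
      % (disparity_at_infinity_mm, viewer_ipd_mm))
print("divergence required:", disparity_at_infinity_mm > viewer_ipd_mm)

# In a headset, the same distant object lands on the same pixel of each
# physically separate panel, i.e. 0 px of disparity, so no setting of IPD or
# scale can force the eyes to diverge.
[/code]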


It is noteworthy that there is nothing inherently special about "separation" and "convergence" - the exact same effect can be achieved with a regular camera in the real world. In this case "separation" requires a lens shifted off-center from the image sensor. These lenses exist, but they are always combined with tilt lenses, are expensive, and lack nice features like auto focus, so most people just use a regular lens and adjust the relative placement of the resulting images to achieve something similar. For a cheap shift lens to experiment with, check out Loreo's shift lens in a cap. Note that the lens only needs to be shifted off-centre by about 2mm in either direction to achieve good separation, though this does depend on the screen size it is intended to be viewed on later. If the lens is not shifted (and the "parallax" has not yet been adjusted), "infinity" will appear at screen depth, the same as it does when displaying a VR-type projection on a screen before applying the lens distortion shader.

"Convergence" can be varied by changing how far the camera is shifted in space, and all it does is change the percieved scale. Here is an example from a recent holiday of achieving toyification in real life by moving the cameras 70cm apart on a dolly rail as though I had increased convergence (ignore the "broken shader" like effect on the shadows - it is amazing just how far clouds can move in the few seconds it takes to move the camera) - the eyes do not need to diverge to see this despite the convergence being 10x my IPD:

https://scontent-syd2-1.xx.fbcdn.net/t31.0-8/14707862_10154131665684091_965270426299316423_o.jpg
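As a rough rule of thumb (it ignores focal length and viewing distance, so treat it as a first-order approximation only), the perceived scale shrinks by the ratio of the viewer's IPD to the camera baseline, which is where the roughly 10x figure comes from:

[code]
# Rule of thumb only: ignores focal length and viewing distance.
viewer_ipd_cm      = 6.5
camera_baseline_cm = 70.0   # the dolly-rail shot above

scale = viewer_ipd_cm / camera_baseline_cm
print("scene appears at roughly %.0f%% of its real size (about 1/%.0f scale)"
      % (scale * 100, 1 / scale))
[/code]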


Also, if it were possible to switch bodies in the real world, the same scale differences would be observed until the brain had time to adjust (the brain is good at that; it doesn't take long). I'm pretty sure there are special glasses (think twin periscopes) that researchers have used for this.

2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit

Alienware M17x R4 w/ built in 3D, Intel i7 3740QM, GTX 680m 2GB, 16GB DDR3 1600MHz RAM, Win7 64bit, 1TB SSD, 1TB HDD, 750GB HDD

Pre-release 3D fixes, shadertool.py and other goodies: http://github.com/DarkStarSword/3d-fixes
Support me on Patreon: https://www.patreon.com/DarkStarSword or PayPal: https://www.paypal.me/DarkStarSword

#13
Posted 12/02/2016 02:54 PM   
[quote="RAGEdemon"]Hi bo3b, as always thank you for your insight mate; no pun intended. 1. Regarding the application IPD not being dynamically adjusted in reference to the hardware set IPD, please see the following 2 discussions on the matter. The IPD is always reported as 64mm no matter the hardware IPD setting. Unfortunately it is a 'feature' according to Oculus. https://www.reddit.com/r/oculus/comments/4m8n7q/cv1_lens_space_adjust_is_not_the_ipd_setting/ https://forums.oculus.com/developer/discussion/33209 cybereality has come back in August and has this to say: [color="green"]I'm really sorry for the delay. I don't think I initially understood how serious this issue was and didn't do a proper investigation. I can confirm that it is happening for me too. I have a bug report filed already, and will be sure to stay on top of the engineering team until this is resolved. Thanks.[/color] Hopefully, it'll be fixed soon, if not already - comex's tool is finally reporting the correct values now on my end. [/quote] This was fixed few months ago. Related topics on reddit if you are interested: https://www.reddit.com/r/oculus/comments/4z9q0f/ipd_issue_fixed_in_17_are_you_able_to_confirm_the/ https://www.reddit.com/r/oculus/comments/4zr175/ipd_fix_makes_a_massive_difference_just_played/
RAGEdemon said: Hi bo3b, as always thank you for your insight mate; no pun intended.

1. Regarding the application IPD not being dynamically adjusted to match the hardware-set IPD, please see the following two discussions on the matter. The IPD is always reported as 64mm no matter the hardware IPD setting. Unfortunately, it is a 'feature' according to Oculus.

https://www.reddit.com/r/oculus/comments/4m8n7q/cv1_lens_space_adjust_is_not_the_ipd_setting/
https://forums.oculus.com/developer/discussion/33209

cybereality came back in August and had this to say:
I'm really sorry for the delay. I don't think I initially understood how serious this issue was and didn't do a proper investigation. I can confirm that it is happening for me too. I have a bug report filed already, and will be sure to stay on top of the engineering team until this is resolved. Thanks.

Hopefully, it'll be fixed soon, if not already - comex's tool is finally reporting the correct values now on my end.


This was fixed a few months ago.

Related topics on reddit if you are interested: https://www.reddit.com/r/oculus/comments/4z9q0f/ipd_issue_fixed_in_17_are_you_able_to_confirm_the/
https://www.reddit.com/r/oculus/comments/4zr175/ipd_fix_makes_a_massive_difference_just_played/
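For anyone wanting to sanity-check this on their own setup, the check people were doing amounts to measuring the distance between the two eye positions the runtime reports and comparing it against the headset's hardware slider reading. A minimal sketch of that arithmetic follows - the function name and the offset values are hypothetical, not an actual Oculus SDK call:

[code]
import math

def effective_ipd_mm(left_eye_offset_m, right_eye_offset_m):
    # Distance between the two reported eye positions, converted to mm.
    dx, dy, dz = (r - l for l, r in zip(left_eye_offset_m, right_eye_offset_m))
    return math.sqrt(dx*dx + dy*dy + dz*dz) * 1000.0

# Example offsets a runtime might report after the fix (hypothetical numbers):
left  = (-0.0305, 0.0, 0.0)
right = ( 0.0305, 0.0, 0.0)
print("software IPD: %.1f mm" % effective_ipd_mm(left, right))   # 61.0 mm

# Before the 1.7 runtime this would have come back as 64.0 mm regardless of
# where the hardware slider was set, which is what the threads above were
# checking for.
[/code]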

#14
Posted 12/02/2016 05:05 PM   
No pun intended, but:

This is the 3D Vision section... Am I missing something? This is a VR-related discussion and should be moved to the appropriate section of the forums (meaning the VR one)...

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

#15
Posted 12/02/2016 10:01 PM   