[quote="mike_ar69"]As well as the variable names being hard to decipher, I am unclear what is going on in some of the Vertex Shaders for shadows as well - the output coordinate is not a clip/proj coord, which is clear if you try and move them to screen depth, they go all over the place, not just wrong depth, but a kind of rotation as well. I think I must have been looking at the wrong VS, some VS do "work" properly in other shaders e.g. for skybox. I wish I had more time for this, but I don't right now, but I will contribute what bits I manage to work out.[/quote]
ok, I have a hunch about that... but since I don't have the game yet to try, I'll just look at the maths and see if my theory is plausible...
This is the vertex shader that DHR sent me (346aa0e5f6b29dc8-vs_replace.txt):
[code]
void main(
float2 v0 : POSITION0,
out float4 o0 : SV_Position0,
out float4 o1 : TEXCOORD0,
out float3 o2 : TEXCOORD1)
{
float4 r0;
uint4 bitmask, uiDest;
float4 fDest;
  o0.xy = v0.xy;                                      // pass through the fullscreen quad position
  o0.zw = float2(0,1);
  r0.xy = v0.xy * float2(0.5,-0.5) + float2(0.5,0.5); // remap -1..+1 to 0..1 (and flip y)
  o1.xy = Constants[0].zw * r0.xy;                    // texture coordinate
  r0.xyzw = Constants[2].xyzw * v0.yyyy;              // matrix multiply:
  r0.xyzw = v0.xxxx * Constants[1].xyzw + r0.xyzw;    //   [v0.x, v0.y, 0, 1] by rows
  r0.xyzw = Constants[4].xyzw + r0.xyzw;              //   Constants[1], [2], ([3]), [4]
  r0.xyz = r0.xyz / r0.www;                           // perspective divide
  o2.xyz = Constants[0].xxx * r0.xyz;
return;
}
[/code]
So it looks like there is a matrix multiply here:
[code]
r0.xyzw = Constants[2].xyzw * v0.yyyy;
r0.xyzw = v0.xxxx * Constants[1].xyzw + r0.xyzw;
r0.xyzw = Constants[4].xyzw + r0.xyzw;
[/code]
It's multiplying a 2D coordinate v0.xy (or since we are using homogeneous coordinates [v0.x, v0.y, 1]) by this matrix:
[code]
Constants[1]: 0.752560079 | 0.0156586282 | -0.696669042 | 0
Constants[2]: 0.0025248453 | 0.576704204 | 0.0156896524 | 0
Constants[3]: n/a
Constants[4]: -0.679409742 | 0.0229271911 | -0.733400822 | 2.49999994e-005
[/code]
So, this matrix looks interesting. I can assume that v0.xy is likely between -1 and +1 based on this other line of code (which is obviously scaling it to be between 0 and 1 for the TEXCOORD0 output), and will guess that it corresponds to screen coordinates (which makes sense given this is a post processing effect):
[code] r0.xy = v0.xy * float2(0.5,-0.5) + float2(0.5,0.5); [/code]
So, what is this matrix doing? Well, let's see what happens if we run a few coordinates through this matrix, then normalise the result:
(I might have top and bottom mixed up here - I can't recall which way DX11 uses off hand)
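For anyone who wants to reproduce this, here's roughly what I ran - a minimal numpy sketch, not my actual matrix.py session (the shader effectively computes [v0.x, v0.y, 0, 1] times the rows Constants[1], [2], [3], [4]):
[code]
import numpy as np

M = np.matrix([[ 0.752560079,  0.0156586282, -0.696669042,  0],
               [ 0.0025248453, 0.576704204,   0.0156896524, 0],
               [ 0,            0,             0,            0],  # Constants[3]: n/a (input z is 0, so this row never contributes)
               [-0.679409742,  0.0229271911, -0.733400822,  0.000025]])

corners = {'Top left': (-1, -1), 'Top right': (1, -1),
           'Bottom left': (-1, 1), 'Bottom right': (1, 1),
           'Center': (0, 0)}
for name, (x, y) in corners.items():
    r = np.matrix([x, y, 0, 1]) * M  # v0.x*C1 + v0.y*C2 + C4, same as the shader
    print(name, r, r / r[0, 3])      # raw result, then normalised by w
[/code]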
Top left of the screen (-1,-1,1): [[-1.43449467 -0.56943564 -0.05242143 0.000025 ]]
Normalised: [[-57379.78802911 -22777.42619066 -2096.85734632 1. ]]
Top right of the screen (1,-1,1): [[ 0.07062549 -0.53811838 -1.44575952 0.000025 ]]
Normalised: [[ 2825.0197358 -21524.73590459 -57830.38204393 1. ]]
Bottom left of the screen (-1,1,1): [[-1.42944498 0.58397277 -0.02104213 0.000025 ]]
Normalised: [[-57177.80040027 23358.91123661 -841.6851242 1. ]]
Bottom right of the screen (1,1,1): [[ 0.07567518 0.61529002 -1.41438021 0.000025 ]]
Normalised: [[ 3027.00736465 24611.60152268 -56575.20982181 1. ]]
Center of the screen (0,0,1): [[-0.67940974 0.02292719 -0.73340082 0.000025 ]]
Normalised: [[-27176.39033223 917.08766601 -29336.03358406 1. ]]
Let's round those coordinates and place them on the screen:
[code]
-57379                         2825
-22777                       -21524
 -2096                       -57830

              -27176
                 917
              -29336

-57177                         3027
 23358                        24611
  -841                       -56575
[/code]
If I'm not mistaken, these coordinates appear to form a plane in 3D space (note - I had to scale them to use this website):
http://bodurov.com/VectorVisualizer/?vectors=-5.7379/-2.2777/-0.2096/0.2825/-2.1524/-5.783v-5.7177/2.3358/-0.0841/0.3027/2.4611/-5.6575v-5.7379/-2.2777/-0.2096/-5.7177/2.3358/-0.0841v0.2825/-2.1524/-5.783/0.3027/2.4611/-5.6575v-5.7379/-2.2777/-0.2096/0.3027/2.4611/-5.6575v-5.7177/2.3358/-0.0841/0.2825/-2.1524/-5.783v0/0/0/-2.7176/0.0917/-2.9336
Gentlemen, what I believe you are looking at is the coordinates of the camera frustum (likely at the far clipping plane) in world coordinates.
So... how would you go about applying a stereo correction given this knowledge? Well... say you took the coordinates at the top right corner and subtracted the coordinates at the top left corner - you would be left with a vector pointing towards camera right, which is the direction you need to apply the stereo correction. Divide that vector by the far clipping plane and you have now scaled its magnitude down to a unit distance from the camera. Calculate the stereo correction and multiply the result by this vector and I think you will be very close. You will probably need to experiment a little to find the exact spot to insert it, and I could be wrong about some things...
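Something like this - only a sketch, the separation/convergence/depth values are made up for illustration and the exact spot and sign will need experimentation:
[code]
import numpy as np

# Normalised frustum corners from above (world space, centered on the camera)
top_left  = np.array([-57379.78802911, -22777.42619066,  -2096.85734632])
top_right = np.array([  2825.0197358,  -21524.73590459, -57830.38204393])
far_clip = 40000.0  # derived further down this thread

# Vector towards camera right, scaled down to a unit distance from the camera
camera_right = (top_right - top_left) / far_clip

# Hypothetical driver values and the usual formula sep * (depth - convergence);
# depth would come from wherever the shader gets its linear depth
separation, convergence, depth = 0.02, 1.5, 100.0
correction = separation * (depth - convergence)

world_pos = np.array([0.0, 0.0, 0.0])              # stand-in for the shader's r0.xyz
fixed_pos = world_pos - camera_right * correction  # sign flips per eye
[/code]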
A few of my own fixes with some math that might be of interest (none of this is exactly the same as above, they just have some passing similarities):
In Tinker I used the camera frustum and far clipping plane to derive the horizontal FOV to fix a view space coordinate for shadows:
https://github.com/DarkStarSword/3d-fixes/commit/da1483c728398e9c355dfad42cb550e9434557bb
Stranded Deep also uses the frustum coordinates in world space for some caustics and underwater shadows (though without using that matrix that Mad Max does - it just passed in the four corners in individual constant registers):
https://github.com/DarkStarSword/3d-fixes/blob/4eeb98aedb557167611e57b5f75b2a90e4d33d80/Stranded%20Deep/ShaderOverride/PixelShaders/661A055E.txt
Miasmata's light shafts had no matrices to help me at all - I ran two points through an unrelated object's MVP matrix to determine the camera's orientation (similar to how we have points on the camera frustum here) and worked out the correction amount from there:
https://github.com/DarkStarSword/3d-fixes/commit/8c8f51d2e0b8e583b09c3fe12d64ebbf1c9fa7e5
EDIT: Initial post had the wrong Matrix Image linked. Fixed now!
Based on the math I see from DarkStarSword, and based on the values, that does look to be the projection matrix.
The matrix can be modified to add stereo perspective into it.
[img]http://www.songho.ca/opengl/files/gl_projectionmatrix_eq16.png[/img]
(This is an OPENGL projection matrix -> notice that the rows and columns are swapped relative to DirectX)
Where: r= right, l=left, t=top, b=bottom
A very good article explaining the Proj Matrix is here: http://www.songho.ca/opengl/gl_projectionmatrix.html
(Even if you know it is always helpful to compare/refresh your knowledge)
To apply the stereo correction on this matrix you can follow this:
[code]
// Apply ONLY FOR Perspective projection!
if ((mCurrentProj[3] == 0) && (mCurrentProj[7] == 0) && (mCurrentProj[11] == -1) && (mCurrentProj[15] == 0))
{
// Left Eye
if (NV3DVisionGetCurrentFrame() == 1)
{
// Apply the stereo correction for Current Eye
mCurrentProj[8] += currentSeparation;
mCurrentProj[12] += currentSeparation * currentConvergence;
}
// Right Eye
else if (NV3DVisionGetCurrentFrame() == 2)
{
// Apply the stereo correction for Current Eye
mCurrentProj[8] -= currentSeparation;
mCurrentProj[12] -= currentSeparation * currentConvergence;
}
}
[/code]
Again, this is OpenGL. In DirectX the rows and columns are swapped (basically, transposing a DX matrix gives you an OpenGL one). Still, the same principle applies.
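A quick numpy illustration of the layout (my own sketch, just to show why indices 8 and 12 are the interesting ones):
[code]
import numpy as np

gl_flat = np.arange(16)        # stand-in for the flat mCurrentProj array
gl = gl_flat.reshape(4, 4).T   # OpenGL is column-major: flat array -> 4x4
print(gl[0, 2], gl[0, 3])      # prints "8 12": the x components of the 3rd
                               # and 4th columns, the entries patched above
dx = gl.T                      # transpose = the DirectX (row-major) layout
[/code]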
Hope this helps;))
Hehe, I've had the same thought in the past and worked out that same adjustment:
[url]https://github.com/DarkStarSword/3d-fixes/blob/master/matrix.py#L74[/url]
But so far I haven't used that approach in a fix.
I'm not sure if it will help in this case though - that matrix isn't a pure projection matrix (not enough zeroes), it's more likely to be an inverse view-projection matrix, but maybe not quite that either.
I didn't include Constants[3] from the vertex shader constant buffer above because it wasn't used in the multiplication (the input was only a 2D coordinate). For the sake of completeness, here's the full 4x4 matrix:
[code]
Constants[1]: 0.752560079 | 0.0156586282 | -0.696669042 | 0
Constants[2]: 0.0025248453 | 0.576704204 | 0.0156896524 | 0
Constants[3]: 0 | 0 | 0 | 9.99997425
Constants[4]: -0.679409742 | 0.0229271911 | -0.733400822 | 0.000025
[/code]
After playing around in matrix.py for a while, I've concluded that a matrix of that form is most likely built up with rotations (x,y and/or z), optionally scaled, multiplied by a projection matrix, then inverted:
e.g. here's a similar matrix (the .I means inverse):
[code]
In [98]: print(matrix.random_mvp().I)
ROTATE X: 90.488218
ROTATE Z: 114.958019
ROTATE Y: 150.514597
PROJECTION: near: 4.75428 far: 823.277 H FOV: 80.8799 V FOV: 85.259
[[ 0.31304129 0.41374227 -0.67615544 0. ]
[ 0.834539 0.00330958 0.38839365 0. ]
[ 0. 0. 0. -0.20912234]
[ 0.20768657 -0.87425187 -0.43880514 0.210337 ]]
[/code]
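If you want to reproduce this without matrix.py, a rough equivalent is below (my own sketch; assumes row vectors and a D3D-style projection):
[code]
import numpy as np

def rotate_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.matrix([[1, 0, 0, 0], [0, c, s, 0], [0, -s, c, 0], [0, 0, 0, 1]])

def rotate_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.matrix([[c, s, 0, 0], [-s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def rotate_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.matrix([[c, 0, -s, 0], [0, 1, 0, 0], [s, 0, c, 0], [0, 0, 0, 1]])

def projection(near, far, fov_h, fov_v):
    q = far / (far - near)
    return np.matrix([[1 / np.tan(fov_h / 2), 0, 0, 0],
                      [0, 1 / np.tan(fov_v / 2), 0, 0],
                      [0, 0, q, 1],
                      [0, 0, -q * near, 0]])

vp = rotate_x(np.radians(90.488)) * rotate_z(np.radians(114.958)) \
     * rotate_y(np.radians(150.515)) \
     * projection(4.75428, 823.277, np.radians(80.88), np.radians(85.259))
print(vp.I)  # zeroes land in the same spots as the Mad Max matrix
[/code]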
Notably, no matrix that included a translation ended up with zeroes in the right spot - this suggests that the camera position was not used in the creation of this matrix, and the coordinates above will be centered on the camera - which I can easily confirm using Pythagoras to observe that the distance of each corner from the origin is the same:
[code]
In [24]: math.sqrt((-57379)**2 + (-22777)**2 + (-2096)**2)
Out[24]: 61770.01364740014
In [25]: math.sqrt((2825)**2 + (-21524)**2 + (-57830)**2)
Out[25]: 61770.31731341519
In [26]: math.sqrt((-57177)**2 + (23358)**2 + (-841)**2)
Out[26]: 61769.83708898705
In [27]: math.sqrt((3027)**2 + (24611)**2 + (-56575)**2)
Out[27]: 61770.50003844877
[/code]
And another consequence of this is I can easily derive the value of the far clipping plane by using the point at the center of the screen:
[code]
In [29]: math.sqrt((-27176.39033223)**2 + (917.0876660)**2 + (-29336.03358406)**2)
Out[29]: 40000.00134652435
[/code]
I love it when the result looks like a plausible value that a developer might have picked. The far clipping plane is 40000 (at least in the scene/level this frame analysis was taken from - experience has taught me that this does not necessarily hold constant for an entire game).
One thing I've been meaning to try to work out is: does a matrix exist that we can multiply a regular projection matrix by to get a modified projection matrix with the stereo correction built in (but not change it in any other way)?
If there is, we should be able to multiply any matrix that includes a projection matrix as well as some other matrix (like MVP or VP) by it, or alternatively invert it and then multiply the inverse VP/MVP matrix by it to build the stereo correction right into them as well.
Not sure what the answer is yet - I haven't sat down with a notebook to work out the math.
The answer to my above musing is:
[code]
[ 1, 0, 0, 0 ],
[ 0, 1, 0, 0 ],
[ (sep*conv) / (q*near), 0, 1, 0 ],
[ sep - (sep*conv)/near, 0, 0, 1 ]
[/code]
Where q = far/(far-near)
We've already determined far. We don't necessarily know what near is (I think it's 0.1 based on the constants used to scale the Z buffer, but I'm not certain), but it will be a small (non-zero) value that we could guess an approximation for and probably be close enough. You can even adjust the convergence to find an approximate value for it in the game: find something that pokes through the camera and clips, adjust the convergence until the point where it clips is at screen depth, and the convergence will then be equal to the near clipping plane.
It should be possible to invert that and multiply the inverse view-projection matrix by it to add a stereo correction to it, or multiply the screen coordinate by it before multiplying by the inverse view-projection matrix.
Edit: Its inverse turns out to be just as simple:
[code]
[ 1, 0, 0, 0 ],
[ 0, 1, 0, 0 ],
[ -(sep*conv) / (q*near), 0, 1, 0 ],
[ -sep + (sep*conv)/near, 0, 0, 1 ]
[/code]
It's not super clear that this will work for this game, but it might do.
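A quick numpy sanity check of the above (my own; assumes row vectors, a D3D-style projection, and the usual driver formula x' = x + sep * (w - conv)):
[code]
import numpy as np

near, far = 0.5, 40000.0
sep, conv = 0.02, 1.5
q = far / (far - near)

# D3D-style perspective projection, row-vector convention (v * P)
P = np.matrix([[1.2, 0, 0, 0],
               [0, 1.8, 0, 0],
               [0, 0, q, 1],
               [0, 0, -q * near, 0]])

# The stereo injection matrix from above (forward variant)
S = np.matrix([[1, 0, 0, 0],
               [0, 1, 0, 0],
               [(sep * conv) / (q * near), 0, 1, 0],
               [sep - (sep * conv) / near, 0, 0, 1]])

v = np.matrix([1.0, 2.0, 10.0, 1.0])  # arbitrary view-space point
clip, stereo = v * P, v * P * S
print(np.isclose(stereo[0, 0], clip[0, 0] + sep * (clip[0, 3] - conv)))  # True
print(np.allclose(S.I, np.matrix([[1, 0, 0, 0], [0, 1, 0, 0],
                                  [-(sep * conv) / (q * near), 0, 1, 0],
                                  [-sep + (sep * conv) / near, 0, 0, 1]])))  # True
[/code]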
@DarkStarSword: Really awesome analysis there, thanks for sharing that detail. Really interesting. I especially appreciate the train-of-thought as a way of looking at these sorts of problems.
I'm not certain, but I think that Chiri had built into 3Dmigoto that idea of modifying the projection matrix to handle stereo shift as well. No one has ever used it, but if I read the code right it's an option. I don't think it will work in this case, because I think it needs to be able to match on a named matrix to be able to inject the changes, and this Constants[1] type stuff wouldn't work.
I think his idea was to make an empirical fix using the tuneup/tunedown controls to move something like shadows to the right depth visually. But this may not have ever worked quite right - I think he got hung up on different screen sizes. It worked in his fix for Bioshock Infinite, but only for one screen size.
[quote="helifax"]Based on the Math I see from DarkStarSword and based on the values that does look to be the projection matrix.
The matrix can be modified to add stereo perspective into it.
[img]http://www.songho.ca/opengl/files/gl_projectionmatrix_eq24.png[/img]
(This is an OPENGL view matrix -> Notice that the columns and rows ARE INVERTED)
Where: r= right, l=left, t=top, b=bottom
A very good article explaining the Proj Matrix is here: http://www.songho.ca/opengl/gl_projectionmatrix.html
(Even if you know it is always helpful to compare/refresh your knowledge)
...
[/quote]
If I'm reading this correctly, that matrix is actually an orthographic projection:
[img]http://www.songho.ca/opengl/files/gl_projectionmatrix02.png[/img]
Which I don't think we ever use. We only ever use Perspective Projection, right? It's always possible I'm confused.
Should be this? (for OpenGL)
[img]http://www.songho.ca/opengl/files/gl_projectionmatrix_eq16.png[/img]
Yes, you are correct! My bad, I posted the wrong image (stupid me). Yes, the orthographic projection is not used (unless you want to render some UI using matrices for some reason).
The projection matrix is the one that is always used.
I'll just change the image reference in my post to avoid further ambiguity :))
Thx ;)
[quote="helifax"]Yes you are correct! My bad I posted the wrong image (Stupid Me). Yes the orthographic projection is not used (unless you want to render some UI using matrices for some reason).
The projection matrix is the one that is always used.
I'll just change the Image reference in my post to further avoid even more ambiguity:))
Thx ;)[/quote]
OK, cool, just trying to understand all these tidbits. I thought maybe you guys meant to use orthographic for using a stereoization matrix.
[quote="DarkStarSword"]We've already determined far. We don't necessarily know what near is (I think it's 0.1 based on the constants used to scale the Z buffer, but I'm not certain), but it will be a small (non-zero) value that we could guess an approximation for and probably be close enough - you can even adjust the convergence to find an approximate value for it in the game (find something that pokes through the camera and clips, adjust convergence until the point it clips is at screen depth, and the convergence will then be equal to the near clipping plane).[/quote]
Not positive I did this right of course, but I think the near value will be roughly 0.5.
Based on this image:
[img]http://sg.bo3b.net/max/MadMax17_85.jps[/img]
The only spot I could find to clip was Max himself in some inside room/angle scenario where he fades out, then disappears, presumably at a clipping point to avoid going inside his head.
Here's the image I'm keen on fixing, just the splash screen. But if we can fix the shadow here, I'm certain we can apply the fix to all other shadows; they all have the same form. This one is easy because it's not moving, and easy to get to.
After hunting VS, I get to this image, which I'm fairly sure is the right shader for the light casting the shadows.
The two shaders that create that hanging skull shadow are then
346aa0e5f6b29dc8-vs_replace.txt:
And PixelShader of d09b2165321f05cf-ps_replace.txt:
Possibly helpful, there is a WorldViewProjMatrix in a different VertexShader, that is active in this same frame, as found in ShaderUsage.txt:
http://sg.bo3b.net/max/ShaderUsage.txt
fd1e42ae995ff08e-vs_replace.txt:
I don't find any inverse matrices, nor any standard VPM matrices.
Full ShaderCache (v1.2.2) here: http://sg.bo3b.net/max/MadMax_1.2.2_ShaderCache.7z
There definitely is a matrix multiplication here.
Now, I am unsure if it is the inverse of the View-Projection or what matrix it exactly represents:
What we normally see in other games' shaders:
[code]
......
r8.xy = g_screenSizeInv.xy * v0.xy;
......
r13.x = dot(r8.xyzw, g_invViewProjMatrix._m00_m10_m20_m30);
r13.y = dot(r8.xyzw, g_invViewProjMatrix._m01_m11_m21_m31);
r13.z = dot(r8.xyzw, g_invViewProjMatrix._m02_m12_m22_m32);
r3.w = dot(r8.xyzw, g_invViewProjMatrix._m03_m13_m23_m33);
r13.xyz = r13.xyz / r3.www;
[/code]
What we see in this game's shaders:
[code]
r0.xyzw = Constants[2].xyzw * v0.yyyy;
r0.xyzw = v0.xxxx * Constants[1].xyzw + r0.xyzw;
r0.xyzw = Constants[4].xyzw + r0.xyzw;
r0.xyz = r0.xyz / r0.www;
[/code]
Notice the divide of each component by "w" (r0.xyz = r0.xyz / r0.www;). This clearly shows (IMO) that it is some sort of View-Projection matrix.
EDIT:
Now, based on what DarkStarSword showed above, this is clearly an inverse ViewProj matrix + something else...
[quote="DarkStarSword"]The answer to my above musing is:
[code]
[ 1, 0, 0, 0 ],
[ 0, 1, 0, 0 ],
[ (sep*conv) / (q*near), 0, 1, 0 ],
[ sep - (sep*conv)/near, 0, 0, 1 ]
[/code]
Where q = far/(far-near)
We've already determined far. We don't necessarily know what near is (I think it's 0.1 based on the constants used to scale the Z buffer, but I'm not certain), but it will be a small (non-zero) value that we could guess an approximation for and probably be close enough - you can even adjust the convergence to find an approximate value for it in the game (find something that pokes through the camera and clips, adjust convergence until the point it clips is at screen depth, and the convergence will then be equal to the near clipping plane).
It should be possible to inverse that and multiply the inverse view-projection matrix by it to add a stereo correction to it, or multiply the screen coordinate by it before multiplying by the inverse view-projection matrix.
Edit: It's inverse turns out to be just as simple:
[code]
[ 1, 0, 0, 0 ],
[ 0, 1, 0, 0 ],
[ -(sep*conv) / (q*near), 0, 1, 0 ],
[ -sep + (sep*conv)/near, 0, 0, 1 ]
[/code]
It's not super clear that this will work for this game, but it might do.
[/quote]
I am not sure I follow exactly here, but are you saying that if we multiply our Constants[1-4] matrix with the inverse you posted above, we might actually "inject" the stereoscopy into the projection matrix? (Sorry, I don't have a notepad with me to sketch up some calculations) ^_^
[quote="helifax"]
What we see in this game shaders:
[code]
r0.xyzw = Constants[2].xyzw * v0.yyyy;
r0.xyzw = v0.xxxx * Constants[1].xyzw + r0.xyzw;
r0.xyzw = Constants[4].xyzw + r0.xyzw;
r0.xyz = r0.xyz / r0.www;
[/code]
Notice the divide of each component with the "w" (r0.xyz = r0.xyz / r0.www;). This clearly shows(IMO) that it is some sort of a View-Projection Matrix.
[/quote]
Hi guys, I'm not sure I fully followed everything here (sneaked a bit of time in at work ;-). The v0 seems to be a screen space coord as DarkStarSword said - the VS actually generates the texture coordinate from it. So the matrix multiplication to produce r0.xyz will be producing an absolute coordinate in either View Space, World Space, or perhaps a 'view-like' space centered at a light source. I think this is the case because of the divide by r0.www, which would make (if they had done it) the w component 1, which makes it a coordinate and not a vector (the use of w in proj space is different - the w component is just used to store the original z value temporarily, then I can't remember what happens to it).
The divide is a common necessity when doing transformations using this "transpose form" - because that's actually what it is, multiplication by the transpose of a matrix, which is necessary if the coordinates are row major (or maybe column major, I forget now lol, but you get the point).
Anyway, it's clear that o0 is a screen coordinate and o2 is one of the other things. I tried to correct o2 when I looked last week and all hell broke loose on screen if I remember correctly - dark patches appearing and so on. Just for the sake of "emulating" an incorrectly scaled coordinate system I tried various scaling factors as an approximation, but did not get anything to look correct.
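To make the "transpose form" point concrete, here's a tiny numpy demo (my own, not from the shader):
[code]
import numpy as np

# A row vector times a matrix (the mad/mul form seen in the Mad Max shader)
# is the same as dotting against the matrix columns (the _m00_m10_m20_m30
# form seen in other games) - just two spellings of v * M
M = np.arange(16.0).reshape(4, 4)
v = np.array([0.3, -0.7, 0.0, 1.0])

mad_form = v[0] * M[0] + v[1] * M[1] + v[2] * M[2] + v[3] * M[3]
dot_form = np.array([v @ M[:, j] for j in range(4)])
print(np.allclose(mad_form, dot_form))  # True
[/code]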
[quote="bo3b"]Possibly helpful, there is a WorldViewProjMatrix in a different VertexShader, that is active in this same frame, as found in ShaderUsage.txt.[/quote]
Great find :)
You can use my frame analysis feature to dump the constant buffers from that shader and the shader for the shadows in the same frame, then compare them to see if the same matrix is available in any of the constant buffers the shadow shader has access to.
[code]
[Hunting]
analyse_frame = VK_F8
analyse_options =
[ShaderOverrideShaderWithWVP]
hash=fd1e42ae995ff08e
analyse_options = dump_cb_txt
[ShaderOverrideShadows]
hash=346aa0e5f6b29dc8
analyse_options = dump_cb_txt
[/code]
Or alternatively turn on dump_cb_txt in the global analyse_options and marvel at the large inexpensive array of redundant text files you receive upon pressing F8 ;-)
I have a work in progress to copy constant buffers from one shader to another (as well as other resources like textures, render targets, depth targets and ... vertex buffers (I have some crazy ideas I might be able to solve with these)) - I will need this to fix physical lighting (specular highlights, environment reflections) in Unity 5 games with variable FOV, but that's been a lowish priority since the effects are relatively minor and a fudge value can work. If it turns out we need it for this game as well I can bump that up on my priority list.
[quote="bo3b"]Not positive I did this right of course, but I think the near value will be roughly 0.5.
Based on this image:
[img]http://sg.bo3b.net/max/MadMax17_85.jps[/img]
The only spot I could find to clip was Max himself in some inside room/angle scenario where he fades out, then disappears, presumably at a clipping point to avoid going inside his head.[/quote]
Great :)
The actual clipping plane might be closer than where he fades out, but at the very least that will give us an upper bound to work with :)
I used this approach to estimate the near + far clipping planes in The Witcher 3 when I was experimenting with adjusting the UI from the depth buffer, then just dialed them in until everything lined up.
[quote="helifax"]I am not sure I follow exactly here, but are you saying that if we multiply Our Constants[1-4] matrix with the Inverse you posted above we might actually "inject" the stereoscopy in the Projection Matrix ?? (Sorry don't have a notepad with me to sketch up some calculations) ^_^[/quote]
That's the theory. I don't know for sure if it will work yet, and even if it can work it might not be the best thing to do for this game, but maybe it will help somewhere. I actually started thinking about this approach a couple of weeks back while trying to fix that nasty screen space reflection shader in Crysis 3 and wondering if there was an easier way.
The maths should be relatively straightforward - to inject the inverse variant it needs to be on the left of the multiplication (the forward variant would be on the right):
[code]
Substitute some variables:
e = -(sep*conv) / (q*near)
f = -sep + (sep*conv)/near

[ 1, 0, 0, 0 ]     [ ax, ay, az, aw ]
[ 0, 1, 0, 0 ]  x  [ bx, by, bz, bw ]
[ e, 0, 1, 0 ]     [ cx, cy, cz, cw ]
[ f, 0, 0, 1 ]     [ dx, dy, dz, dw ]

    [ ax,          ay,          az,          aw          ]
 =  [ bx,          by,          bz,          bw          ]
    [ (e*ax) + cx, (e*ay) + cy, (e*az) + cz, (e*aw) + cw ]
    [ (f*ax) + dx, (f*ay) + dy, (f*az) + dz, (f*aw) + dw ]
[/code]
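And a quick numpy check of the left/right rule (my own): if the forward stereo matrix is injected on the right of VP, its inverse must go on the left of the inverse, since (VP * S)^-1 = S^-1 * VP^-1:
[code]
import numpy as np

rng = np.random.default_rng(0)
VP = np.matrix(rng.standard_normal((4, 4)))  # stand-in view-projection matrix
S = np.matrix(np.eye(4))
S[2, 0], S[3, 0] = 0.003, -0.04              # the e and f entries from above
print(np.allclose((VP * S).I, S.I * VP.I))   # True
[/code]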