Hello DarkStarSword, here is a screenshot showing what I mean by artefact. To the right of the tree on the left is a black shape.
And a screenshot showing the fire lighting.
Hmm, looks more like the typical Unity 4 lighting + surface halo combination mess than pure Unity 5 lighting mess (Some Unity 5 games look a lot cleaner than that - I think the ones that still look like that are using the legacy Unity 4 style lighting shaders, or maybe it's just to do with the surface type - not sure). Unfortunately, this means that without fixing both issues it will be hard to tell if either is fixed as they both interact with each other.
Let's start by concentrating on one of each type. For the surface halos, let's focus on the [s]ground[/s] rocky wall (Edit: On second thought, I think it might be easier to see the effect of fixing the wall), and for the light let's focus on the red point light that fills most of that scene, centered around the fire.
When you are hunting a surface halo in Unity you are looking for a vertex shader that either turns the surface a solid colour when skipped, or if none of them do that find one that makes the surface disappear. Hunt for any vertex shaders that have this effect on the wall and post them here (or if you want to jump ahead, see if you can work out where you need to add the stereo correction formula). Once this is fixed, the halo of nearby objects (such as the red outline of the tree) should no longer be visible on the wall (it will still be visible on other surfaces we haven't fixed yet, and the light and shadows still won't be aligned).
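For reference, the standard stereo correction in a surface vertex shader usually looks something like the sketch below. This is an illustration only - the output register names (o1 here) are assumptions and will differ between shaders:

```hlsl
// Sketch only: the usual pattern for fixing a surface halo in a
// vertex shader. o1 stands for whichever output carries an
// uncorrected copy of the position (the actual register differs per
// shader). The driver adjusts the SV_Position output by
// separation * (depth - convergence), so we apply the same formula
// to the copy so the two stay aligned:
float4 stereo = StereoParams.Load(0);
o1.x += stereo.x * (o1.w - stereo.y);
```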
For the light you will need to hunt both the vertex and pixel shaders - you are looking for a shader that removes all the light projected from that fire when it is skipped (if it just removes the shadows but leaves the light it's probably the wrong one). There might be several shaders that all have the same effect, so dump any that seem like they might be relevant and post them here.
If you have trouble identifying the right shader through hunting, try changing marking_mode to pink in the d3dx.ini - this can make it easier to find point light shaders in particular as turning them pink will render a large pink volume around the light covering all the scenery the light can reach.
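That is, something along these lines in the d3dx.ini (the exact section layout can differ slightly between 3DMigoto versions, so treat this as a sketch):

```ini
[Hunting]
; Enable shader hunting
hunting = 1
; Render the currently selected shader in solid pink instead of
; skipping it - point light volumes then show up as a large pink
; shape covering everything the light can reach:
marking_mode = pink
```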
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
What is that so-called artefact in the first screenshot then? I see them all over, so do I have to disable each individual one? It is only in one eye. What is a surface halo?
I had already found a vertex shader which disabled that whole red light. How do I post code in this forum?
I appreciate you devoting time to this and I will try, but maybe I have bitten off more than I can chew, especially since I can't use alt-tab and have to restart the game after every change, without even being able to skip the intro logos.
If you're referring to the black region to the right of the tree, that's just a halo. It looks a bit weird compared to most halos because it's in the lighting transfer in Unity, whereas most typical halos occur in transparent effects (which is why I say the halos interact with the lighting in Unity), but you can clearly see that in the left eye it matches the shape of the rock to its right, and in the right eye it matches the shape of the tree to its left.
It will either be on the surface behind the halo (the rocky wall), or the fog. My experience is that fog in Unity games is usually not rendered as a separate effect (with some exceptions), but rather rendered on the surface of whatever is behind it, so I'd bet either way it needs to be fixed in the wall surface shader.
You can post the shaders inside [code][/code] blocks on the forum. Yeah, I wouldn't usually recommend Unity as a good engine for a first fix by hand (if the template had worked it would have been a lot easier), but despite its challenges it is now a well understood engine, so it is a better choice than something new.
Another game with unskippable intro videos? How annoying - DreadOut and Life Is Strange were the same, and the time I've spent staring at their damn logos has probably added up to multiple hours by now. I wish devs would stop doing that.
What's the filename of that shader? Its contents match ca5cfc8e4d8b1ce5-vs_replace.txt exactly, so you should just be able to use the one from my template to fix that.
You will also need the corresponding pixel shader (should have the same visual appearance as the vertex shader while skipped) as part of the fix needs to go there as well.
That is the only one with the same name as yours. I did try replacing it, but it didn't seem to have any effect - though I didn't change a pixel shader.
If I understand correctly, the point position is modified by the stereo driver but the texture positions are not, so you have to do stereo corrections on the texture positions - is that correct?
What do you have to change in the pixel shaders?
OK, this is my progress so far. I can't find the reason for the halo to the right of the rock in the first screenshot or the halo to the left of the tree in the second screenshot. What exactly causes halos?
[quote="tonka"]That is the only one with the same name as yours. I did try replacing it, but it didn't seem to have any effect - though I didn't change a pixel shader.[/quote]
Right, that's because Unity lights need fixes in both the vertex and pixel shaders, otherwise they will still be broken. The vertex shader contains the hard part of the fix, but you can just use the template for that as it usually doesn't need to be changed. The pixel shaders need a much simpler fix, but they keep changing with new Unity versions and game updates so I don't have them all in my template.
[quote]If I understand correctly, the point position is modified by the stereo driver but the texture positions are not, so you have to do stereo corrections on the texture positions - is that correct?[/quote]Correct. A "texture coordinate" is really just any old value passed from one stage of the rendering pipeline to another, so the driver has no way to know whether it should be adjusted or not, or how. Don't pay too much attention to the name - in some cases it might actually be a texture coordinate, but most of the time it isn't.
In the case of Unity's lighting shaders there are two "texture coordinates" - TEXCOORD0 is used to pass a copy of the screen position through to the pixel shader, and TEXCOORD1 is used to pass a ray in view-space coordinates through. Both of these need to be adjusted in the vertex shader, but only if the NVIDIA driver adjusted the output position - which is why I have checks in there to work out whether or not it's a full screen pass (the driver won't have adjusted the output position when it's full screen).
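A rough sketch of that vertex shader logic. Everything here is illustrative - the register names, and in particular the full screen heuristic, are assumptions, not the template's exact code (the real template also derives the fov term from the inverse projection matrix):

```hlsl
// Sketch of the Unity lighting vertex shader fix. Assumed names:
// o0 = SV_Position, o1 = TEXCOORD0 (copy of the screen position),
// o2 = TEXCOORD1 (view-space ray).
float4 stereo = StereoParams.Load(0);
if (o0.w != 1) {   // heuristic: a full screen pass leaves w == 1,
                   // and the driver won't have adjusted its position
    // Keep the screen-position copy aligned with the driver-adjusted
    // position:
    o1.x += stereo.x * (o1.w - stereo.y);
    // o2 (the view-space ray) needs the view-space variation of the
    // same correction - see the template for the exact form.
}
```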
[quote]What do you have to change in the pixel shaders ?[/quote][url]https://forums.geforce.com/default/topic/766890/3d-vision/bo3bs-school-for-shaderhackers/post/4728373/#4728373[/url]
[quote="tonka"]I think that this is the pixel shader.[/quote]That looks right - try this:
[code]Texture2D<float4> t3 : register(t3);
TextureCube<float4> t2 : register(t2);
Texture2D<float4> t1 : register(t1);
Texture2D<float4> t0 : register(t0);
SamplerState s3_s : register(s3);
SamplerState s2_s : register(s2);
SamplerState s1_s : register(s1);
SamplerState s0_s : register(s0);
cbuffer cb4 : register(b4)
{
float4 cb4[4];
}
cbuffer cb3 : register(b3)
{
float4 cb3[26];
}
cbuffer cb2 : register(b2)
{
float4 cb2[2];
}
cbuffer cb1 : register(b1)
{
float4 cb1[8];
}
cbuffer cb0 : register(b0)
{
float4 cb0[11];
}
Texture2D<float4> StereoParams : register(t125);
Texture1D<float4> IniParams : register(t120);
void main(
float4 v0 : SV_POSITION0,
float4 v1 : TEXCOORD0,
float3 v2 : TEXCOORD1,
out float4 o0 : SV_Target0,
// New input passed from vertex shader with UNITY_MATRIX_IT_MV[0].x:
float fov : TEXCOORD2
)
{
float4 r0,r1,r2,r3,r4,r5,r6;
uint4 bitmask, uiDest;
float4 fDest;
r0.xyz = cb0[9].xyz * cb0[3].xyz;
r0.xz = r0.xx + r0.yz;
r0.y = r0.y * r0.z;
r0.x = cb0[9].z * cb0[3].z + r0.x;
r0.y = sqrt(r0.y);
r0.y = dot(cb0[3].ww, r0.yy);
r0.x = r0.x + r0.y;
r0.yz = v1.xy / v1.ww;
// This shader lumped two samples close together, which might make it harder to
// work out which one is the Z buffer. I happen to know that it will be t0 - I
// have scripts that can extract information from Unity to work this out.
r1.xyzw = t0.Sample(s0_s, r0.yz).xyzw;
r2.xyzw = t3.Sample(s3_s, r0.yz).xyzw;
// The value from the Z buffer is being scaled to match world depth:
r0.y = cb1[7].x * r1.x + cb1[7].y;
r0.y = 1 / r0.y;
r0.z = cb1[5].z / v2.z;
r1.xyz = v2.xyz * r0.zzz;
// And now the scaled depth value is being multiplied by a three dimensional
// coordinate (the view-space ray from the vertex shader). This will give us a
// three-dimensional coordinate along the ray, which we will need to adjust:
r1.xyw = r1.xyz * r0.yyy;
// We note that the third component of the new coordinate is w (sometimes it's
// z), so for convenience let's call that depth:
float depth = r1.w;
// Load the stereo parameters as usual:
float4 stereo = StereoParams.Load(0);
float separation = stereo.x; float convergence = stereo.y;
// And finally adjust the ray, by subtracting the view-space variation of the
// stereo correction formula. The view-space variation includes part of the
// inverse projection matrix ("fov" here), which we calculated in the vertex
// shader and passed in using a spare texture coordinate (TEXCOORD2):
r1.x -= separation * (depth - convergence) * fov;
// This is something to keep an eye out for in the lighting pixel shaders.
// These next four lines are one of the two main ways matrix multiplications
// are done in shaders, and signifies that a coordinate is being changed from
// one coordinate system to another (in this case it's the _CameraToWorld
// matrix). It is very common for us to have to perform an adjustment before or
// after one of these:
r3.xyz = cb4[1].xyz * r1.yyy;
r3.xyz = cb4[0].xyz * r1.xxx + r3.xyz;
r3.xyz = cb4[2].xyz * r1.www + r3.xyz;
r3.xyz = cb4[3].xyz + r3.xyz;
r4.xyz = -cb0[8].xyz + r3.xyz;
r5.xyz = float3(0.0078125,0.0078125,0.0078125) + r4.xyz;
r5.xyzw = t2.Sample(s2_s, r5.xyz).xyzw;
r6.xyz = float3(-0.0078125,-0.0078125,0.0078125) + r4.xyz;
r6.xyzw = t2.Sample(s2_s, r6.xyz).xyzw;
r5.y = r6.x;
r6.xyz = float3(-0.0078125,0.0078125,-0.0078125) + r4.xyz;
r6.xyzw = t2.Sample(s2_s, r6.xyz).xyzw;
r5.z = r6.x;
r6.xyz = float3(0.0078125,-0.0078125,-0.0078125) + r4.xyz;
r6.xyzw = t2.Sample(s2_s, r6.xyz).xyzw;
r5.w = r6.x;
r0.z = dot(r4.xyz, r4.xyz);
r0.w = sqrt(r0.z);
r0.w = cb2[1].w * r0.w;
r0.w = 0.970000029 * r0.w;
r5.xyzw = r5.xyzw < r0.wwww;
r5.xyzw = r5.xyzw ? cb3[24].xxxx : float4(1,1,1,1);
r0.w = dot(r5.xyzw, float4(0.25,0.25,0.25,0.25));
r1.x = cb0[8].w * r0.z;
r0.z = rsqrt(r0.z);
r4.xyz = r4.xyz * r0.zzz;
r5.xyzw = t1.Sample(s1_s, r1.xx).xyzw;
r0.z = r5.x * r0.w;
r0.w = saturate(r0.z);
r5.xyz = -cb1[4].xyz + r3.xyz;
r3.xyz = -cb3[25].xyz + r3.xyz;
r1.x = dot(r3.xyz, r3.xyz);
r1.x = sqrt(r1.x);
r0.y = -r1.z * r0.y + r1.x;
r0.y = cb3[25].w * r0.y + r1.w;
r0.y = r0.y * cb0[10].z + cb0[10].w;
r0.y = saturate(1 + -r0.y);
r1.x = dot(r5.xyz, r5.xyz);
r1.x = rsqrt(r1.x);
r1.xyz = -r5.xyz * r1.xxx + -r4.xyz;
r1.w = dot(r1.xyz, r1.xyz);
r1.w = rsqrt(r1.w);
r1.xyz = r1.xyz * r1.www;
r2.xyz = r2.xyz * float3(2,2,2) + float3(-1,-1,-1);
r1.w = 128 * r2.w;
r2.w = dot(r2.xyz, r2.xyz);
r2.w = rsqrt(r2.w);
r2.xyz = r2.xyz * r2.www;
r1.x = dot(r1.xyz, r2.xyz);
r1.y = dot(-r4.xyz, r2.xyz);
r1.y = max(0, r1.y);
r0.z = r1.y * r0.z;
r2.xyz = cb0[9].xyz * r0.zzz;
r0.z = max(0, r1.x);
r0.z = log2(r0.z);
r0.z = r1.w * r0.z;
r0.z = exp2(r0.z);
r0.z = r0.z * r0.w;
r2.w = r0.z * r0.x;
o0.xyzw = r2.xyzw * r0.yyyy;
return;
}
[/code]
[quote="tonka"]OK, this is my progress so far. I can't find the reason for the halo to the right of the rock in the first screenshot or the halo to the left of the tree in the second screenshot. What exactly causes halos?[/quote]Good - looks like you got the light sorted. The remaining halos there will be on the surface of the tree trunk, the grass, the ground and the flame - these should follow much the same pattern as the rock surface you already fixed.
Halos are almost always caused by a texture coordinate that has not been stereo corrected to line up with the position. In most games this is restricted to semi-transparent effects that calculate their opacity from the depth buffer, and the opacity will be misaligned. Unity halos look a lot worse because they also have issues on opaque surfaces where the lighting becomes misaligned as it is transferred to the surface. Regardless, the pattern to fix halos is the same either way.
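In the common semi-transparent case, the misalignment can be seen (and fixed) where the uncorrected copy of the position is used to sample the depth buffer in the pixel shader. A sketch only - the register names and the assumption that t0 is the Z buffer are illustrative:

```hlsl
// Sketch: a transparent effect fades out by comparing its own depth
// against the scene depth sampled at a coordinate derived from an
// uncorrected copy of the position (v1 here, assumed). Applying the
// same correction the driver applied to the real position realigns
// the fade with the geometry, removing the halo:
float4 stereo = StereoParams.Load(0);
float4 pos = v1;
pos.x += stereo.x * (pos.w - stereo.y);
float2 coord = pos.xy / pos.w;              // per-eye screen coord
float sceneDepth = t0.Sample(s0, coord).x;  // t0 assumed Z buffer
```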
I have adjusted every shader on r0 after it is first filled, which I see is not correct for this shader. You must only make the adjustment after a sample, I assume. I had disabled the main shadows from the fire on the cabin, because when enabled I always get black shapes on the rock wall, as you see in the original screenshot. I don't know how to solve that.
I have modified 29 shaders for that scene, which is probably far too much.
How do I make an on/off mechanism in 3DMigoto as in lesson 3 of bo3b's school for helixmod?
It's terrible not being able to use alt-tab.
[quote="tonka"]I have adjusted every shader on r0 after it is first filled, which I see is not correct for this shader.[/quote]
Right, it's not always r0 - the compiler is free to pick any temporary variable it likes. It's better to learn to recognise the patterns based on things that it cannot change - things like the v* inputs, o* outputs, constant buffers, texture registers and so on. If the game provided headers it would be easier to identify these, but Unity strips them out (I have scripts that can extract them back out of the .asset files in Unity games).
[quote]You must only make the adjustment after a sample I assume.[/quote]Yes, and more specifically after the depth value has been scaled and multiplied by the ray, and before the matrix multiply to change coordinate systems.
[quote]I had disabled the main shadows from the fire on the cabin because when enabled I always get black shapes on the rock wall as you see in the original screenshot. I don't know how to solve that.
I have modified 29 shaders for that scene which is probably far too much.[/quote]Unity games do need a lot of shaders adjusted. Hard to say if 29 is right for this scene just by looking at it, but it's certainly possible. You might want to go back through any of the shaders you adjusted in case any early attempts at fixing the effects are causing new problems once the lights have been fixed properly.
[quote]How do I make an on/off mechanism in 3DMigoto as in lesson 3 of bo3b's school for helixmod?[/quote]
It's all documented with examples in the d3dx.ini. Personally I recommend a slight variation though to transition the fix on and off slowly over a period of time, which makes it a lot easier to see what is changing:
[code]
[KeyDebug]
Key = z
x = 1
type = toggle
transition = 1000
release_transition = 1000
[/code]
[code]
float4 params = IniParams.Load(0);
// multiply adjustment by the debug transition to show what the adjustment is
// doing when pressing the key, e.g.
r0.x += stereo.x * (r0.z - stereo.y) * params.x;
[/code]
[quote]It's terrible not being able to use alt-tab.[/quote]Personally I use a second laptop and edit the shaders over the network. Bo3b has mentioned that he sometimes uses discover mode, which works in a window - if you have a pair of red/blue glasses you can still see most of the issues.
Alienware M17x R4 w/ built in 3D, Intel i7 3740QM, GTX 680m 2GB, 16GB DDR3 1600MHz RAM, Win7 64bit, 1TB SSD, 1TB HDD, 750GB HDD
Pre-release 3D fixes, shadertool.py and other goodies: http://github.com/DarkStarSword/3d-fixes
Support me on Patreon: https://www.patreon.com/DarkStarSword or PayPal: https://www.paypal.me/DarkStarSword
I had already found a vertex shader which disabled that whole red light. How do I post code in this forum ?
I appreciate you devoting time to this and I will try but maybe I have bitten off more than I can chew especially since I can't use alt-tab and have to start the game again after every change without even being able to skip the intro logos.
It will either be on the surface behind the halo (the rocky wall), or the fog. My experience is that fog in Unity games is usually not rendered as a separate effect (with some exceptions), but rather rendered on the surface of whatever is behind it, so I'd bet either way it needs to be fixed in the wall surface shader.
Another game with unskippable intro videos? How annoying - DreadOut and Life Is Strange were the same and my time has probably added up to multiple hours just staring at their damn logos by now. I wish devs would stop doing that.
You will also need the corresponding pixel shader (should have the same visual appearance as the vertex shader while skipped) as part of the fix needs to go there as well.
If I understand correctly, the position is modified by the stereo driver but the texture coordinates are not, so you have to apply the stereo correction to the texture coordinates yourself - is that correct?
What do you have to change in the pixel shaders?
[code]Texture2D<float4> t3 : register(t3);
TextureCube<float4> t2 : register(t2);
Texture2D<float4> t1 : register(t1);
Texture2D<float4> t0 : register(t0);
SamplerState s3_s : register(s3);
SamplerState s2_s : register(s2);
SamplerState s1_s : register(s1);
SamplerState s0_s : register(s0);
cbuffer cb4 : register(b4)
{
float4 cb4[4];
}
cbuffer cb3 : register(b3)
{
float4 cb3[26];
}
cbuffer cb2 : register(b2)
{
float4 cb2[2];
}
cbuffer cb1 : register(b1)
{
float4 cb1[8];
}
cbuffer cb0 : register(b0)
{
float4 cb0[11];
}
Texture2D<float4> StereoParams : register(t125);
Texture1D<float4> IniParams : register(t120);
void main(
float4 v0 : SV_POSITION0,
float4 v1 : TEXCOORD0,
float3 v2 : TEXCOORD1,
out float4 o0 : SV_Target0)
{
float4 r0,r1,r2,r3,r4,r5,r6;
uint4 bitmask, uiDest;
float4 fDest;
r0.xyz = cb0[9].xyz * cb0[3].xyz;
r0.xz = r0.xx + r0.yz;
r0.y = r0.y * r0.z;
r0.x = cb0[9].z * cb0[3].z + r0.x;
r0.y = sqrt(r0.y);
r0.y = dot(cb0[3].ww, r0.yy);
r0.x = r0.x + r0.y;
r0.yz = v1.xy / v1.ww;
r1.xyzw = t0.Sample(s0_s, r0.yz).xyzw;
r2.xyzw = t3.Sample(s3_s, r0.yz).xyzw;
r0.y = cb1[7].x * r1.x + cb1[7].y;
r0.y = 1 / r0.y;
r0.z = cb1[5].z / v2.z;
r1.xyz = v2.xyz * r0.zzz;
r1.xyw = r1.xyz * r0.yyy;
r3.xyz = cb4[1].xyz * r1.yyy;
r3.xyz = cb4[0].xyz * r1.xxx + r3.xyz;
r3.xyz = cb4[2].xyz * r1.www + r3.xyz;
r3.xyz = cb4[3].xyz + r3.xyz;
r4.xyz = -cb0[8].xyz + r3.xyz;
r5.xyz = float3(0.0078125,0.0078125,0.0078125) + r4.xyz;
r5.xyzw = t2.Sample(s2_s, r5.xyz).xyzw;
r6.xyz = float3(-0.0078125,-0.0078125,0.0078125) + r4.xyz;
r6.xyzw = t2.Sample(s2_s, r6.xyz).xyzw;
r5.y = r6.x;
r6.xyz = float3(-0.0078125,0.0078125,-0.0078125) + r4.xyz;
r6.xyzw = t2.Sample(s2_s, r6.xyz).xyzw;
r5.z = r6.x;
r6.xyz = float3(0.0078125,-0.0078125,-0.0078125) + r4.xyz;
r6.xyzw = t2.Sample(s2_s, r6.xyz).xyzw;
r5.w = r6.x;
r0.z = dot(r4.xyz, r4.xyz);
r0.w = sqrt(r0.z);
r0.w = cb2[1].w * r0.w;
r0.w = 0.970000029 * r0.w;
r5.xyzw = r5.xyzw < r0.wwww;
r5.xyzw = r5.xyzw ? cb3[24].xxxx : float4(1,1,1,1);
r0.w = dot(r5.xyzw, float4(0.25,0.25,0.25,0.25));
r1.x = cb0[8].w * r0.z;
r0.z = rsqrt(r0.z);
r4.xyz = r4.xyz * r0.zzz;
r5.xyzw = t1.Sample(s1_s, r1.xx).xyzw;
r0.z = r5.x * r0.w;
r0.w = saturate(r0.z);
r5.xyz = -cb1[4].xyz + r3.xyz;
r3.xyz = -cb3[25].xyz + r3.xyz;
r1.x = dot(r3.xyz, r3.xyz);
r1.x = sqrt(r1.x);
r0.y = -r1.z * r0.y + r1.x;
r0.y = cb3[25].w * r0.y + r1.w;
r0.y = r0.y * cb0[10].z + cb0[10].w;
r0.y = saturate(1 + -r0.y);
r1.x = dot(r5.xyz, r5.xyz);
r1.x = rsqrt(r1.x);
r1.xyz = -r5.xyz * r1.xxx + -r4.xyz;
r1.w = dot(r1.xyz, r1.xyz);
r1.w = rsqrt(r1.w);
r1.xyz = r1.xyz * r1.www;
r2.xyz = r2.xyz * float3(2,2,2) + float3(-1,-1,-1);
r1.w = 128 * r2.w;
r2.w = dot(r2.xyz, r2.xyz);
r2.w = rsqrt(r2.w);
r2.xyz = r2.xyz * r2.www;
r1.x = dot(r1.xyz, r2.xyz);
r1.y = dot(-r4.xyz, r2.xyz);
r1.y = max(0, r1.y);
r0.z = r1.y * r0.z;
r2.xyz = cb0[9].xyz * r0.zzz;
r0.z = max(0, r1.x);
r0.z = log2(r0.z);
r0.z = r1.w * r0.z;
r0.z = exp2(r0.z);
r0.z = r0.z * r0.w;
r2.w = r0.z * r0.x;
o0.xyzw = r2.xyzw * r0.yyyy;
return;
}
[/code]
Right, that's because Unity lights need fixes in both the vertex and pixel shaders, otherwise they will still be broken. The vertex shader contains the hard part of the fix, but you can just use the template for that as it usually doesn't need to be changed. The pixel shaders need a much simpler fix, but they keep changing with new Unity versions and game updates so I don't have them all in my template.
Correct. A "texture coordinate" is really just any old value passed from one stage of the rendering pipeline to another so the driver has no way to know whether they should be adjusted or not, or how they should be adjusted. Don't pay too much attention to the name - in some cases it might actually be a texture coordinate, but most of the time it isn't.
In the case of Unity's lighting shaders there are two "texture coordinates" - TEXCOORD0 is used to pass a copy of the screen position through to the pixel shader, and TEXCOORD1 is used to pass a ray in view-space coordinates through. Both of these need to be adjusted in the vertex shader, but only if the nvidia driver adjusted the output position - which is why I have checks in there to work out if it's a full screen pass or not (the driver won't have adjusted the output position when it's full screen).
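As a rough sketch of what that looks like in the vertex shader (names are illustrative; the full screen check and output registers vary between shaders, which is why the template exists):

[code]float4 s = StereoParams.Load(0);   // s.x = separation, s.y = convergence
if (o0.w != 1.0) {                 // crude full-screen-pass heuristic
    // TEXCOORD0 carries a copy of the clip-space position - give it the
    // same correction the driver applies to o0 so the two stay in sync:
    o1.x += s.x * (o0.w - s.y);
    // TEXCOORD1 (the view-space ray) needs a matching adjustment derived
    // from the projection matrix - the hard part the template handles,
    // omitted here.
}[/code]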
https://forums.geforce.com/default/topic/766890/3d-vision/bo3bs-school-for-shaderhackers/post/4728373/#4728373
Halos are almost always caused by a texture coordinate that has not been stereo corrected to line up with the position. In most games this is restricted to semi-transparent effects that calculate their opacity from the depth buffer, and the opacity will be misaligned. Unity halos look a lot worse because they also have issues on opaque surfaces where the lighting becomes misaligned as it is transferred to the surface. Regardless, the pattern to fix halos is the same either way.
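In 3DMigoto HLSL terms the pattern looks something like this (a conceptual sketch; `depthTex`, `samp` and `scale` are stand-ins for whatever the particular shader uses):

[code]float4 s = StereoParams.Load(0);            // s.x = separation, s.y = convergence
float depth = depthTex.Sample(samp, uv).x;  // depth the effect sampled (assumption: linearised)
// Shift the coordinate by the same offset the driver applied to the
// geometry, so the effect lines up with the surface again:
uv.x -= s.x * (depth - s.y) * scale;        // scale is shader-specific[/code]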
I have modified 29 shaders for that scene which is probably far too much.
How do I make an on/off mechanism in 3Dmigoto, like the one in lesson 3 of bo3b's school for Helix Mod?
It's terrible not being able to use alt-tab.
Right, it's not always r0 - the compiler is free to pick any temporary variable it likes. It's better to learn to recognise the patterns based on things that it cannot change - things like the v* inputs, o* outputs, constant buffers, texture registers and so on. If the game provided headers it would be easier to identify these, but Unity strips them out (I have scripts that can extract them back out of the .asset files in Unity games).
Yes, and more specifically after the depth value has been scaled and multiplied by the ray, and before the matrix multiply to change coordinate systems.
Unity games do need a lot of shaders adjusted. Hard to say if 29 is right for this scene just by looking at it, but it's certainly possible. You might want to go back through any of the shaders you adjusted in case any early attempts at fixing the effects are causing new problems once the lights have been fixed properly.
It's all documented with examples in the d3dx.ini. Personally I recommend a slight variation though to transition the fix on and off slowly over a period of time, which makes it a lot easier to see what is changing:
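For example (a sketch; the key, section name and timings are illustrative - see the commented [Key] examples in the stock d3dx.ini):

[code][KeyToggleFix]
; Cycle IniParams x between 0 and 1, fading over half a second so it is
; easy to see exactly what the fix changes:
Key = F1
x = 0, 1
type = cycle
transition = 500
transition_type = cosine[/code]

Then, in each fixed shader, load IniParams (register t120) and scale the correction by its x component so the fix fades in and out with the key.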
Personally I use a second laptop and edit the shaders over the network. Bo3b has mentioned that he sometimes uses discover mode which works in a window - if you have a pair of red/blue glasses you can still see most of the issues.