Hey 3d4dd. I apologize if you've already tried this. But for the newer Helixmod DLLs: try adding the line [color="orange"]UseExtInterfaceOnly=true[/color] under the [General] section in the Dx9Settings file. The downside to that line is that it makes all textures in the texture log show up as "0xFFFFFFFF".
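In other words, the top of Dx9Settings.ini would contain something like this (a minimal sketch - any other lines in the section stay as they are):[code]
[General]
UseExtInterfaceOnly=true
[/code]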
I'm not sure about the shadow issue you're having, though.
The way to tell if that is necessary is to look for "Direct3DCreate9Ex Created" in the log. I'm surprised it would affect the texture hashes, but thinking about it that does match my experience when I had to use it in Euro Truck Simulator 2 as well (IIRC not all textures were like this - the "xxxxx" texture drawn on roads you couldn't drive down had a valid hash, while the sky was FFFFFFFF - based on that I thought it was only textures that were dynamically updated).
Edit: I also had to use CalcTexCRCatUpdate=true in that game - might have been related, I can't really recall.
Edit: Grass and roads had valid hashes as well (they used the same shader as the sky, so I had to rely on this to fix the skybox)
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
Thank you very much for your answers! I have to go to bed (11:30 pm for me), so I could only try UseExtInterfaceOnly=true. Now the current debug.dll works with the game :) I will take a look at the shadows and the textures tomorrow as soon as possible.
My original display name is 3d4dd - for some reason Nvidia changed it..?!
I did some experiments with the changed PS for the shadows (PSD2309127). Using the code you suggested has an effect on the shadows: it looks as if they are "inverted" (shadows for the left eye are closer to the objects displayed for the right eye and vice versa), see [url]http://photos.3dvisionlive.com/3d4dd/image/568d0092e7e564fe1a0000bc/[/url]. v2.x is not 0, as using 0 results in a different image, so passing the value from VS 8A77467B seems to work. I tried to correct it by multiplying v2.x by a correction factor like -0.5, or by using a constant register instead of v2.x, but then I had the problem that I would need different values depending on the depth of the objects. With -0.4 as the constant register the shadow is aligned with the character's feet but not for distant characters, see [url]http://photos.3dvisionlive.com/3d4dd/image/568d0051e7e564102800003a/[/url]. With -0.6 the shadow is aligned for distant objects but not for the foreground, see [url]http://photos.3dvisionlive.com/3d4dd/image/568d00e9e7e56405550000ba/[/url]. So we need an additional correction that changes the "constant register" depending on the distance of the objects...
Regarding the textures: "Direct3DCreate9Ex Created" does appear in the log.
[quote="DarkStarSword"]See if the above experiment pans out first. If you want to add it to a shared account feel free - then I could take a closer look at it. We also now have a bit of a budget for shared games it seems, which we could quite easily put towards this game so you don't have to spend your money on it - either way contact pirateguygrush to arrange this (and I think you are deserving of access yourself)[/quote]
As the game isn't very expensive, I would be happy to gift it to a shared account. But I will only do it if you really are interested in taking a closer look and if you think a shadow fix won't take much time. It shouldn't be an obligation, and I can understand that there are games (or even game engines) with a much higher priority than FFLR. So feel free to tell me if you really want to spend the time to help fix this game ;)
My original display name is 3d4dd - for some reason Nvidia changed it..?!
Those observations and screenshots tell me that we haven't found the right pattern yet. Here are a few more experiments for you to try (alone and in combination with each other):
1. Move this line:[code]mad r0.zw, v0.z, -r1.x, -c0.xyyw[/code]
above the stereo correction and see what happens if you use r0.z or r0.w as the depth instead of r2.z
2. Try dividing by v2.x instead of multiplying (add an 'rcp r31.z, v2.x' on the line above it, then use r31.z)
3. Try adding the stereo correction formula instead of subtracting it (remove the - from -r31.w on the final line of the stereo correction)
4. Try adding the convergence instead of subtracting it (this is based on one pattern I still can't explain in the Unity 4 version of Stranded Deep)
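For reference, here's roughly the block those experiments are permuting (a sketch only - I'm assuming the register numbers, the output register, and the def c220 line match what you already have from the shadow fix):[code]
// r2.z = depth (experiment 1: try r0.z or r0.w here after moving the mad)
texldl r31, c220.z, s13    // stereo params: r31.x = separation, r31.y = convergence
add r31.w, r2.z, -r31.y    // depth - convergence  (experiment 4: make this +r31.y)
mul r31.w, r31.w, r31.x    // * separation
mul r31.w, r31.w, v2.x     // * v2.x               (experiment 2: rcp r31.z, v2.x and use r31.z)
add r2.x, r2.x, -r31.w     // subtract correction  (experiment 3: make this +r31.w)
[/code]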
I'd also like you to take a look at what matrices are available in other shaders - check the headers for any constant registers with size 4. The names vary, but you're looking for something like a projection or inverse projection matrix.
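In case it's not clear what to look for: the header is the comment block at the top of each dumped shader, and a matrix shows up as a constant register with size 4, something like this (the name here is made up - it varies per game):[code]
// Parameters:
//   float4x4 g_mInvProjection;
// Registers:
//   Name             Reg   Size
//   g_mInvProjection c8    4
[/code]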
[quote="3d4dd"]As the game isn't very expensive I would be happy to gift it to a shared account. But I will only do it if You really are interested in taking a closer look at it and if You guess that a shadow fix won't take much time. It shouldn't be an obligation and I can understand that there are games (or even game engines) with much higher priority than FFLR. So feel free to tell me if You really want to spend the time to help fixing this game ;)[/quote]I'm happy to take a look at it, but right now I have no idea how long it would take - could be minutes, could be a day or two, could be never. I do have a few other things I'm working on as well (not the least of which is Batman's weird grid based lighting shader - now, that's a tough one). Right now I'm getting you to try some of the first experiments I would try to see if we luck out on the right pattern, but I guess I've developed a bit of an intuition now so at some point I'd just have to look at it and see how it behaves under various experiments for myself.
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
[quote="DarkStarSword"] I do have a few other things I'm working on as well (not the least of which is Batman's weird grid based lighting shader - now, that's a tough one). Right now I'm getting you to try some of the first experiments I would try to see if we luck out on the right pattern, but I guess I've developed a bit of an intuition now so at some point I'd just have to look at it and see how it behaves under various experiments for myself.[/quote]
Does the Batman game use compute shaders for the grid lighting system? I'm asking because I've encountered a similar problem in Star Wars: Battlefront.
As a matter of fact, all Frostbite engine games use this lighting system. In most games you can disable it using WorldRenderer.LightCsPathEnabled = 0 and the lighting system will just swap to a shader-based one.
In SW: Battlefront, however, we lose some of the lighting that way.
I did some searching and the light is created using compute shaders. I wasn't able to find any VS+PS that controls the lighting, but I did find a CS that, once disabled, removes the lighting...
I'm curious whether the "grid" method isn't the same as the tile method used in SW:B - from what I've read it is the same method, but I could be wrong ;) hence the curiosity!
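For anyone who wants to try it: in other Frostbite games the command goes in a user.cfg file placed next to the game's executable - I'm assuming SW:B accepts the same mechanism, so treat this as a sketch:[code]
WorldRenderer.LightCsPathEnabled 0
[/code]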
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
Batman is actually [s]UE4[/s] UE3, but it's not following the same pattern as other [s]UE4[/s] UE3 games I've looked at so far.
You are correct that it is a compute shader (pretty sure all grid lighting shaders will be, just because of how the technique works). Correcting the placement of the lights is easy enough, but convincing each tile in the grid that it should even try to draw the lights is harder. We've had some success, but not enough for it to be playable yet.
I was also chasing a red herring for a while thinking that it was a standard UE4 shader (there is a grid lighting shader in the UE4 source code), but things just weren't adding up, and once I wrote some scripts to identify its name from the game files I found it was in fact a custom shader specific to that game.
The extra logging I added to the latest 3DMigoto was all aimed at getting more info about that shader - specifically tracking where the various structured buffers came from that hold (I believe) info about which light is in which part of the grid. I was hoping these would turn out to have been calculated by a prior shader and we could add a stereo correction there, but the logging showed that they were populated from the CPU via Map() calls.
That particular option you mentioned doesn't seem to exist in any of the .ini files, though it is possible that there may be another option to do something similar.
Edit: Wait a minute... this is UE3? WTF? I just assumed it would be UE4 since it's DX11. Must be a pretty heavily modified UE3 since it is nothing like any other UE3 or UE4 game I've looked at.
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
I think I tried this without success, but perhaps I did something wrong - in one of the Batman config files (Engine.ini, DefaultEngine.ini, User.ini, DefaultUser.ini, etc.) there is a parameter specifying a light cutoff distance or something, with a default value of "2000". I changed that to 5000 and it did nothing, so perhaps changing it to, say, 10000000 might force every light to be included in every tile, making the whole "picking which ones" step irrelevant. It may decrease performance, but who cares - it's only one shader out of approximately 500 per frame. Of course there's a chance this parameter has nothing to do with it, but it still suggests the idea that we might find a parameter (say, in the constant buffers) used in some depth comparison that we can manually change?
Those settings in config files have maximum values; you can't just put in 10000000 willy-nilly and expect it to work. I've no idea what happens when you enter a non-viable value - it might fall back to the base game setting, the minimum engine setting, the maximum setting, or the last viable setting entered.
Most likely the last viable one.
If you can access the console panel in-game, it sometimes lets you query a setting's range by hitting "enter" with no value after the cvar or command.
So if the cvar was light_cutoffdistance=1200, you would type light_cutoffdistance and then press enter. The result might be something like min 380, max 1860, so you would see that 100000000 simply isn't accepted.
If you are able to access the console, you might also find that the variables accessible from the console are limited in the released retail version, with some only reachable through the config file.
[quote="D-Man11"]Those settings in config files have maximum settings, you can't willy nilly put 10000000 and have the value work. I've no idea what happens when you put in a non viable value, it might default to the base game setting, the minimum engine setting, maximum setting or last viable setting entered.
Most likely, last viable.
If you can access the console panel in-game, if it permits, you can sometimes query the max setting by hitting "enter" without a value following the cvar or command.
So if the cvar was light_cutoffdistance=1200, you would type light_cutoffdistance and then press enter. The result might be something like min 380, max 1860, so you would see that 100000000 simply isn't accepted.
If you are able to access the console, you might find that the variables access in the console is limited in the released retail version and only accessible in the config file.[/quote]
Yeah, that makes sense - I tried a big number and it did not work anyway :-) This is a really tough problem though, especially if it becomes the norm for newer games. I had to disable the lighting compute shaders for Fallout 4 in its config file because they did not work - given this Batman problem, it might be the same issue (it was clipping), so I could perhaps have another look at those shaders...
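(For what it's worth, IIRC the Fallout 4 switch was something like the following in Fallout4Prefs.ini - the setting name is from memory, so double-check it before relying on it:)[code]
[Display]
bComputeShaderDeferredTiledLighting=0
[/code]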
I don't have the game, but a quick look at Steam suggests that it's using Unreal 3. So you might have a look at their docs to see if you can find anything of use.
http://udn.epicgames.com/Three/SystemSettings.html
http://udn.epicgames.com/Three/ConfigurationFiles.html
http://udn.epicgames.com/Three/ConsoleCommands.html
Also you should be able to add -nologo to the Steam Launch Options to skip the intros.
It is UE3, but it is a very heavily modified UE3, and the lighting they are using is entirely a custom job for this specific game, so it's unlikely there will be a standard configuration option for it (a custom one maybe).
That said, I was able to crack one of the two clipping issues on the tile lighting last night and I'm 99.9% confident that I know how to solve the other, so at the moment it's looking very promising :)
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
[quote="DarkStarSword"]It is UE3, but it is a very heavily modified UE3, and the lighting they are using is entirely a custom job for this specific game, so it's unlikely there will be a standard configuration option for it (a custom one maybe).
That said, I was able to crack one of the two clipping issues on the tile lighting last night and I'm 99.9% confident that I know how to solve the other, so at the moment it's looking very promising :)[/quote]
That sounds very promising! I'm wondering whether we can use the same techniques on the Frostbite 3 engine and its tile-based lighting as well ;)
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
[quote="DarkStarSword"]
3. Try adding the stereo correction formula instead of subtracting it (remove the - from -r31.w on the final line of the stereo correction)
4. Try adding the convergence instead of subtracting it (this is based on one pattern I still can't explain in the Unity 4 version of Stranded Deep)
[/quote]
The combination of both was the solution - now the shadows are placed correctly :) Thank you so much for the help!
Next I will try to fix the dark spot under the character's feet that simulates "ambient occlusion" inside buildings (it also wouldn't hurt to simply disable it). I will also try to separate the textures for the HUD elements from some ground effects, as they use the same VS and moving the HUD to depth messes up those effects.
My original display name is 3d4dd - for some reason Nvidia changed it..?!
Like the dynamic shadows, the ambient occlusion is 2D: http://photos.3dvisionlive.com/3d4dd/image/568f9aebe7e564e85900014d/. This should be the shader pair: VSDC043178 + PS903F9CB3. I tried to transfer the solution for the dynamic shadows (proving along the way that I didn't understand how the shadow fix really works ^^) and added the following code:
VSDC043178:[code]
// $s_projectionReciprocal c1 1
...
dcl_texcoord2 o3.xyz
// Add a new output to pass s_projectionReciprocal to the pixel shader:
dcl_texcoord3 o4
mov o4, c1
mul r0.xyz, c13, v1.y
...
[/code]
PS903F9CB3:[code]
...
dcl_2d s0
// New input with s_projectionReciprocal from the vertex shader:
dcl_texcoord3 v3
// Helix Mod Stereo Params:
dcl_2d s13
def c220, 0, 1, 0.0625, 0.5
frc r0.xy, vPos
...
mov r1.xyz, v0
// view-space stereo correction from r1:
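// (s13 stereo params: r31.x = separation, r31.y = convergence -
//  the convergence is added and the correction added back, matching the working shadow fix)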
texldl r31, c220.z, s13
add r31.w, r1.z, r31.y
mul r31.w, r31.w, r31.x
mad r1.x, r31.w, v3.x, r1.x
mad r0.yzw, v2.xxyz, r0.x, -r1.xxyz
...
[/code]
As you can see, the ambient occlusion gets separated http://photos.3dvisionlive.com/3d4dd/image/568f9b2ee7e5642f3e000146/ but is at the wrong depth. I tried -r31.y and -r31.w, and used a constant register instead of v3.x, which had effects but didn't fix the issue. Replacing v3.x with 0 results in 2D, so sending a value from the VS to the PS seems to work, as v3.x is not 0. Or should I apply the correction to the registers connected to v2 (dcl_texcoord2 v2.xyz)? But before I continue with experiments I just wanted to ask whether my approach is somewhat useful or whether I'm on the wrong track.
My original display name is 3d4dd - for some reason Nvidia changed it..?!