How to fix/disable shaders in games (DLL, guide and fixes).
Thanks everyone for the fast update!
There are indeed many shaders using the VS DF17CB6C (in Lost Planet), so this is definitely the reason why my HUD gets rendered in 3D plus weird borders.
I used the method described in DHR's post and the method posted at http://helixmod.blogspot.ca/2012/03/dlls-update.html in order to find the specific texture CRC and apply the fix only to that one. After creating a DX9Settings.ini file I inserted this:
[VSDF17CB6C]
CheckTexCRC = true
Sadly, after trying many times, the only things in my TEXTURESLOG.txt file are BeginScene and EndScene, endlessly. I've also tried this (from eqzitara):
[VSDF17CB6C]
CheckTexCRC = true
VBOffsetList = 0;
ValForDefined = 0
ValNotDefined = 1
TexCounterReg = 251
UseDefinedOnly = false
DefinedTexturesVS =
[VBDF17CB6C.0]
PointsList = 554;
Still not working...
I wonder if I need something special in the general section of the DX9Settings.ini file? I also want to mention that my shader BDF17CB6C is already in the ShaderOverride folder. What did I miss?
PS: the bCalcTexCRCatStart method apparently does not work for LP; the game literally crashes when that line is active.
The crashing may be due to the Steam Overlay dll issue ([url=https://forums.geforce.com/default/topic/758587/3d-vision/steam-update-breaks-helixmod-on-some-games/post/4255351/#4255351]discussion/toggle script[/url]). The few times I've tried the logging method, it only worked when I used the [url=https://s3.amazonaws.com/HelixMods/*Mainfiles/Debug.zip]120305[/url] (03/05/2012) release of the debug dll.
@mercier1200,
Post the shader with your fix... I wrote before that you need to add some lines (an "if" statement) to the VS.
If you ask me, only adding the line "CheckTexCRC = true" to [VSDF17CB6C] should work at least to find the TexCRC related to that VS, but the same thing happens for me (BeginScene and EndScene endlessly) if I don't add the "if" statement to the VS.
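For reference, the "if" statement DHR mentions tends to look something like the sketch below in the vertex shader ASM. This is a hypothetical fragment, not code from an actual Lost Planet fix: c251 corresponds to TexCounterReg = 251 from the ini above (the DLL writes ValForDefined or ValNotDefined into that constant depending on whether one of the listed texture CRCs is bound for the draw call), and the values in c250 are placeholders.

```
// hypothetical sketch -- registers/values must match your DX9Settings.ini
def c250, 0.0, 1.0, 0.0, 0.0   // c250.x = ValForDefined, c250.y = ValNotDefined
...
mov r30, c251                  // flag written by the DLL (TexCounterReg = 251)
if_eq r30.x, c250.x            // HUD texture detected for this draw call
  mov o0, r0                   // e.g. output the position without stereo offset
else
  // normal (stereo-corrected) output path
endif
```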
Now here is an interesting question.... mostly because I don't (yet) fully understand how DirectX 9/10 hangs together...
The way I do my shader intercept in OpenGL is at the shader source code loading level (so I can see the full source code before it is compiled, linked, and off-loaded to the GPU).
I was wondering if this is the case in DirectX as well... or do DirectX games come with the shaders already built? I am asking because I guess it would be easier to modify C-like source code rather than ASM.
In OpenGL it goes like this:
- After you create the OpenGL Context and stuff:
- Create a Shader Program
- Create a Vertex/Pixel Shader
- Load the shader source for the Vertex & Pixel Shaders (C code)
- Compile the Vertex/Pixel shaders -> (You get compiled shader objects)
- Link the Vertex/Pixel shaders to the Shader Program.
- (Optional step) Delete the Vertex/Pixel Shaders - this frees the shader objects from memory, since they are no longer needed once linked into the Shader Program.
- Use the Shader Program when needed
Can somebody make a quick explanation(Like I did above for OpenGL) on how the whole DirectX (shader-wise) hangs together?
Thanks ^_^
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
I can tell you how 3Dmigoto does this operation. Not sure about Helix, but it's likely similar.
In DirectX games, the shaders almost always come already built. This is the Microsoft recommendation, so we generally only have binary shader objects that are passed in from the game. Sometimes the game will compile at launch, but it will still send the binary shader info in the same way, so it's simpler to just catch the binary to get both types.
I haven't written a game directly, and normally think about this from the perspective of the wrapper on a given game, which is slightly different than how I'd think about it from the game perspective.
Because of that, I just catch the calls that the game makes. So for example:
1) CreateDevice called to set up the rendering environment.
2) Numerous CreatePixelShader, CreateVertexShader calls passing in binary shader code.
3) During rendering, specific shaders returned from those Create calls are activated, in turn.
4) Draw/Endscene/Present call to flip drawn back buffer to show.
So, it's pretty similar to your OpenGL path, but minus the compilation stage. In the 3Dmigoto case, it actually does do the compilation, because it previously decompiled the binary back into HLSL. In Helix's case, it reassembles the shader from ASM text, because it previously disassembled the binary. But in both cases, only for shaders that are overridden, not every shader.
Does that give you enough info?
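To make that call-interception step concrete, here is a minimal C++ sketch of the wrapper pattern described above. Every type here is a stand-in (a real wrapper implements the actual IDirect3DDevice9 interface from d3d9.h and forwards all of its methods); only the shape of the hook is meant to be representative.

```cpp
#include <string>
#include <vector>

// Stand-in for a compiled shader object (the real call returns an
// IDirect3DVertexShader9*).
struct ShaderBlob { std::string bytecode; };

// Stand-in for the real device interface, reduced to one call.
struct Device {
    virtual ~Device() {}
    virtual ShaderBlob* CreateVertexShader(const std::string& bytecode) {
        return new ShaderBlob{bytecode};
    }
};

// The wrapper: same interface as the device, records (or replaces) the
// shader blob the game passes in, then forwards to the real device.
struct WrapperDevice : Device {
    Device* real;
    std::vector<std::string> seen;  // a real wrapper stores CRC hashes here
    explicit WrapperDevice(Device* r) : real(r) {}
    ShaderBlob* CreateVertexShader(const std::string& bytecode) override {
        seen.push_back(bytecode);   // log it; a fix DLL could instead swap in
                                    // a patched blob from its override folder
        return real->CreateVertexShader(bytecode);
    }
};
```

The game only ever talks to the wrapper, so every CreateVertexShader/CreatePixelShader call (step 2 above) passes through it, which is where logging and shader replacement happen.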
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
[quote="bo3b"]I can tell you how 3Dmigoto does this operation. Not sure about Helix, but it's likely similar.
In DirectX games, the shaders almost always come already built. This is the Microsoft recommendation, so we generally only have binary shader objects, that are passed in from the game. Sometimes, the game will compile at launch, but they'll still also send the binary shader info in the same way, so it's simpler to just catch the binary to get both types.
I haven't written a game directly, and normally think about this from the perspective of the wrapper on a given game, which is slightly different than how I'd think about it from the game perspective.
Because of that, I just catch the calls that the game makes. So for example:
1) CreateDevice called to set up the rendering environment.
2) Numerous CreatePixelShader, CreateVertexShader calls passing in binary shader code.
3) During rendering, specific shaders returned from those Create calls are activated, in turn.
4) Draw/Endscene/Present call to flip drawn back buffer to show.
So, it's pretty similar to your OpenGL path, but minus the compilation stage. In the 3Dmigoto case, it actually does do the compilation, because it previously Decompiled the binary back into HLSL. In Helix case, it will reassemble the shader from ASM text, because it previously Disassembled the binary. But in both cases, only for shaders that are overridden, not every shader.
Does that give you enough info?[/quote]
Big thanks for all the information ;)) I had a feeling this was the case (with the pre-compiled object being shipped with the application).
Don't get me wrong or anything ;)) I haven't worked with PS/VS 1,2,3 assembly language before, but I have worked with HLSL and GLSL quite a lot, and I was just wondering about the possibility of having the original code and making all the fixes/changes there ;)) I expected it to be like this, since I do the same thing for Surround fixes (modify the code in x86/x64 ASM, since the game's source code is not "exactly shipped" either) ;))
Again, big thanks for this information ;)) Guess I'll have to learn the DirectX ASM shader language more to actually be able to modify some of the shaders ^_^;))
[quote="helifax"]Again big thanks for this information;)) Guess I'll have to learn DirectX ASM shader Language more to actually be able to modify some of the shaders ^_^;))[/quote]Well, only in the DX9 cases. In DX11 with 3Dmigoto it's all HLSL.
Plus, like I say in School, you don't really need to know the language (either ASM or HLSL), just recognize some patterns. At least until much, much later.
I know I should be patient and wait until I have learned enough at bo3b's school - but I'm so eager to play this game in S3D: [b]Mind: Path to Thalamus[/b] ([url]http://store.steampowered.com/app/296070/[/url]).
This game really offers mind-blowing environments and I can't wait to explore them in proper S3D. Good news: it is Dx9, uses UR3, has AllowNvidiaStereo3d=True, and has fewer issues than other UR3 games (no issues with bloom, haloes, fog, skybox, etc.). It only has 2 major issues: dynamic shadows and light shafts (god rays). The dynamic shadows are not completely messed up but not exactly at the right depth (see http://photos.3dvisionlive.com/3d4dd/image/53ff4bedd475fec94a0000c2/). "Remember me" had the same issue and Helix could fix it using a lua script. Unfortunately I don't know how to adapt this script to another game. The other issue is the light shafts, which are very prominent in this game. Although the sky is at correct depth, the sun and the moon are too close (http://photos.3dvisionlive.com/3d4dd/image/53ff4e6fd475fe5a6a000125/). And it seems that their wrong position is used to calculate the light shafts (see http://photos.3dvisionlive.com/3d4dd/image/53ff4eaad475fe8705000185/ , http://photos.3dvisionlive.com/3d4dd/image/53ff4c66d475fe5a6a000122/). There also are some minor issues at water's edge, but they are rare and can be removed by disabling the shader. The major issues could also be prevented by setting DynamicShadows=False and bAllowLightShafts=False in UDKEngine.ini, but that removes effects that are quite essential for these supernal scenarios. So it would be great if we could fix this game :)
As mentioned before I hope that I will learn one day to fix the game by myself. But if someone wants to have a look at it now I would be glad to assist - already hunted some shaders ;)
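For anyone who wants the workaround in the meantime, the two settings mentioned above go in UDKEngine.ini; in most UE3/UDK games they sit under the [SystemSettings] section (verify the section name in your copy of the file):

```
[SystemSettings]
DynamicShadows=False
bAllowLightShafts=False
```

As noted, this trades the broken effects for their complete absence, so it is only a stopgap until the shaders themselves are fixed.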
My original display name is 3d4dd - for some reason Nvidia changed it..?!
[quote="3d4dd"]..."Remember me" had the same issue and Helix could fix it using a lua script. Unfortunately I don't know how to adapt this script to another game...[/quote]
I was wondering myself the other day where I can find the Lua script and how to use it with a UDK engine/game... I looked on the HelixMod blog but couldn't find a download location anywhere...
Disabling the dynamic shadows shouldn't kill the atmosphere that much... but the light shafts, I think, are more important...
I'm also really interested in figuring out the patterns + corrections needed to sort out these types of shaders ;))
[quote="helifax"]I was wondering myself the other day where I can find the Lua script and how to use it with an UDK engine /game... I looked on HelixMod blog but couldn't find any location for a download...[/quote]I'm not 100% sure, but I think one thing you might be looking for is 'ShaderH3dFixer.jar'; it's been present in a lot of the UDK (UE?) fixes (like the one linked above). I haven't tried it myself, but (with a name like that ;)) I think it might fix certain issues... I could be wrong, but it's worth a look anyway. :)
As far as I know, the Lua script was never released as a standalone item, only as a part of a game fix. You should be able to snag a game-fix version of it, and drop it in, and it will run.
The actual code for the script may or may not do what you need, though. The idea behind the script is to automate fixes that we might do manually, repeating the same sequence over and over on different shaders that meet the fixing criteria: basically, ones that match those already fixed by hand. For example, there may be 400 shaders that affect shadows, all with the same 3D glitch. The Lua script will fix all 400 at launch.
By contrast and comparison, Mike does this sort of thing offline by editing the full dump of shader files with an external script that is similar in concept. Then all those shaders are shipped as part of a fix.
Not sure what ShaderH3dFixer.jar does or where it came from, but that's going to be Java code, not Lua. I think it's another offline tool of some form that eqzitara would use. I somehow have a copy of it on my system, and I did a Java decompile on that file. It's basically doing the same process the Lua script does, but offline.
Both of these work the way you might manually fix stuff by doing global searches for strings and patterns that are known bad, and patching them with known fixes.
Just like any automated process though, if your target doesn't quite match, or matches but is not a problem, it will merrily stomp on the code and add weird stuff that may make it worse. When they work, it's great, but you'll need to double check the results.
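The search-and-patch idea can be sketched in a few lines of C++. This is a toy version under stated assumptions: the pattern and fix strings are made up, and the real scripts (Lua or the offline tools) match known-bad multi-line ASM/HLSL sequences rather than a single literal.

```cpp
#include <string>

// Insert fix text after every occurrence of a known-bad pattern in a
// shader's source text. Toy illustration of the automated-patching idea.
std::string patchShader(std::string text,
                        const std::string& badPattern,
                        const std::string& fixLines) {
    std::string::size_type pos = 0;
    while ((pos = text.find(badPattern, pos)) != std::string::npos) {
        pos += badPattern.size();   // the patch goes right after the match
        text.insert(pos, fixLines);
        pos += fixLines.size();     // skip past what we just inserted
    }
    return text;                    // unmatched input comes back unchanged
}
```

The failure mode is exactly what bo3b describes: a shader that happens to contain the pattern but isn't actually broken gets patched anyway, so the results always need a double check.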
BTW, here's a post by eqzitara describing how to use the Lua script for UDK games:
[url]https://forums.geforce.com/default/topic/513190/3d-vision/how-to-fix-disable-shaders-in-games-dll-guide-and-fixes-/post/3797899/#3797899[/url]
For the occasional game, I can't get 3D to enable unless I attach it to a profile (Prototype, for example). Does anyone know of another way I can get the 3D to enable without using a profile?
@DHR: Thank You very much for the hint (once again, AC Liberations...)! Shame on me that I didn't look for the fix on Helixmod blog but only in the forum :(
The lua script method is very interesting...
There is indeed many shaders using the VS DF17CB6C ( in Lost Planet), so this is definitely the reason why I get my hud rendered in 3D + weird borders.
I used the method described in DHR post and the method posted at http://helixmod.blogspot.ca/2012/03/dlls-update.html in order to find the specific texture CRC and apply fix only on that one. after creating a DX9Settings.ini file I've inserted this:
[VSDF17CB6C]
CheckTexCRC = true
Sadly, after trying many times, the only things in my TEXTURESLOG.txt file are BeginScene and EndScene endlessly. I'va also try (from eqzitara):
[VSDF17CB6C]
CheckTexCRC = true
VBOffsetList = 0;
ValForDefined = 0
ValNotDefined = 1
TexCounterReg = 251
UseDefinedOnly = false
DefinedTexturesVS =
[VBDF17CB6C.0]
PointsList = 554;
Still not working...
I wonder if I need something special in the general section of DX9Settings.ini file? I also want to specify that my shader BDF17CB6C is already in shaderoverride folder. What did I miss?
PS: the bCalcTexCRCatStart method is not working apparently for LP, the game crash literally when this line is active.
[MonitorSizeOverride][Global/Base Profile Tweaks][Depth=IPD]
Post the shader with your fix... i wrote before you need to add some lines (if statement) to the VS.
If you ask me, only adding the line "CheckTexCRC = true" to [VSDF17CB6C] should work at least to found the TexCRC related to that VS, but for me happens the same (BeginScene and EndScene endlessly) if i don't add the "if" statement to the VS.
MY WEB
Helix Mod - Making 3D Better
My 3D Screenshot Gallery
Like my fixes? you can donate to Paypal: dhr.donation@gmail.com
The way I do my shader intercept in OpenGL is at the Shader Source Code loading level. (So I can see the full Source Code in C before being compiled & linked and off-loaded to the GPU)
I was wondering if this would be the case in DirectX as well... Or do the directx games come with the shaders already built? I am wondering this because I guess it would be easier to modify code in C rather than ASM.
In OpenGL is like this:
- After you create the OpenGL Context and stuff:
- Create a Shader Program
- Create a Vertex/Pixel Shader
- Load the shader source for the Vertex & Pixel Shaders (C code)
- Compile the Vertex/Pixel shaders -> (You get compiled shader objects)
- Link the Vertex/Pixel shaders to the Shader Program.
- (Optional step) Delete the Vertex/Pixel Shaders - This way you delete from memory the shader source as they are no longer needed and already loaded in the Shader Program.
- Use the Shader Program when needed
Can somebody make a quick explanation(Like I did above for OpenGL) on how the whole DirectX (shader-wise) hangs together?
Thanks ^_^
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com
(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)
In DirectX games, the shaders almost always come already built. This is the Microsoft recommendation, so we generally only have binary shader objects, that are passed in from the game. Sometimes, the game will compile at launch, but they'll still also send the binary shader info in the same way, so it's simpler to just catch the binary to get both types.
I haven't written a game directly, and normally think about this from the perspective of the wrapper on a given game, which is slightly different than how I'd think about it from the game perspective.
Because of that, I just catch the calls that the game makes. So for example:
1) CreateDevice called to set up the rendering environment.
2) Numerous CreatePixelShader, CreateVertexShader calls passing in binary shader code.
3) During rendering, specific shaders returned from those Create calls are activated, in turn.
4) Draw/Endscene/Present call to flip drawn back buffer to show.
So, it's pretty similar to your OpenGL path, but minus the compilation stage. In the 3Dmigoto case, it actually does do the compilation, because it previously Decompiled the binary back into HLSL. In Helix case, it will reassemble the shader from ASM text, because it previously Disassembled the binary. But in both cases, only for shaders that are overridden, not every shader.
Does that give you enough info?
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
Big thanks for all the information;)) I had a feeling this was the case (with the pre-compiled object being shipped with the application).
Don't get me wrong or anything;)) I haven't worked with PS/VS 1,2,3 Assembly language before but I have worked with HLSL and GLSL quite a lot and I was just wondering about the possibility of having the original code and make all the fixes/changes there;)) I expected to be like this since I do the same thing for Surround fixes (modify the code in X86/x64 ASM since the game's source code is not "exactly shipped" either) ;))
Again big thanks for this information;)) Guess I'll have to learn DirectX ASM shader Language more to actually be able to modify some of the shaders ^_^;))
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com
(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)
Plus, like I say in School, you don't really need to know the language, either ASM or HLSL; you just need to recognize some patterns. At least until much, much later.
This game really offers mind-blowing environments and I can't wait to explore them in proper S3D. Good news: it is DX9, uses UR3, has AllowNvidiaStereo3d=True, and has fewer issues than other UR3 games (no issues with bloom, haloes, fog, skybox, etc.). It only has two major issues: dynamic shadows and light shafts (god rays). The dynamic shadows are not completely messed up, but they're not exactly at the right depth (see http://photos.3dvisionlive.com/3d4dd/image/53ff4bedd475fec94a0000c2/). "Remember me" had the same issue and Helix could fix it using a Lua script. Unfortunately I don't know how to adapt this script to another game. The other major issue is the light shafts, which are very prominent in this game. Although the sky is at the correct depth, the sun and the moon are too close (http://photos.3dvisionlive.com/3d4dd/image/53ff4e6fd475fe5a6a000125/). And it seems that their wrong position is used to calculate the light shafts (see http://photos.3dvisionlive.com/3d4dd/image/53ff4eaad475fe8705000185/ , http://photos.3dvisionlive.com/3d4dd/image/53ff4c66d475fe5a6a000122/). There are also some minor issues at the water's edge, but they are rare and can be removed by disabling the shader. The major issues could also be prevented by setting DynamicShadows=False and bAllowLightShafts=False in UDKEngine.ini, but that removes effects that are quite essential for these supernal scenarios. So it would be great if we could fix this game :)
As mentioned before I hope that I will learn one day to fix the game by myself. But if someone wants to have a look at it now I would be glad to assist - already hunted some shaders ;)
My original display name is 3d4dd - for some reason Nvidia changed it..?!
I was wondering myself the other day where I can find the Lua script and how to use it with a UDK engine/game... I looked on the HelixMod blog but couldn't find anywhere to download it...
Disabling the dynamic shadows shouldn't kill the atmosphere that much, but the light shafts, I think, are more important...
I'm also really interested in figuring out the patterns + corrections needed to sort out these types of shaders;))
There is a fix in the blog for that game:
http://helixmod.blogspot.com/2014/08/mind-path-to-thalamus.html
MY WEB
Helix Mod - Making 3D Better
My 3D Screenshot Gallery
Like my fixes? you can donate to Paypal: dhr.donation@gmail.com
[MonitorSizeOverride][Global/Base Profile Tweaks][Depth=IPD]
The actual code for the script may or may not do what you need, though. The idea behind the script is to automate fixes that we might do manually, repeating the same sequence over and over on different shaders that meet the fixing criteria, basically those that match the ones fixed manually. For example, there may be 400 shaders that affect shadows, all with the same 3D glitch. The Lua script will fix all 400 at launch.
By comparison, Mike does this sort of thing offline, editing a full dump of the shader files with an external script that is similar in concept. All of those shaders are then shipped as part of a fix.
Not sure what the SahderH3dFixer.jar does or where it came from, but that's going to be Java code, not Lua. I think it's another offline tool of some form that eqzitara would use. I have a copy of it on my system somehow, and I did a Java decompile on the file. It's basically doing the same process the Lua script does, but offline.
Both of these work the way you might manually fix stuff by doing global searches for strings and patterns that are known bad, and patching them with known fixes.
Just like any automated process though, if your target doesn't quite match, or matches but is not a problem, it will merrily stomp on the code and add weird stuff that may make it worse. When they work, it's great, but you'll need to double check the results.
https://forums.geforce.com/default/topic/513190/3d-vision/how-to-fix-disable-shaders-in-games-dll-guide-and-fixes-/post/3797899/#3797899
Dual boot Win 7 x64 & Win 10 (1809) | Geforce Drivers 417.35
The lua script method is very interesting...