I already told helifax on Steam, but the best method (I think bo3b was the one who told me, for Lords of the Fallen) is:
[code]float4 res = StereoParams.Load(int3(2,0,0));[/code]
And then using res.x and res.y. No shader override needed.
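For anyone who wants to see it in context, here's a minimal sketch of a full pixel shader using those values (t125 is 3DMigoto's usual register for StereoParams; the output is just illustrative):
[code]
// Sketch: normalize the pixel position using the resolution that the
// driver reports in element 2 of StereoParams, as described above.
Texture1D<float4> StereoParams : register(t125);

void main(float4 pos : SV_Position0, out float4 o0 : SV_Target0)
{
    float4 res = StereoParams.Load(int3(2, 0, 0));
    float2 uv = pos.xy / res.xy; // res.x = width, res.y = height
    o0 = float4(uv, 0, 1);      // e.g. visualize the UVs
}
[/code]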
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
[quote="mx-2"]Do you think it would be easily possible to port this feature to some kind of standalone application for use with DX9? This could solve the problems with profile installation once and for all.[/quote]At some point I want to revive the DX9 version of 3DMigoto with the goal of creating a DLL that can be chainloaded with Helix Mod to complement it in areas where it is lacking, and this feature would be one of the easiest to port since the code doesn't have any DX11 dependencies. So, the short answer is yes it is possible, but I can't give you a timeframe (basically, when I'm in the mood, have enough time to devote to it and nothing else distracts me ;-).
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
In most cases you can use either method to get the resolution. The subtle difference is that StereoParams gets the values from the driver, while res_width/height get them from the game. If you find they don't seem to match what the game is set to (CryEngine 3 always renders at the native resolution no matter what it is set to and upscales as needed, and some UE4 games also have the option to render at a different resolution from the display), try changing get_resolution_from to depth_buffer and use res_width/height in a ShaderOverride section.
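(A minimal d3dx.ini sketch of that; going from memory, get_resolution_from sits in the [Device] section of the stock d3dx.ini, and the hash and param names below are hypothetical:)
[code]
; Take the resolution from the game's depth buffer rather than the driver.
; Section placement is my recollection of the stock d3dx.ini - check yours.
[Device]
get_resolution_from = depth_buffer

; Then pass it to a shader of your choice (the hash is a placeholder):
[ShaderOverrideExampleHUD]
hash = 0123456789abcdef
x1 = res_width
y1 = res_height
[/code]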
Another trick is to use .GetDimensions(width, height) on a texture passed to the shader that you know matches the resolution.
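(And a sketch of the GetDimensions trick; which slot holds a screen-sized texture depends entirely on the game, so t0 below is just a placeholder:)
[code]
// Sketch: ask a bound texture for its size. Only useful if that texture
// is known to match the screen resolution; t0 is a placeholder slot.
Texture2D<float4> t0 : register(t0);

void main(float4 pos : SV_Position0, out float4 o0 : SV_Target0)
{
    float width, height;
    t0.GetDimensions(width, height);
    o0 = float4(pos.xy / float2(width, height), 0, 1);
}
[/code]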
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
Hello again,
I tried to install the 3DMigoto 1.2.51 libraries into Steam\steamapps\common\SteamVR\bin\win64, since it seems they are not loaded by DCS in VR.
The 3DMigoto lib is indeed loaded by SteamVR: the file d3d11_log.txt is created (but empty) and the "ShaderCache" directory is created. But SteamVR itself stops working: I get a popup saying that "a key component of SteamVR isn't working properly".
I tried to chain 3DMigoto with all the libs I found in the SteamVR/bin directory, with the option "hook=recommended", but had no success.
Does anyone have an idea which option to try or which library to target? There are 7 libs other than the VS redist, and a lot of chain options, so too many things to try...
This might be of interest to some people - I've just released a new script in my 3d-fixes repository called "generic_shader_extractor.py".
It can extract DX11 shaders from many games, and should work with all DX11 UE3.5 and UE4 games. It will *not* work with Unity or CryEngine games, and may work with other game engines (including arbitrary custom engines) provided that they do not compress, encrypt or otherwise obfuscate their shaders in the game files.
It extracts the shader binary and uses cmd_Decompiler (which is available from the 3DMigoto releases page and should be extracted to the 3d-fixes directory) to decompile and disassemble the shaders. It supports the same three shader hash types as 3DMigoto (the bytecode hash requires the crcmod package to be installed - the script tells you the commands to install it if it is missing).
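(For reference, installing crcmod is normally just the standard pip command below, though the script prints the exact command it expects:)
[code]
pip install crcmod
[/code]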
It won't give you any more detail than dumping shaders with 3DMigoto will, but it can be useful for games that dump their shaders on demand (e.g. RotTR) or level by level (many UE games), or games that include GPU specific shaders that may not dump at all with 3DMigoto on a given system (e.g. Arkham Knight).
Please note that depending on the game the script can take a long time, and DX9 shaders are not supported yet.
@DJ-RK - this might be of use to you to see if KF2 includes any shaders that 3DMigoto didn't dump on your system.
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
Hello again,
Stupid me... I finally found how to get the 3DMigoto lib loaded by SteamVR for DCS. I just had to put the mod into the DCS\bin dir [u]and also[/u] into the Steam\steamapps\common\SteamVR\bin\win64 dir.
It lets me change the brightness and add a zoom effect based on a pixel shader (with the supersampling it really makes sense). Maybe this tip will help someone...
[quote="Jan-Itor"]Any of you wizards wanna help me fix this shader? :)
[/quote]
What game? What does the shader do? What's wrong with it? Some screenshots?
And you are using an old 3Dmigoto version, by the way (or you dumped the shader long ago).
[quote="masterotaku"][quote="Jan-Itor"]Any of you wizards wanna help me fix this shader? :)
[/quote]
What game? What does the shader do? What's wrong with it? Some screenshots?
And you are using an old 3Dmigoto version, by the way (or you dumped the shader long ago).[/quote]
Oh yeah, sorry: it's the HUD in The Division, and I want it to toggle on/off. I used version 1.2.43 to export it, but the same thing happens with any version: it plays the "boop" sound when I export it.
1080 Ti - i7 5820k - 16Gb RAM - Win 10 version 1607 - ASUS VG236H (1920x1080@120Hz)
Check the "d3d11.log" file and see what errors it gives.
Did you try dumping the vertex shader instead of the pixel shader, in case it doesn't give any errors?
For a HUD on/off toggle, you need something like this (if it's in a pixel shader):
[code]
float4 iniparams = IniParams.Load(0);
if (iniparams.x == 1)
{
    discard; // skip drawing this HUD pixel while the toggle is active
}
[/code]
That's one approach (there are others: in a pixel shader you can instead set the output "o0" to 0 under that condition, after its value has been computed by the shader). In this case you also need the constant "x=0" in "d3dx.ini", and then a hotkey in the hotkeys section:
[code]
[Key1]
Key = F2
type = toggle
x = 1
[/code]
You can take my Skyrim Special Edition fix as an example, although I used a weird formula for the HUD (for the curvature stuff). So look at other toggle hotkeys I made.
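(And a minimal sketch of the "set o0 to 0" alternative mentioned above, assuming the shader's output is o0 and IniParams sits at 3DMigoto's default t120 register:)
[code]
// Sketch: instead of discarding, zero the final output so the HUD pixel
// draws fully transparent/black once its value has been computed.
Texture1D<float4> IniParams : register(t120);

void main(float4 pos : SV_Position0, out float4 o0 : SV_Target0)
{
    o0 = float4(1, 1, 1, 1); // stand-in for the shader's normal output

    float4 iniparams = IniParams.Load(0);
    if (iniparams.x == 1)
        o0 = 0; // wipe the HUD pixel while the toggle is active
}
[/code]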
Check the "d3d11.log" file and see what errors it gives.
Did you try dumping the vertex shader instead of the pixel shader, in case it doesn't give any errors?
For an HUD on/off toggle, you need this (if in a pixel shader):
float4 iniparams = IniParams.Load(0);
if (iniparams.x==1)
{
discard;
}
For example (there are other ways. If it's a pixel shader, you have to make the output "o0" to be 0 in that condition, for example, after its value has been defined by the shader). And you need in this case in "d3dx.ini" the constant "x=0", and then a hotkey in the hotkeys section:
[Key1]
Key = F2
type = toggle
x = 1
You can take my Skyrim Special Edition fix as an example, although I used a weird formula for the HUD (for the curvature stuff). So look at other toggle hotkeys I made.
[quote="lefuneste"]Hello again,
Stupid Me...I finally found the way to make the 3Dmogoto lib being called by SteamVR for DCS. I just had to put the mod into DCS\bin dir [u]and also[/u] into Steam\steamapps\common\SteamVR\bin\win64 dir.
It allows me to change the brightness and add a zoom effect based on a pixel shader (with the SuperSampling it really makes sense). Maybe this tip will help someone...[/quote]
Thanks for letting us know. Did not occur to me that the VR launching mechanism would change the active directory.
lefuneste said:Hello again,
Stupid Me...I finally found the way to make the 3Dmogoto lib being called by SteamVR for DCS. I just had to put the mod into DCS\bin dir and also into Steam\steamapps\common\SteamVR\bin\win64 dir.
It allows me to change the brightness and add a zoom effect based on a pixel shader (with the SuperSampling it really makes sense). Maybe this tip will help someone...
Thanks for letting us know. Did not occur to me that the VR launching mechanism would change the active directory.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
[quote="masterotaku"]Check the "d3d11.log" file and see what errors it gives.
Did you try dumping the vertex shader instead of the pixel shader, in case it doesn't give any errors?
[...][/quote]
I know how to make a toggle, but this shader won't toggle, so I assume there's something wrong in the code?
Not sure what I should be looking for in the d3d11.log, but I found this:
[i]Tom Clancy's The Division\ShaderFixes\fc112a500bae3ab7-ps_replace.txt(62,23-24): error X3003: redefinition of 'w2'[/i]
Exporting the vertex shader I also get the boop sound; here it is:
[code]
// ---- Created with 3Dmigoto v1.2.51 on Fri Dec 16 07:22:54 2016
cbuffer cb1 : register(b1)
{
    float4 cb1[1];
}
[/code]
[quote="Jan-Itor"]I know how to make a toggle, but this shader wont toggle so I assume there's something wrong in the code?
Not sure what I should be looking for in the d3d11.log, but I found this:
[i]Tom Clancy's The Division\ShaderFixes\fc112a500bae3ab7-ps_replace.txt(62,23-24): error X3003: redefinition of 'w2'[/i][/quote]
For this one, it's exactly what it says: w2 is redefined. In the shader main declaration section:
[code]...
nointerpolation float w2 : IO4_UI_UserSmall0,
nointerpolation int w2 : IO6_UI_RenderType0,
[/code]
Change the first one to x2, and change where it is used in the shader (the first one is only used once, so it's easier to fix):
[code] r2.x = saturate(x2.x * 2 + cb5[0].y); // r2.x = saturate(w2.x * 2 + cb5[0].y);
[/code]
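For clarity, the declaration section after the rename would then read like this (a sketch based on the snippet above):
[code]...
nointerpolation float x2 : IO4_UI_UserSmall0, // renamed from w2
nointerpolation int w2 : IO6_UI_RenderType0,
[/code]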
That will allow it to compile with no errors, and give the high beep.
If you want to fix the vertex shader, that one is harder, because it needs some hand-fixed code. I'm happy to do that, but only want to spend the time if you need to use it.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
[quote="bo3b"]
For this one, it's what it says, w2 is redefined. In the shader main declaration section:
[code]...
nointerpolation float w2 : IO4_UI_UserSmall0,
nointerpolation int w2 : IO6_UI_RenderType0,
[/code]
Change the first one to x2, and change where it is used in the shader. (first one is only used once, easier to fix)
[code] r2.x = saturate(x2.x * 2 + cb5[0].y); // r2.x = saturate(w2.x * 2 + cb5[0].y);
[/code]
That will allow it to compile with no errors, and give high beep.
[/quote]
This worked, thanks a ton! No need to fix the vertex shader.