[quote="helifax"]Hi DarkStarSword,
I am trying to use the DX11 Unity scripts and I am hitting some walls:(
[code]
unity_asset_extractor.py *_Data/Resources/* *_Data/*.assets # THIS WORKS
cd extracted # THIS WORKS
extract_unity_shaders.py */*.shader --type=d3d11 # THIS SEEMS TO WORK
cd ShaderFNVs # FAILED
[/code]
After the above scripts run, the cd into the "ShaderFNVs" folder fails because it is missing. I did copy cmd_Decompiler.exe to both the "3d-fixes" and game folders, and from what I can see it should work?
Am I doing something wrong? Thank you in advance![/quote]
Open the command prompt in the game directory and try typing this:
[code]extract_unity_shaders.py extracted/*/*.shader --type=d3d11[/code]
[quote="helifax"]Yupp, both autofix.sh and autofix53.sh fail there...
I updated the specified files here: [url=http://3dsurroundgaming.com/3DVision/insideData_unity.rar]Assets Files[/url]
Thank you again for taking the time and looking into this![/quote]
:facepalm:
This was a regression I introduced when adding support for Unity 5.3 that broke all older versions, but I'm not sure how I didn't catch this before... I guess I've only been working on 5.3 games since then...
Anyway, all fixed now - update the scripts and try again.
[quote="masterotaku"]Am I mixing syntax in the ini with things like "Preset1" and "PRES3"?[/quote]Yeah, [Preset*] are for old versions, [PRES*] are for new versions - see this page:
http://wiki.bo3b.net/index.php?title=HelixMod_Feature_List
[quote="DarkStarSword"][quote="helifax"]Yupp, both autofix.sh and autofix53.sh fail there...
I updated the specified files here: [url=http://3dsurroundgaming.com/3DVision/insideData_unity.rar]Assets Files[/url]
Thank you again for taking the time and looking into this![/quote]
:facepalm:
This was a regression I introduced when adding support for Unity 5.3 that broke all older versions, but I'm not sure how I didn't catch this before... I guess I've only been working on 5.3 games since then...
Anyway, all fixed now - update the scripts and try again.[/quote]
Thanks DSS! I'll check it out later today and report back;)
Thank you for looking into this!
[quote="masterotaku"]Thanks, I got the convergence presets to work with the latest DLL (April 2014). The constants still don't work, though.
DX9Settings.ini:
In the shaders, somehow the constant equals 0. Are my presets 3 to 6 correct? 5 and 6 are for this shader (C5AB437F.txt):[/quote]
You need to use "Const2 = 0x3f800000" for 1.0 - those values are (annoyingly) hexadecimal representations of 32bit floats. If that doesn't work, try using a lower constant register, as some games have issues (not fully understood) with high register numbers.
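For reference, you can do the conversion locally too. Here is a minimal Python sketch (helper names are just illustrative) that reinterprets between a float and its 32-bit hex pattern:
[code]
import struct

def float_to_hex(f):
    """Return the IEEE 754 single-precision bit pattern of f, as hex."""
    return hex(struct.unpack('<I', struct.pack('<f', f))[0])

def hex_to_float(h):
    """Inverse: interpret a 32-bit hex pattern as a float."""
    return struct.unpack('<f', struct.pack('<I', int(h, 16)))[0]

print(float_to_hex(1.0))           # 0x3f800000
print(hex_to_float('0x3f800000'))  # 1.0
[/code]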
[quote="DarkStarSword"]You need to use "Const2 = 0x3f800000" for 1.0 - those values are (annoyingly) hexadecimal representations of 32bit floats.[/quote]
Thank you! That worked :). I'll use the converter from now on (http://gregstoll.dyndns.org/~gregstoll/floattohex/).
Everything clear (at least until I try changing HUD depth and other things I haven't tried yet in DX9).
The new scripts seem to work;) I had to "dos2unix" the "*.sh" files though, as they had lost their UNIX line endings.
Everything seems to have worked fine;)
However, I get some weird failed shaders.
An example:
[code]
uniform float4x4 _Object2World;
uniform float4x4 _World2Object;
uniform float4 unity_LODFade; // x is the fade value ranging within [0,1]. y is x quantized into 16 levels
uniform float4 unity_WorldTransformParams; // w is usually 1.0, or -1.0 for odd-negative scale transforms
}
/****************************** COMPILE ERRORS ******************************
D:\Temp2\steamapps\common\INSIDE\ShaderFixes\0b98b564f7fe945d-ps_replace.txt(135,9-10): error X3003: redefinition of 'w1'
compilation failed; no code produced
****************************** COMPILE ERRORS ******************************/
[/code]
I see that ".w1" is defined 2 times. I checked this shader with the raw extracted one from "extracted" folder and indeed "w1" is defined 2 times there.
Is there a possibility the shader got dumped wrong?
Is there any way to know how to fix it manually (other than obvious trial and error)?
Other shaders fail because the subscript .yzw is out of range: the variable is declared as float rather than float4.
Thank you in advance!
[quote]I see that ".w1" is defined 2 times. I checked this shader with the raw extracted one from "extracted" folder and indeed "w1" is defined 2 times there.[/quote]
That will be a Decompiler bug that generated the duplicate w1. Without the ASM I can't say for sure how to fix it, but you'll want to just manually change the second one to a different variable like x1. Check the ASM to determine which one to change.
This happens because the fxc compiler aggressively packs all variables into single inputs, even though there are plenty of available inputs and there is no performance penalty for unpacking them.
However, I can likely fix that problem once I get back to working on the Decompiler.
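As a rough sketch (assuming the decompiled HLSL lives in plain text files like the ones above), a hypothetical helper along these lines would flag duplicate declarations before compiling; the regex only covers simple scalar/vector declarations:
[code]
import re
import sys
from collections import Counter

# Find local declarations like "float4 w1;" or "float w1 = ..." and
# report any variable name that is declared more than once.
decl = re.compile(r'^\s*(?:float|int|uint|bool)[1-4]?\s+([A-Za-z_]\w*)\s*[;=,]',
                  re.MULTILINE)

source = open(sys.argv[1]).read()
dupes = sorted(n for n, c in Counter(decl.findall(source)).items() if c > 1)
print('Duplicate declarations:', ', '.join(dupes) if dupes else 'none')
[/code]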
Big thanks for that input!
For this shader:
[code]
compilation failed; no code produced
****************************** COMPILE ERRORS ******************************/
[/code]
I don't know about this one though... Should I keep o3 as float3 and correct line 152, or should I make it a float4?
I've been intending to switch the DX11 script over to use asm (now that I have that swanky asmtool and I've made it as simple as possible to write new DX11 asm patterns), but haven't quite got around to that yet. Mostly those errors will be ok to ignore - the script patches a lot of shaders that are not necessarily broken (e.g. it might move a reflection from surface depth to correct depth, but leaving it at surface depth would have been acceptable in most cases), and if any fail to compile it just names them ~failed to denote this.
There is a possibility of rendering issues caused by decompiler bugs that still manage to compile, and with the script patching so many shaders the probability of these increases. For now you need to manually identify effects broken in this way and remove (or hand fix) the patched shader, but once I switch over to asm this problem should go away (there will still be the occasional effect that my script legitimately breaks, but the goal is to minimise the amount of manual effort required).
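A quick way to review those afterwards - a hypothetical snippet (the ShaderFixes location is an assumption based on the paths above; the ~failed naming comes from the description above) that lists everything the script marked as failed:
[code]
# List shader files the autofix script renamed with a "~failed" marker,
# so they can be reviewed, hand-fixed, or removed.
from pathlib import Path

for f in sorted(Path('ShaderFixes').glob('*~failed*')):
    print(f)
[/code]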
I will test it out and if required I will try to manually patch/unpatch the shaders needed!
Big thank you again!
[quote="helifax"]compilation failed; no code produced
****************************** COMPILE ERRORS ******************************/
I don't know about this one though... Should I keep o3 as float3 and correct line 152 or should I make it a float4?[/quote]
For this one, you'd want to break the instruction into two pieces, so:
[code]
o3.xyz = r2.xyz;
p3.w = r2.w;
[/code]
We can't change the input signature, because the data flows from this VS to the PS, and unless we change all PS usages of the inputs, we'd get a data mismatch. It's easiest to keep the input/output signatures unchanged as much as possible.
No idea why fxc decided to break those up, probably has to do with how it's used in the PS.
Well, I just discovered that there are two alternate IDs for some of the profile settings, which I've listed here:
[url]http://wiki.bo3b.net/index.php?title=Driver_Profile_Settings#Notes_on_settings_with_multiple_IDs[/url]
Interestingly, it looks like a number of built-in profiles (e.g. Biohazard 5, Civ5, RE5, SR3) ship with both the primary setting ID and one of the two possible alternate IDs. In this case, I believe the alternate ID will take precedence over the primary ID, *but* only one of the two possible alternate IDs can be used for a given game, and I haven't quite worked out how to determine which one it will be.
I've updated my [url=https://raw.githubusercontent.com/DarkStarSword/3d-fixes/master/CustomSettingNames_en-EN.xml]CustomSettingNames_en-EN.xml[/url] with these additional IDs (note that this file is for the old version of NVIDIA Inspector, though it will work with the new one at a pinch by removing the _en-EN from the filename; I'm working on polishing up a version to send upstream).
Also, I am now getting quite convinced that the internal settings flag changes the encoding scheme of all the settings, but I don't know how yet. My reasoning is that StereoConvergence is clearly a floating point value for user profiles, but apparently garbage for built in profiles. Likewise, all the string properties that don't decode properly are the ones with the internal settings flag.
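To illustrate the user-profile case, a minimal sketch of treating a profile DWORD as a float, which is what StereoConvergence appears to be for user profiles (the sample value is just illustrative):
[code]
import struct

def dword_to_float(dword):
    # Reinterpret a 32-bit profile setting value as an IEEE 754 float.
    return struct.unpack('<f', struct.pack('<I', dword & 0xFFFFFFFF))[0]

print(dword_to_float(0x40A00000))  # 5.0
[/code]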
[quote="DarkStarSword"]Also, I am now getting quite convinced that the internal settings flag changes the encoding scheme of all the settings, but I don't know how yet. My reasoning is that StereoConvergence is clearly a floating point value for user profiles, but apparently garbage for built in profiles. Likewise, all the string properties that don't decode properly are the ones with the internal settings flag.[/quote]I've worked out something pretty significant about the internal fields that explains *A LOT* of mysteries about the built in profiles, like why their values are so strange and don't match our documentation, and why NVIDIA Inspector can't be used to edit these without damaging them. I'm still missing a pretty key piece of information that I need to decode arbitrary settings...