[quote="DJ-RK"]Question. I just had to redo my KillingFloor 2 fix since one of the updates broke the previous one entirely. This time I decided to try using bytecode shader hashes, since my interpretation of the description is that it might possible future proof a fix from being broken by updates (guess my first question is whether this is a correct assumption or not?).[/quote]
The bytecode hash is a good choice if you see any debug info in the assembly version of the shaders, which looks like this:
[code]
...
dcl_output_siv o2.xyzw, position
dcl_temps 4
#line 62 "d:\work\master\git_clone\_intermediate\win64\GrModelShaders_dx11_win64\Release\NewTerrain_Shadow_ln_vs.hlsl"
mov r0.xy, v0.xyxx // NTerrainTransformInput_transformInput_inPosition<0,1>
nop
mov r0.xy, r0.xyxx // inPosition<0,1>
#line 216
mul r0.xy, r0.xyxx, cb5[9].xyxx
...
[/code]
The debug info contains the path on the developer's machine, so it can change if the developer compiles the game from a different location. e.g. the 3DMigoto hash of the above shader from MGSVTPP was 25524bfa54a9562c, but the hash of this identical shader in MGO:
[code]
...
dcl_output_siv o2.xyzw, position
dcl_temps 4
#line 62 "d:\jenkins\workspace\game_steam\_intermediate\win64\GrModelShaders_dx11_win64\Release\NewTerrain_Depth_vs.hlsl"
mov r0.xy, v0.xyxx // NTerrainTransformInput_transformInput_inPosition<0,1>
nop
mov r0.xy, r0.xyxx // inPosition<0,1>
#line 216
mul r0.xy, r0.xyxx, cb5[9].xyxx
...
[/code]
was 02f65f6b2cc30e62. Using the bytecode hash gave f6ddc313 for both (this isn't one of the fixed shaders, I just grabbed it as an example of the reason the 3dmigoto and embedded hashes can change).
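To illustrate why that works: a compiled shader is a DXBC container made of chunks, and the debug info (those #line paths) lives in its own chunk, separate from the actual bytecode, so a hash computed over just the bytecode chunk ignores the paths entirely. Something along these lines would do it - an untested Python sketch for illustration only, not the code 3DMigoto actually uses (I'm using plain CRC32 here just to keep it short):
[code]
import struct, sys, zlib

def bytecode_region_hash(blob):
    # DXBC header: "DXBC" magic, 16 byte checksum, version dword,
    # total size dword (offset 24), chunk count dword (offset 28),
    # then one dword offset per chunk.
    assert blob[:4] == b'DXBC'
    chunk_count = struct.unpack_from('<I', blob, 28)[0]
    offsets = struct.unpack_from('<%dI' % chunk_count, blob, 32)
    crc = 0
    for off in offsets:
        fourcc = blob[off:off+4]
        size = struct.unpack_from('<I', blob, off+4)[0]
        # Only hash the shader bytecode chunks (SM4 = SHDR, SM5 = SHEX),
        # skipping SDBG (debug info), STAT, signatures, etc.
        if fourcc in (b'SHDR', b'SHEX'):
            crc = zlib.crc32(blob[off+8:off+8+size], crc)
    return crc & 0xffffffff

if __name__ == '__main__':
    for path in sys.argv[1:]:
        with open(path, 'rb') as f:
            print('%08x %s' % (bytecode_region_hash(f.read()), path))
[/code]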
BTW as a side note - I am fucking amazed at how smoothly MGSVTPP runs given they compiled their shaders with debug info, which normally suggests that they were not compiled for speed - yet, that game outperforms every other modern AAA game!
[quote]I did a full HLSL dump (as far as I can tell) and then fixed a whole bunch of shaders using an offline approach. Got the game entirely fixed, and released the fix. Since then, tehace reported in the thread for the game that shadows were broken for him. He actually managed to hunt for and find what appears to be the correct shader. The strange thing is that the hash for the shader does not match any of the shaders that were dumped for me. Is this a potential quirk or bug of using bytecode? Is it possible for different configurations to generate different hashes for the same shaders? As far as I'm aware, we were both using the same settings: 1080P, all different levels of shadows settings, etc, so I'm at a complete loss for why I didn't have that shader hash generated at all.[/quote]
Did the hash he had still look like a bytecode hash, i.e. 000000000xxxxxxxx? See if you can get him to send us a copy of the shader - if possible, the binary version (use the undocumented export_binary=1 option), since that is the best option: we can check all the hashes it should generate and see exactly what the difference was. Also, if there was an update, make sure you are both on the same version / Steam branch, just to rule that out.
Remind me... Killing Floor 2 - that was UE right? Do you know if it is UE3.5 or UE4? One of the problems we found in Arkham Knight (UE3.5) was that there were *GPU specific shaders* - my laptop dumped completely different lighting shaders compared to Mike's desktop! When I extracted the names from the game files I found that they were called things like "FTileShadingComputeShaderTSCS_Wetness00NVOPS_KeplerOptimised". Unfortunately, you may have seen my rant in the ABZU thread - writing a script to extract all shaders from any UE3/4 game like I do for Unity games has ... significant challenges. I might be able to come up with a more generic script that can extract DX11 shaders from almost any game (but wouldn't extract the names or other info), which might give us an alternate means to dump all shaders that is unaffected by GPU differences or by games that only load their shaders level by level.
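For reference, the "generic" approach I have in mind is really just scanning every file for the DXBC container magic and carving each blob out using the size field in its header. A rough, untested Python sketch of the idea (it would miss anything stored compressed inside package files, and gives you no names or other info):
[code]
import os, struct, sys

def carve_dxbc(path, out_dir):
    # Scan a single file for embedded DXBC containers and dump each one
    with open(path, 'rb') as f:
        data = f.read()
    pos = 0
    while True:
        pos = data.find(b'DXBC', pos)
        if pos == -1:
            break
        if pos + 28 <= len(data):
            # Total container size is the dword at offset 24 of the header
            size = struct.unpack_from('<I', data, pos + 24)[0]
            if 32 <= size <= len(data) - pos:
                name = '%s_%08x.bin' % (os.path.basename(path), pos)
                with open(os.path.join(out_dir, name), 'wb') as out:
                    out.write(data[pos:pos + size])
        pos += 4

if __name__ == '__main__':
    out_dir = 'extracted_shaders'
    os.makedirs(out_dir, exist_ok=True)
    for root, dirs, files in os.walk(sys.argv[1]):
        for name in files:
            carve_dxbc(os.path.join(root, name), out_dir)
[/code]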
It's also worth checking what CPUs you both have - that shouldn't matter, but the bytecode hash uses hardware acceleration provided by the CPU, so a hardware bug in the CPU could theoretically change the result.
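If you wanted to rule that out, you could compare against a pure software implementation - assuming the hash in question is a CRC32C (the variant the SSE 4.2 instruction accelerates; treat that as my assumption, not gospel), a slow bitwise reference like this untested sketch is completely independent of the CPU instruction:
[code]
def crc32c_sw(data, crc=0):
    # Bitwise CRC-32C (Castagnoli, reflected polynomial 0x82F63B78) -
    # a pure software reference to compare against a hardware accelerated
    # (SSE 4.2) result if you ever suspect the CPU itself.
    crc ^= 0xFFFFFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ (0x82F63B78 if crc & 1 else 0)
    return crc ^ 0xFFFFFFFF

# Sanity check: the standard CRC-32C check value
assert crc32c_sw(b'123456789') == 0xE3069283
[/code]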
Hmmm, some interesting details.
Yes, the hash was still in the bytecode format, and sure, I'll ask him to send me the binary version of the shader.
And yes, KF2 is on UE3.5. I checked his shader and didn't see anything that looked odd or GPU-specific. Based on his sig we both appear to have the same GPU model (980 Ti), but his is an EVGA and mine is a Gigabyte. Different CPU generations, though.
Overall, I found I don't really like working with bytecode hashes (i.e. if I do a frame analysis dump and then want to find the shader in the shader cache, it's tricky to tell whether it's 8, 9, or maybe even 10 zeros lined up in a row), so from what I gathered in your response, it's probably best to only use them as needed rather than as a proactive approach.
In theory we could improve it to only use 8 characters in the filenames when the bytecode hash is in use, but there are a lot of places internally we'd need to change, and it's not quite as simple as when we changed the texture hash to crc32c, since we still need to display 16 characters for the other hash types.
The embedded hash, on the other hand, has one significant advantage: since it is present in the original shader binary, it can be used to locate the binary shader in the game files, and in the case of UE3/4 it is possible to see some plain text strings nearby that tell you its name and potentially some info about the bound resources (it's also the fastest of the three, but all the hashes are fast, so we don't really care about saving a fraction of a second at startup) :-)
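In practice that just means taking the 16 byte checksum the compiler embeds right after the "DXBC" magic (which is where the embedded hash comes from) and searching the game files for those exact bytes - roughly like this untested sketch:
[code]
import os, sys

def find_shader_in_game_files(shader_bin, game_dir):
    # Read the 16 byte embedded checksum from the shader's DXBC header
    # (bytes 4..20, right after the "DXBC" magic)
    with open(shader_bin, 'rb') as f:
        header = f.read(20)
    assert header[:4] == b'DXBC'
    needle = header[4:20]
    # Search every game file for those bytes to locate the original shader
    for root, dirs, files in os.walk(game_dir):
        for name in files:
            path = os.path.join(root, name)
            with open(path, 'rb') as f:
                data = f.read()
            off = data.find(needle)
            if off != -1:
                print('%s @ 0x%x' % (path, off))

if __name__ == '__main__':
    find_shader_in_game_files(sys.argv[1], sys.argv[2])
[/code]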
Either way, IMO it's a good idea to keep a copy of the ShaderCache folder created with export_binary=1 around, so you can use one of my rename2*.sh scripts in case you need to change the hash type later (it is still a hassle, but not as much as redoing the whole thing from scratch).
[quote="bo3b"]
...
A test case that worked is running GunJack (Unreal 4 engine) using SteamVR.
...
So, this successful test case demonstrates with assurance that there is no fundamental reason why 3Dmigoto cannot run against VR games, even with 3D Vision disabled.
[/quote]
Thanks for spending some time on my request. The weird thing is that 3Dmigoto with DCS in VR has worked for days on my config, and it is still working for another guy...
Anyway, I tried with WarThunder, which is also free (you just have to launch "aces.exe" in the win64 folder directly, otherwise the standard launcher will clean out its install): same problem as DCS! OK in non-VR mode, broken in VR mode...
So I don't know if it is linked to the use of OSVR in addition to SteamVR, or if this game is also a problem for 3Dmigoto in VR...
[quote="lefuneste"][quote="bo3b"]
...
A test case that worked is running GunJack (Unreal 4 engine) using SteamVR.
...
So, this successful test case demonstrates with assurance that there is no fundamental reason why 3Dmigoto cannot run against VR games, even with 3D Vision disabled.[/quote]Thanks for spending some time on my request. The weird thing is that 3Dmigoto with DCS in VR has worked for days on my config, and it is still working for another guy...
Anyway, I tried with WarThunder, which is also free (you just have to launch "aces.exe" in the win64 folder directly, otherwise the standard launcher will clean out its install): same problem as DCS! OK in non-VR mode, broken in VR mode...
So I don't know if it is linked to the use of OSVR in addition to SteamVR, or if this game is also a problem for 3Dmigoto in VR...[/quote]
Yes, sorry, no idea what happens there, it's some game-specific problem or incompatibility. I could not find a combination that would allow DCS to launch in VR with 3Dmigoto enabled.
Most likely this is one of those force-fed game updates that changed how they interact with other tools. I wouldn't be surprised if other things like Fraps, Afterburner, or other overlays also cause problems.
@bo3b/DarkStarSword
Could you add a setting in a future 3Dmigoto version to make a game not start with 3D enabled by default? I know about the "HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\NVIDIA Corporation\Global\Stereo3D\StereoDefaultOn" regedit setting, but I want it just for one game.
I'm going to release a Star Wars Rogue Squadron "fix". It will be just some keyboard/mouse/controller convergence and separation hotkeys and instructions on how to properly use 3D with this game. What happens is that if 3D is enabled and then a polygonal level loads after a loading screen, some surfaces have broken textures (similar to using "surface_createmode=1"). If I enable 3D after the level has loaded, 3D is perfect, and it's really nice to play using the cockpit view. So having a setting to start the game in 2D by default would help me not forget to turn 3D off first :p.
If you can't, or don't want to (other things surely have a higher priority for you, like making Dishonored 2 work with 3Dmigoto), don't worry. It's just an additional button press at the start.
Hi,
I was wondering, is there a way to specify a default convergence?:) in the ini file?
Maybe the new [Profile] can do this?:)
How about the default separation value?
Thank you in advance!
[quote="helifax"]Hi,
I was wondering, is there a way to specify a default convergence?:) in the ini file?
Maybe the new [Profile] can do this?:)[/quote]
Yes, it's in the ini. The third commented setting in the "[Profile]" section.
";StereoConvergence = 0.5"
I don't see separation there. It's probably a global setting in the drivers, right? I always use max depth anyway, unless I make a hotkey for 0 separation in some of my fixes, but that's for rare uses.
Yep, like masterotaku said, use StereoConvergence to set the initial convergence. 3DMigoto will allow it to differ provided that the user has set it and not the driver, but if it needs to update the profile for any other reason it will set it to whatever you specify in the d3dx.ini. The idea is to allow the user to save their custom convergence and respect it by default, while still providing us a way to set it when the fix is first installed or updated.
There is a setting called StereoSeparation, but AFAICT it doesn't seem to do anything. It is listed in the AngleLib profile (set to 2), so maybe it does something, but certainly doesn't seem to set the separation :-/
We had always planned to add separation & convergence to the constants section to set initial values, but we ran into issues where we couldn't reliably set them depending on how the game started up, so they've been left to keyboard shortcuts ever since. We might be able to do better nowadays since we have DXGI hooked and can tell when the game switches graphics modes, etc. so we might revisit that.
That said, unless you have a good reason to override the separation (like the fix depends on it being a specific value) that is probably better left to the user to decide.
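For reference, a minimal [Profile] section that just sets the initial convergence would look something like this (the value itself is only an example - the section accepts other driver settings by name or hex ID, as the ini comments describe):
[code]
[Profile]
; Initial convergence applied when the fix is installed or updated.
; A convergence value the user saved themselves will still be respected:
StereoConvergence = 0.5
[/code]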
Thank you both;)
Now I have one more question;)
Is it possible to have access to my resolution W and H in a shader? (or the aspect ratio) ?
I know it used to work, but I can't find it anywhere...
I can't remember how I had to load the StereoParams :(
[quote="DarkStarSword"]
[...]
This release brings a long awaited feature to 3DMigoto - the ability to write the driver profiles directly without needing external tools. There is a new [Profile] section in the d3dx.ini, which you can fill out with whatever settings you need. The ini file has documentation for a number of common settings that people often need, but it will accept any setting known to the driver (by name or hex ID). The intended use is to allow you to set just the settings that you need for a fix; a good example would be as follows:
[...]
[/quote]Great update, DarkStarSword and all the others working on 3Dmigoto!
Finally, there is no more need for the quite difficult profile installation procedure - especially for people doing it for the first time.
Do you think it would be feasible to port this feature to some kind of standalone application for use with DX9? That could solve the problems with profile installation once and for all.
[quote="helifax"]Thank you both;)
Now I have one more question;)
Is it possible to have access to my resolution W and H in a shader? (or the aspect ratio) ?
I know it used to work, but I can't find it anywhere...
I can't remember how I had to load the StereoParams :([/quote]
I believe you do this by creating a shader override section in the d3dx.ini file like:
[ShaderOverrideWhatever]
hash = _______
x1=rt_width
y1=rt_height
z1=res_width
w1=res_height
and then in the shader itself add
float4 ResParams = IniParams.Load(int2(1,0));
and ResParams.z will have the resolution width, and ResParams.w will have the height.
Haven't had to do this too often, and I'm typing this out by hand off the top of my head, so forgive me if I'm off, but I'm about 95% sure this is correct.