The Witcher 3: Wild Hunt
Would a bin -> ASM -> fixed ASM -> bin workflow help in any way?

Thanks to everybody using my assembler; it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?

donations: ulfjalmbrant@hotmail.com

Posted 05/22/2015 10:08 PM   
[quote="Flugan"]Would a bin -> ASM -> fixed ASM -> bin workflow be helping in any way.[/quote]Fully stripped shaders makes the Decompilers job harder and more prone to error, because now it has only the text itself to rely upon. That is possibly good enough, but is a change in direction from the underlying approach of using the James-Jones crosscompiler. This is the second game we've seen stripped headers on, so it could be that next-gen also means 'hostile.' The only problem with going straight binary->ASM and back is that it's a completely different workflow for the shaderhacker. It's viable, it's just harder. Harder sometimes means fewer people can do it. That's the main reason I push for the Decompiler as an easier path. The Decompiler is perhaps 2/3 complete on SM5. This game has two new instructions I've seen (that are presently not needed for a fix). And these reflection problems, which means having a straight-text path. My goal has always been that as time goes by and we fix more and more games, that the Decompiler keeps getting better and better. Until eventually we have a fully debugged Decompiler for all Shader Models. And also, my goal has never been to make the Decompiler some stand-alone computer science project and finish all the paths used or not, which is why I never work on it unless there are issues. That may not be the right approach as it leads to the perception that the Decompiler is a piece of junk and barely works. I'm sure I can fix these problems with the Decompiler here, it's just that it's not as fast as all the people here want. For reasons that escape me, people *always* want to play the game the day it's released. Even though we know for an absolute fact that waiting 2 weeks will be when the real version ships. So, my approach for the Decompiler doesn't work well in that scenario because people have to wait. The spot where direct ASM would work well would be for doing very fast turnaround of brand new games.
Flugan said: Would a bin -> ASM -> fixed ASM -> bin workflow help in any way?
Fully stripped shaders make the Decompiler's job harder and more prone to error, because it now has only the disassembly text itself to rely on. That is possibly good enough, but it is a change in direction from the underlying approach of using the James-Jones crosscompiler.
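
For anyone wondering what "inspect the binary" means in practice: D3DReflect() only succeeds while the reflection chunks are still present in the shader blob, so it doubles as a quick stripped-or-not test. A minimal C++ sketch (illustrative only, not 3Dmigoto's actual code):

[code]
// Minimal sketch, not 3Dmigoto code: probe a shader blob for reflection data.
// D3DReflect() should fail once the reflection chunks have been stripped
// (e.g. via D3DStripShader with D3DCOMPILER_STRIP_REFLECTION_DATA).
#include <d3dcompiler.h>
#include <cstdio>
#pragma comment(lib, "d3dcompiler.lib")

bool HasReflectionData(const void* bytecode, SIZE_T size)
{
    ID3D11ShaderReflection* reflector = nullptr;
    if (FAILED(D3DReflect(bytecode, size, IID_ID3D11ShaderReflection,
                          (void**)&reflector)))
        return false;   // stripped: only the disassembly text is left to use

    D3D11_SHADER_DESC desc = {};
    reflector->GetDesc(&desc);  // with reflection we can query names, types...
    printf("reflection present: %u bound resources\n", desc.BoundResources);
    reflector->Release();
    return true;
}
[/code]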

This is the second game we've seen stripped headers on, so it could be that next-gen also means 'hostile.'

The only problem with going straight binary->ASM and back is that it's a completely different workflow for the shaderhacker. It's viable, it's just harder. Harder sometimes means fewer people can do it. That's the main reason I push for the Decompiler as an easier path.


The Decompiler is perhaps 2/3 complete on SM5. This game has two new instructions I haven't seen before (presently not needed for a fix), and these reflection problems, which mean we need a straight-text path. My goal has always been that, as time goes by and we fix more and more games, the Decompiler keeps getting better and better, until eventually we have a fully debugged Decompiler for all Shader Models.

Also, my goal has never been to make the Decompiler a stand-alone computer-science project and finish all the paths, used or not, which is why I never work on it unless there are issues. That may not be the right approach, as it leads to the perception that the Decompiler is a piece of junk that barely works.


I'm sure I can fix these problems with the Decompiler here; it's just not as fast as people here want. For reasons that escape me, people *always* want to play a game the day it's released, even though we know for an absolute fact that the real version ships about two weeks later.

So my approach with the Decompiler doesn't work well in that scenario, because people have to wait. The spot where direct ASM would work well is very fast turnaround on brand-new games.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

Posted 05/22/2015 10:23 PM   
[quote="bo3b"][quote="helifax"]Ok here is the save file. It was taken with a fresh/unaltered shader (just this shader was swapped) for shadows: f9c83c25de6eaa51-ps_replace.txt I am not sure what this is but it looks like a shader compiling issue?[/quote] Sounds good thanks. I figured out what the problem is with resinfo Decompiling and it's worse than I thought- CDPR are not only stripping the text headers, they are also stripping the reflection information from the shader. That means that automated tools can't inspect the binary to get things like data types. That's why my carefully crafted resinfo decompiler (from Mordor) is failing here. They apparently really don't want modders here. It's weird that only some of the shaders are this way, and some are still complete. Could be something weird like NVidia supplying PCSR shadows in a stripped out form. Or vice-versa. I'll take a look in a bit to see if there is a workaround. At a minimum I can have it dump the ASM and we can hand fix the ones we need.[/quote] Interesting.... Doesn't surprise me... Another thing that I found interesting is the fact that SLI is not actually working... I always GET the same fps in SLI (no matter what monitor config) as in Single GPU... Normally in 2D surround I get 30fps with original flag in SLI. I disable SLI (still in 2D Surround) I get, guess what...30 fps... I change the SLI flag to another one (like sleeping dogs), guess what, I get 60fps in 2D Surround (with a lot of flickering....) I looked through the profile in detail and I noticed they are also setting some interesting flags which stripped make the game render incomplete... Flags are DX10DSCBNumber and DX10VSCBNumber. And I am talking about PLAIN 2D HERE (3D Vision is not even enabled)... The game is a mess tbh... This smells of hacky-hacks rather than optimizations... (either nvidia's side or CDProjectRed)... I really hope they will fix this stuff soon (both sides) Other game that uses the same flags is Shadow of Mordor (different values ofc). Without them 3D Vision wouldn't work properly not plain 2D. No idea exactly what they are controlling but it most be some rendering mode/setup. The flickering that is introduced (when changing the SLI flag) with proper SLI scaling is directly correlated with them. I guess it was too much to expect a proper game release (that got postponed like 2 times and all the other stuff that people are reporting/debating).... Sad... Oh well, until then hopefully we can fix all the other problems related with a proper 3D Vision fix ^_^
bo3b said:
helifax said:Ok here is the save file. It was taken with a fresh/unaltered shader (just this shader was swapped) for shadows:
f9c83c25de6eaa51-ps_replace.txt

I am not sure what this is but it looks like a shader compiling issue?

Sounds good, thanks.

I figured out what the problem is with resinfo decompiling, and it's worse than I thought: CDPR are not only stripping the text headers, they are also stripping the reflection information from the shader. That means automated tools can't inspect the binary to get things like data types. That's why my carefully crafted resinfo decompiler (from Mordor) is failing here. They apparently really don't want modders here.

It's weird that only some of the shaders are this way, while others are still complete. Could be something weird like NVidia supplying PCSR shadows in a stripped-out form. Or vice-versa.

I'll take a look in a bit to see if there is a workaround. At a minimum I can have it dump the ASM and we can hand fix the ones we need.


Interesting... it doesn't surprise me.
Another thing I found interesting is that SLI is not actually working.
I always get the same fps in SLI (no matter what monitor config) as on a single GPU.
Normally in 2D Surround I get 30fps with the original flag in SLI. If I disable SLI (still in 2D Surround) I get, guess what... 30fps.

If I change the SLI flag to another one (like Sleeping Dogs'), guess what, I get 60fps in 2D Surround (with a lot of flickering...).

I looked through the profile in detail and noticed they are also setting some interesting flags which, if stripped, make the game render incompletely. The flags are DX10DSCBNumber and DX10VSCBNumber.
And I am talking about PLAIN 2D here (3D Vision is not even enabled)...

The game is a mess, tbh... this smells of hacky hacks rather than optimizations (on either NVIDIA's side or CD Projekt RED's). I really hope they fix this stuff soon (both sides).

The other game that uses the same flags is Shadow of Mordor (with different values, of course). Without them, neither 3D Vision nor plain 2D would work properly. No idea exactly what they control, but it must be some rendering mode/setup.
The flickering introduced when changing the SLI flag (alongside proper SLI scaling) is directly correlated with them.

I guess it was too much to expect a proper game release (one that got postponed twice, plus all the other stuff people are reporting/debating)...
Sad...

Oh well, until then hopefully we can fix all the other problems related with a proper 3D Vision fix ^_^

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

Posted 05/22/2015 10:47 PM   
Sounds good. My ASM -> bin compiler is far from tested with real ASM fixes.

As it has never been used, there will be some initial issues.

Beyond making the compiler more robust, the only change would be having 3Dmigoto dump binary shaders, then decompiling to ASM with my external tool and compiling the (hopefully fixed) ASM file.
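
For reference, the bin -> ASM half could look something like this C++ sketch using Microsoft's D3DDisassemble (file names are just placeholders); the fixed ASM -> bin half would then go through my assembler, since the d3dcompiler DLL offers no assemble entry point:

[code]
// Rough sketch of the bin -> ASM step: read a dumped shader binary and
// write out its disassembly text. Error handling kept minimal.
#include <d3dcompiler.h>
#include <fstream>
#include <vector>
#pragma comment(lib, "d3dcompiler.lib")

int main()
{
    std::ifstream in("f9c83c25de6eaa51-ps.bin", std::ios::binary);
    std::vector<char> bin((std::istreambuf_iterator<char>(in)),
                          std::istreambuf_iterator<char>());

    ID3DBlob* asmText = nullptr;
    HRESULT hr = D3DDisassemble(bin.data(), bin.size(),
                                D3D_DISASM_ENABLE_INSTRUCTION_NUMBERING,
                                nullptr, &asmText);
    if (FAILED(hr))
        return 1;

    // The blob holds plain ASM text, ready for hand-fixing.
    std::ofstream out("f9c83c25de6eaa51-ps.asm");
    out.write((const char*)asmText->GetBufferPointer(),
              asmText->GetBufferSize());
    asmText->Release();
    return 0;
}
[/code]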

You've managed well so far, fixing the HLSL decompiler only where needed and doing manual conversions.

By using my external tool, minimal changes need to be made to 3Dmigoto, as it can already load binary files.

I'm just thinking out loud.

Thanks to everybody using my assembler; it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?

donations: ulfjalmbrant@hotmail.com

Posted 05/22/2015 10:48 PM   
Hey guys,
Reading this thread is still a lot of fun, even for those of us who have only the most basic idea of what you're talking about (I can follow what you're saying but could never expound upon or advance the conversation). The game is truly amazing, and from the bottom of my 3D heart, thanks for working on this. I for one am happy to play with the CM fix posted earlier until all is sorted. I get good FPS on a mid-range rig with all the important things turned on (I despise almost everything in the "post processing" options; why ruin great graphics with shite like chromatic aberration, imo?). The halos are bad in cutscenes, but that's a price worth paying for being able to play this so quickly. Please don't ever think this community is "demanding" anything; we are all a passionate bunch, else we wouldn't be reading this, right? I am thankful for this, and will save the "true" fix for when all the effects are fixed one day, even if that's for a replay. CM mode isn't ideal, but it looks oddly good in this game, gets great FPS from a demanding game, and all effects work perfectly: sunshafts, reflections, bloom, etc. But again, please don't feel any pressure; sooner is always better, but nobody ever died from not playing a game ;)

Best Regards

Core i7 920 @ 3.6Ghz, 6GB 3 Channel, SLi GTX670 2GB, SSD

Posted 05/23/2015 04:10 AM   
One thing that concerns me is that some of the shaders (including the shadow shader f9c83c25de6eaa51-ps_replace.txt) are using integer data types, which I don't believe 3Dmigoto handles correctly.

3Dmigoto currently casts ints to floats, which means any value larger than 16777216 may get truncated unless it is a power of 2 or otherwise needs no more than 24 significant bits to store. In other words, we are depending on luck, or on a truncation not causing a noticeable difference.
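
To make the truncation concrete, here's a toy C++ demonstration (not 3Dmigoto code); 16777216 is 2^24, the limit of a float's 24-bit significand:

[code]
// Integers above 2^24 only survive a round-trip through float if they fit
// in 24 significant bits.
#include <cstdio>

int main()
{
    int big = 16777217;              // 2^24 + 1: needs 25 significant bits
    float f = (float)big;            // rounds to 16777216.0f
    printf("%d -> %.1f\n", big, f);  // prints: 16777217 -> 16777216.0
    printf("%d -> %.1f\n", 16777218, (float)16777218); // exact: even, fits
    return 0;
}
[/code]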

I believe we should be using asint(), asuint() and asfloat() on any instruction that works on integers, instead of casting (these reinterpret the bit pattern, which is different from a cast). However, so far I haven't had much luck getting these to behave consistently (I don't get the original result, and the compiler is now optimising out some code I didn't expect), and I'm not sure if I'm missing something (could be, as this is a significant change) or if there are fundamental issues with these functions that make this approach unworkable.
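
For anyone following along, here is the cast-versus-reinterpret distinction expressed as a toy C++ example (asint()/asfloat() behave like the memcpy below; the current float casts behave like the value conversion):

[code]
// Value conversion versus bit reinterpretation on the same float.
#include <cstdio>
#include <cstring>
#include <cstdint>

int main()
{
    float f = 1.0f;

    int32_t converted = (int32_t)f;       // value cast: 1
    int32_t reinterpreted;                // bit reinterpret, like asint()
    std::memcpy(&reinterpreted, &f, sizeof f); // 0x3F800000

    printf("cast: %d  asint-style: 0x%08X\n",
           converted, (unsigned)reinterpreted);
    return 0;
}
[/code]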

I'll keep looking.

2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit

Alienware M17x R4 w/ built in 3D, Intel i7 3740QM, GTX 680m 2GB, 16GB DDR3 1600MHz RAM, Win7 64bit, 1TB SSD, 1TB HDD, 750GB HDD

Pre-release 3D fixes, shadertool.py and other goodies: http://github.com/DarkStarSword/3d-fixes
Support me on Patreon: https://www.patreon.com/DarkStarSword or PayPal: https://www.paypal.me/DarkStarSword

Posted 05/23/2015 05:20 AM   
[quote="vaelo"]I for one am happy to play with the CM fix posted earlier until all is sorted. /quote] Actually the alpha fix posted by Mike is running almost 3D vision ready. Skybox is perfect, shadows perfect, lights are excellent, the only thing - but so minor - I recognized was the reflection of a water puddle... but who cares? ;) Water reflections are also all excellent. So actually no need to play in CM with those ugly halos. The fix is IMHO already excellent.
vaelo said: I for one am happy to play with the CM fix posted earlier until all is sorted.

Actually, the alpha fix posted by Mike runs almost 3D Vision Ready. The skybox is perfect, shadows perfect, lights excellent; the only thing, and it is so minor, that I noticed was the reflection of a water puddle... but who cares? ;) Water reflections elsewhere are all excellent. So there is actually no need to play in CM with those ugly halos. The fix is IMHO already excellent.

Intel Core i7-3820, 4x 3.60 GHz overclocked to 4.50 GHz; EVGA Titan X 12GB VRAM; 16 GB Corsair Vengeance DDR-1600 (4x 4 GB); Asus VG278H 27-inch incl. 3D Vision 2 glasses, integrated transmitter; Xbox One Elite wireless controller; Windows 10; HTC VIVE, 2.5 m² roomscale.
3D VISION GAMERS - VISIT ME ON STEAM and feel free to add me: http://steamcommunity.com/profiles/76561198064106555
YOUTUBE: https://www.youtube.com/channel/UC1UE5TPoF0HX0HVpF_E4uPQ
STEAM CURATOR: https://store.steampowered.com/curator/33611530-Streaming-Deluxe/

Posted 05/23/2015 08:22 AM   
Sorry for the noob question, but in the last few pages I read that there are many problems with 3D and that many users are using the CM fix (which is fake 3D?).
But there is already a fix for REAL 3D (from Mike, page 24) which works very well!? Or am I getting something wrong? So what is the problem, and why should one use CM mode?
And then there are threads like "3D Vision is dead", yet thanks to some heroes, games like "The Witcher 3" can be fixed within a few hours of release! Thank you so much!

Win 8.1 pro 64 bit, Gigabyte Z87X-D3H - i7-4770K@3.5 - 32 GB, 2x Geforce TITAN SLI (EVGA), 3x ASUS 3D-Vision-Monitors

Posted 05/23/2015 08:50 AM   
[quote="DarkStarSword"]One thing that concerns me is that some of the shaders (including the shadow shader f9c83c25de6eaa51-ps_replace.txt) is using integer data types, which I don't believe 3Dmigoto handles correctly. 3Dmigoto currently casts ints to floats, which means any value larger than 16777216 may get truncated unless it is a power of 2, or does not require more than 24 significant bits to store. In other words - we are depending on luck, or a truncation not causing a noticeable difference. I believe we should be using asint(), asuint() and asfloat() on any instruction that works on integers instead of casting (these reinterpret the bit pattern, which is different to a cast), however so far I haven't had much luck getting these to behave consistently for me (I don't get the original result and the compiler is now optimising out some code I didn't expect), and I'm not sure if I'm missing something (could be - this is a significant change) or if there are fundamental issues with these functions that make this approach unworkable. I'll keep looking.[/quote] I don't know if I haven't asked this already some time ago... I am not as familiar with DirectX as I am with OpenGL and thus I need to ask. Does a DirectX game ship with pre-compiled shaders or are the shaders compiled at runtime and used? For example in OpenGL you always compile them at runtime (because the compiler might optimize things differently on certain hardware and you can't ship an universal shader set that works on all GPUS). So, in OGL you can have access to the original source code during creation time. Any shader swapping can be done there, or later after the shaders have been created, you can break them down and re-create them. Does DirectX works like this ? or is it shipped with the binary information and that binary information is used instead or recompiling the shaders ? @DarkStarSword & bo3b: Are you guys looking into the issue with the missing specular/reflection information that is found in some shaders ? or there isn't anything we can do about it currently? Should I wait for a fix or should I go further and try to identify new shaders and fix them ? Saw a couple decals that are in 2D ;))
DarkStarSword said: One thing that concerns me is that some of the shaders (including the shadow shader f9c83c25de6eaa51-ps_replace.txt) are using integer data types, which I don't believe 3Dmigoto handles correctly.

3Dmigoto currently casts ints to floats, which means any value larger than 16777216 may get truncated unless it is a power of 2 or otherwise needs no more than 24 significant bits to store. In other words, we are depending on luck, or on a truncation not causing a noticeable difference.

I believe we should be using asint(), asuint() and asfloat() on any instruction that works on integers, instead of casting (these reinterpret the bit pattern, which is different from a cast). However, so far I haven't had much luck getting these to behave consistently (I don't get the original result, and the compiler is now optimising out some code I didn't expect), and I'm not sure if I'm missing something (could be, as this is a significant change) or if there are fundamental issues with these functions that make this approach unworkable.

I'll keep looking.


I'm not sure whether I've already asked this some time ago... I am not as familiar with DirectX as I am with OpenGL, so I need to ask.
Does a DirectX game ship with pre-compiled shaders, or are the shaders compiled at runtime? For example, in OpenGL you always compile them at runtime (because the compiler might optimize things differently on certain hardware, so you can't ship a universal shader set that works on all GPUs). So in OGL you have access to the original source code at creation time. Any shader swapping can be done there; or later, after the shaders have been created, you can break them down and re-create them.

Does DirectX work like this? Or does a game ship with the binary information, which is then used instead of recompiling the shaders?

@DarkStarSword & bo3b:
Are you guys looking into the issue with the missing specular/reflection information found in some shaders? Or is there nothing we can do about it currently?
Should I wait for a fix, or should I go further and try to identify new shaders and fix them? I saw a couple of decals that are in 2D ;))

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

Posted 05/23/2015 08:52 AM   
[quote="3D_Mike"]Sorry for noob question but in the last threads I read that there are many problems with 3D and many users are using the CM fix (which is fake 3D?). But there is already a fix for REAL 3D (from Mike, page 24) which works very good!!? Or do I get something wrong? So what is the problem and why should one use CM-mode? And then there are threads like "3D vision is dead" but thanks to some heroes games like "The witcher 3" could be fixed within a few hours after release! Thank you so much! [/quote] The fix is very good but still a work in progress hairworks doesn't work with the fix and according to other user's shadows and water effects are not right. I'm using CM mode so I can use the other features like hairworks when Mike's fix is final I will use his fix.
3D_Mike said: Sorry for the noob question, but in the last few pages I read that there are many problems with 3D and that many users are using the CM fix (which is fake 3D?).
But there is already a fix for REAL 3D (from Mike, page 24) which works very well!? Or am I getting something wrong? So what is the problem, and why should one use CM mode?
And then there are threads like "3D Vision is dead", yet thanks to some heroes, games like "The Witcher 3" can be fixed within a few hours of release! Thank you so much!


The fix is very good but still a work in progress: HairWorks doesn't work with it, and according to other users, shadows and water effects are not right. I'm using CM mode so I can use the other features like HairWorks; when Mike's fix is final, I will switch to it.

Gigabyte Z370 Gaming 7, 32GB RAM, i9-9900K, Gigabyte Aorus Extreme Gaming 2080Ti (single), Game Blaster Z, Windows 10 x64 build #17763.195, Define R6 Blackout case, Corsair H110i GTX, SanDisk 1TB (OS), SanDisk 2TB SSD (games), Seagate EXOs 8 and 12 TB drives, Samsung UN46c7000 HD TV, Samsung UN55HU9000 UHD TV. Currently using ACER PASSIVE EDID override on 3D TVs. LG 55

Posted 05/23/2015 09:18 AM   
[quote="helifax"][quote="DarkStarSword"]One thing that concerns me is that some of the shaders (including the shadow shader f9c83c25de6eaa51-ps_replace.txt) is using integer data types, which I don't believe 3Dmigoto handles correctly. 3Dmigoto currently casts ints to floats, which means any value larger than 16777216 may get truncated unless it is a power of 2, or does not require more than 24 significant bits to store. In other words - we are depending on luck, or a truncation not causing a noticeable difference. I believe we should be using asint(), asuint() and asfloat() on any instruction that works on integers instead of casting (these reinterpret the bit pattern, which is different to a cast), however so far I haven't had much luck getting these to behave consistently for me (I don't get the original result and the compiler is now optimising out some code I didn't expect), and I'm not sure if I'm missing something (could be - this is a significant change) or if there are fundamental issues with these functions that make this approach unworkable. I'll keep looking.[/quote]I don't know if I haven't asked this already some time ago... I am not as familiar with DirectX as I am with OpenGL and thus I need to ask. Does a DirectX game ship with pre-compiled shaders or are the shaders compiled at runtime and used? For example in OpenGL you always compile them at runtime (because the compiler might optimize things differently on certain hardware and you can't ship an universal shader set that works on all GPUS). So, in OGL you can have access to the original source code during creation time. Any shader swapping can be done there, or later after the shaders have been created, you can break them down and re-create them. Does DirectX works like this ? or is it shipped with the binary information and that binary information is used instead or recompiling the shaders ?[/quote] It varies depending upon the game, but the vast majority ship only binary shaders. This is based on early use of 3Dmigoto where d3dcompiler_46,43,42,41,&39 were also wrapped. I didn't see any of our early games using those compilers at game runtime, and removed them for simplicity. Although now that you mention it, it would be interesting to hook d3dcompiler_46 and d3dcompiler_47 to see if current generation are compiling at runtime. I see Witcher3 include d3dcompiler_47 as part of their runtime, which you'd only do by accident, or if you were compiling at runtime. With regard to the use of ints vs. floats- my experience here is that in 99.99% of the cases, this just doesn't matter. We only patch a very small handful of shaders in a game, and we look for anomalies in each case. Sometimes we see them, sometimes we don't. If it causes something noticeable, we definitely take a look and tune it better manually. As you've found, it's really hard to get the fxc to generate the same code, because it's a very aggressive compiler, and there are no good switches to control its behavior. I've hand-checked over a thousand shaders, and in something like 95% of the cases the code is different, but actually has the same runtime effect. It's also worth noting that as far as compilers go, fxc is pretty buggy. I had one case where it clearly generates bad code, and had to rewrite the sequence to avoid it. There are a lot of online examples of overly aggressive optimizations stripping code that should be included as well. 
Also, I've spent a fair amount of time playing with asint and (int) casts, and I don't think there is a difference that fxc cares about. There should be a difference, but as near as I can tell from code generation the fxc treats them the same. For some hand-fix scenarios I'll switch some of the variable definitions to be int or uint, and that will let fxc generate non-converting scenarios. But this is usually at the expense of code that is wildly different from the original ASM. It's still correct as near as I can tell, but essentially there are zero matching lines in orig-ASM to recomp-ASM diffs. Given that we are hackers here, in my opinion, it's not worth worrying about the float to int conversions except in the super rare cases where we see them damage the image. This may just be a difference of approach though. I'm genuinely interested only in results- games fixed, not having a 'pure' decompiler. If you are looking for perfection, the only possible answer is to use ASM. Remembering of course that the games themselves are far from perfect, and people using SweetFX are essentially saying that the game look as shipped is wrong. These are far more dramatic changes than stuff we'd see with integer clipping. Maybe more to the point- I personally am completely happy to play games with [i]disabled[/i] effects, let alone imperfectly fixed effects. But maybe HLSL is not the way to go. At the end of the day this is just a judgment call, and maybe if you are a shaderhacker, HLSL is irrelevant and ASM is the way to go. I've had three different people who have experienced both say they prefer HLSL, but they of course would be able to get results either way. The Decompiler has a lot of bugs, no question. But for real-world software, it's important to keep in mind that it's still possible to get a lot of good results with buggy software. It all depends upon what is most important to you. As an example, I know that Mike has a 90% rule for games, knowing that the last 10% of 3D glitches are usually another 90% of the time spent on a fix. Mike and I both have the engineers mindset that we'd much rather spend our time producing more fixes, more games, deliberately at the expense of perfection. Everyone has a different bar for what is acceptable, so there is no question there will be disagreements about when a given fix is 'done' or worth playing. Just my opinion, but I think that as 3D gamers we are better served with a larger number of lower-quality fixes. This is why I emphasize quick fixes like disabled effects and shortcuts like HLSL.
helifax said:
DarkStarSword said: One thing that concerns me is that some of the shaders (including the shadow shader f9c83c25de6eaa51-ps_replace.txt) are using integer data types, which I don't believe 3Dmigoto handles correctly.

3Dmigoto currently casts ints to floats, which means any value larger than 16777216 may get truncated unless it is a power of 2 or otherwise needs no more than 24 significant bits to store. In other words, we are depending on luck, or on a truncation not causing a noticeable difference.

I believe we should be using asint(), asuint() and asfloat() on any instruction that works on integers, instead of casting (these reinterpret the bit pattern, which is different from a cast). However, so far I haven't had much luck getting these to behave consistently (I don't get the original result, and the compiler is now optimising out some code I didn't expect), and I'm not sure if I'm missing something (could be, as this is a significant change) or if there are fundamental issues with these functions that make this approach unworkable.

I'll keep looking.
I'm not sure whether I've already asked this some time ago... I am not as familiar with DirectX as I am with OpenGL, so I need to ask.
Does a DirectX game ship with pre-compiled shaders, or are the shaders compiled at runtime? For example, in OpenGL you always compile them at runtime (because the compiler might optimize things differently on certain hardware, so you can't ship a universal shader set that works on all GPUs). So in OGL you have access to the original source code at creation time. Any shader swapping can be done there; or later, after the shaders have been created, you can break them down and re-create them.

Does DirectX work like this? Or does a game ship with the binary information, which is then used instead of recompiling the shaders?

It varies depending on the game, but the vast majority ship only binary shaders. This is based on early use of 3Dmigoto, where d3dcompiler_46, 43, 42, 41, and 39 were also wrapped. I didn't see any of our early games using those compilers at runtime, and removed them for simplicity.

Although now that you mention it, it would be interesting to hook d3dcompiler_46 and d3dcompiler_47 to see if the current generation is compiling at runtime. I see Witcher 3 includes d3dcompiler_47 as part of its runtime, which you'd only do by accident, or if you were compiling at runtime.
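
If anyone wants to try, the hook could be as simple as this bare-bones C++ sketch of a wrapper d3dcompiler_47.dll that logs calls and forwards to the real DLL (names and paths are illustrative; a real wrapper would also need to export D3DCompile, e.g. via a .def file, and proper error handling):

[code]
// Wrapper DLL sketch: intercept D3DCompile, log it, forward to the real
// d3dcompiler_47. If the log fills up, the game is compiling at runtime.
#include <windows.h>
#include <d3dcommon.h>
#include <cstdio>

typedef HRESULT (WINAPI *PFN_D3DCompile)(
    LPCVOID, SIZE_T, LPCSTR, const D3D_SHADER_MACRO*, ID3DInclude*,
    LPCSTR, LPCSTR, UINT, UINT, ID3DBlob**, ID3DBlob**);

extern "C" HRESULT WINAPI D3DCompile(
    LPCVOID src, SIZE_T size, LPCSTR name, const D3D_SHADER_MACRO* defines,
    ID3DInclude* include, LPCSTR entry, LPCSTR target,
    UINT flags1, UINT flags2, ID3DBlob** code, ID3DBlob** errors)
{
    static PFN_D3DCompile real = []() {
        HMODULE dll = LoadLibraryA("C:\\Windows\\System32\\d3dcompiler_47.dll");
        return (PFN_D3DCompile)GetProcAddress(dll, "D3DCompile");
    }();

    // Runtime compilation happening: log the entry point and target profile.
    FILE* log = fopen("d3dcompile_log.txt", "a");
    if (log) {
        fprintf(log, "D3DCompile: %s %s\n",
                entry ? entry : "?", target ? target : "?");
        fclose(log);
    }

    return real(src, size, name, defines, include, entry, target,
                flags1, flags2, code, errors);
}
[/code]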


With regard to the use of ints vs. floats: my experience here is that in 99.99% of cases this just doesn't matter. We only patch a very small handful of shaders in a game, and we look for anomalies in each case. Sometimes we see them, sometimes we don't. If something causes a noticeable problem, we definitely take a look and tune it manually.

As you've found, it's really hard to get fxc to generate the same code, because it's a very aggressive compiler and there are no good switches to control its behavior. I've hand-checked over a thousand shaders, and in something like 95% of cases the code is different but has the same runtime effect. It's also worth noting that, as far as compilers go, fxc is pretty buggy. I had one case where it clearly generated bad code, and I had to rewrite the sequence to avoid it. There are also plenty of online examples of overly aggressive optimizations stripping code that should be included.

Also, I've spent a fair amount of time playing with asint and (int) casts, and I don't think there is a difference that fxc cares about. There should be a difference, but as near as I can tell from the generated code, fxc treats them the same.

For some hand-fix scenarios I'll switch some of the variable definitions to int or uint, which lets fxc generate non-converting code. But this usually comes at the expense of code that is wildly different from the original ASM. It's still correct as near as I can tell, but there are essentially zero matching lines in an orig-ASM to recompiled-ASM diff.

Given that we are hackers here, in my opinion it's not worth worrying about float-to-int conversions except in the super-rare cases where we see them damage the image. This may just be a difference of approach, though: I'm genuinely interested only in results (games fixed), not in having a 'pure' decompiler. If you are looking for perfection, the only possible answer is to use ASM.

Remember, of course, that the games themselves are far from perfect, and people using SweetFX are essentially saying that the game's shipped look is wrong. Those are far more dramatic changes than anything we'd see from integer clipping.

Maybe more to the point- I personally am completely happy to play games with disabled effects, let alone imperfectly fixed effects.


But maybe HLSL is not the way to go. At the end of the day this is a judgment call, and maybe for a shaderhacker HLSL is irrelevant and ASM is the way to go. I've had three different people who have experienced both say they prefer HLSL, but of course they could get results either way.

The Decompiler has a lot of bugs, no question. But for real-world software, it's important to keep in mind that it's still possible to get a lot of good results with buggy software. It all depends upon what is most important to you.

As an example, I know Mike has a 90% rule for games, knowing that the last 10% of 3D glitches usually costs another 90% of the time spent on a fix. Mike and I both have the engineer's mindset that we'd much rather spend our time producing more fixes for more games, deliberately at the expense of perfection. Everyone has a different bar for what is acceptable, so there will no doubt be disagreements about when a given fix is 'done' or worth playing.


Just my opinion, but I think that as 3D gamers we are better served with a larger number of lower-quality fixes. This is why I emphasize quick fixes like disabled effects and shortcuts like HLSL.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

Posted 05/23/2015 09:58 AM   
@bo3b: Agreed!
Also, you can always revisit a fix later on and improve it ;)) (like I did with Wolfie: The Old Order and others). And as long as 90% is working properly, you can turn a blind eye to the rest (in this case literally ^_^).

Can you check whether the shaders are compiled at runtime? I think that would help us a lot here, since we would then have access to the full source code of the original shaders and could see exactly what is what ;))

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

Posted 05/23/2015 10:25 AM   
I found a simple solution to launch the game in 720p: set the Windows desktop resolution to 1280x720. No other tweak is needed.

Posted 05/23/2015 11:29 AM   
Hello,

First, many thanks for the work.
I've played it for some hours.
I just noticed that the shadows outside are a little bit too dark after enabling the fixes.

Posted 05/23/2015 11:43 AM   
How do you know whether CM mode is enabled or not? I pressed Ctrl+Alt+F11 but nothing seems to happen. I tried remapping it to another key combination, but I don't see any visible effect.

In case it's useful: I tried Mike's fix as well and it seems to work fine. The only problems I noticed happen when you alt-tab to the desktop and come back; sometimes the shadows are all messed up. And if you die and reload, it will very often crash to the desktop.

Posted 05/23/2015 12:25 PM   