3Dmigoto now open-source...
Your reasoning for not hooking asm up to CopyOnMark sounds good to me :)

I'll give the new version a go this weekend.


On another note - we need to make some improvements to the hunting toggle key binding. I'm hitting a few issues repeatedly:
- A shader is not fixed when launching the game. While playing I dump it out, replace it and reload. As soon as I hit Insert to turn off hunting mode, the original shader is used instead of the one I just fixed.
- I have hunting mode disabled while loading a level and I find I cannot dump any shaders loaded with the level.
- I have hunting mode disabled while loading a level and show_original does not work for any shaders that were replaced while it was off.

I'm thinking we probably want to distinguish between hunting being hard disabled in the ini file, and soft disabled by the key binding. If hard disabled I don't expect any of the above to work, but if it's only soft disabled we should compromise and keep enough code enabled that the above works as expected.
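
Something like this tri-state split is the idea (a rough sketch only; HuntingState and these helper names are illustrative, not actual 3Dmigoto code):

enum class HuntingState {
    HardDisabled,  // hunting=0 in the d3dx.ini: skip hunting code entirely
    SoftDisabled,  // toggled off by the key binding: keep bookkeeping alive
    Enabled,       // full hunting: overlay, cycling, marking, dumping
};

static HuntingState gHuntingState = HuntingState::Enabled;

// Shader tracking/reload bookkeeping keeps running unless hard disabled,
// so dumped/replaced shaders and show_original still work after a toggle.
static bool ShouldTrackShaders() {
    return gHuntingState != HuntingState::HardDisabled;
}

// Overlay drawing and the hunting key bindings only run when fully enabled.
static bool HuntingActive() {
    return gHuntingState == HuntingState::Enabled;
}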

When it boils down to it I'm not really using the toggle key to turn off hunting mode (when I get around to Ryse I may want this for performance) - I'm really using it to turn off the overlay, so a quicker fix might be to just add a separate toggle for the overlay instead.

2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit

Alienware M17x R4 w/ built in 3D, Intel i7 3740QM, GTX 680m 2GB, 16GB DDR3 1600MHz RAM, Win7 64bit, 1TB SSD, 1TB HDD, 750GB HDD

Pre-release 3D fixes, shadertool.py and other goodies: http://github.com/DarkStarSword/3d-fixes
Support me on Patreon: https://www.patreon.com/DarkStarSword or PayPal: https://www.paypal.me/DarkStarSword

Posted 05/29/2015 05:36 PM   
OK, thanks. Should be a fairly usable workflow now with ASM too.


For the Hunting toggle, yeah, that sounds like bugs. My goal there was to have it ignore the master switch for any loading of shaders, and always load them. Sounds busted; I'll take a look.

Not certain, but I think all the performance gain came from disabling the active shader maps, which is what the Insert key toggles on/off. If that doesn't seem doable, or performance takes a hit, then we can switch it to just overlay on/off, although there wouldn't be much point because the performance would still be bad.

I probably missed allowing shaders to be loaded, and will prioritize looking at this.


Also of note, I just pushed up a fix for the crashes when the evil-update is NOT installed. Was passing a HackerDevice to create the real SwapChain, and it wasn't too happy. The big surprise is that the platform update/evil update actually made that work.

After I test again to be sure that doesn't break it the other way around, I'll make a new build.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

Posted 05/29/2015 06:16 PM   
Just put up the latest build 1.1.12. I think it fixes both of these problems: the hunting bugs, and the crash with no evil update.

https://github.com/bo3b/3Dmigoto/releases/download/0.99.50-alpha/3Dmigoto-1.1.12.zip


The hunting fix is marginal at the moment, but should fix the immediate problems. I want to take a closer look at performance before finalizing anything there.

I tested the evil-update both ways, with it installed, and with it absent, and it no longer crashes.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

Posted 05/29/2015 07:41 PM   
bo3b said: Just put up the latest build 1.1.12. I think it fixes both of these problems: the hunting bugs, and the crash with no evil update.

https://github.com/bo3b/3Dmigoto/releases/download/0.99.50-alpha/3Dmigoto-1.1.12.zip


The hunting fix is marginal at the moment, but should fix the immediate problems. I want to take a closer look at performance before finalizing anything there.

I tested the evil-update both ways, with it installed, and with it absent, and it no longer crashes.


- Tested this version with DA:I, still crashes (Evil Update installed).
- Tested Witcher 3 - Works. However, it crashes if I add a proxy lib :(
If you need the log from Witcher 3 + ProxyLib, let me know and I can PM it ;)

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

Posted 05/29/2015 08:47 PM   
I've reintroduced the HLSL decompiler into my toolset.

It is currently sitting in my assembler testing tool.

It appears that the HLSL decompiler either fails gracefully or simply crashes the program with an access violation. It is clearly useful for marked shaders but seems unreliable on a Witcher 3 dump. I'm exporting bin files from the wrapper, which allow me to generate ASM or HLSL unless it fails. I don't know how to go from ASM to HLSL without the binary. Just dumping ASM makes it impossible to fix those shaders using HLSL.
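
For reference, the bin-to-ASM step can be done with d3dcompiler's D3DDisassemble; a minimal sketch (the file handling here is illustrative, not the wrapper's actual code):

#include <d3dcompiler.h>  // link against d3dcompiler.lib
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

// Disassemble a dumped .bin shader (DXBC bytecode) to ASM text.
// Returns an empty string on failure rather than crashing the caller.
std::string DisassembleBin(const std::string &path) {
    std::ifstream f(path, std::ios::binary);
    std::vector<char> bytecode((std::istreambuf_iterator<char>(f)),
                               std::istreambuf_iterator<char>());
    ID3DBlob *blob = nullptr;
    HRESULT hr = D3DDisassemble(bytecode.data(), bytecode.size(),
                                0, nullptr, &blob);
    if (FAILED(hr) || !blob)
        return std::string();
    std::string text((const char *)blob->GetBufferPointer(),
                     blob->GetBufferSize());
    blob->Release();
    return text;
}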

I need to test this setup in practice. I don't dare to put the HLSL decompiler inside the wrapper if it crashes the wrapper when used.

I'm also having some issues in DX10, but that can wait; nothing too important.

Thanks to everybody using my assembler it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?

donations: ulfjalmbrant@hotmail.com

Posted 05/31/2015 07:31 PM   
said: I've reintroduced the HLSL decompiler into my toolset.

It is currently sitting in my assembler testing tool.

It appears that the HLSL decompiler either fails gracefully or simply crashes the program with an access violation. It is clearly useful for marked shaders but seems unreliable on a Witcher 3 dump. I'm exporting bin files from the wrapper, which allow me to generate ASM or HLSL unless it fails. I don't know how to go from ASM to HLSL without the binary. Just dumping ASM makes it impossible to fix those shaders using HLSL.

I need to test this setup in practice. I don't dare to put the HLSL decompiler inside the wrapper if it crashes the wrapper when used.

I'm also having some issues in DX10, but that can wait; nothing too important.

Yes, the Decompiler can get exceptions when running across things it hasn't seen, bugs, or even syntax errors typed in. It could be more robust, but the way to handle this is to wrap the Decompile operation with a try/catch block.
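
That guard is roughly this (a sketch; Decompile() stands in for the real decompiler entry point, and note that catch(...) only traps an access violation when compiled with MSVC's /EHa, otherwise it needs SEH's __try/__except):

#include <string>
#include <vector>

std::string Decompile(const std::vector<char> &bytecode);  // hypothetical entry point

// Guard the decompile so an exception falls back to ASM instead of
// taking the whole wrapper down with it.
std::string SafeDecompile(const std::vector<char> &bytecode) {
    try {
        return Decompile(bytecode);
    } catch (...) {
        // Log the failure and return empty so the caller can fall back.
        return std::string();
    }
}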

For Witcher3, there are definitely exceptions that we can see in the log. So far they don't seem to be anything we've needed for the fix, so I haven't prioritized looking into them.


Speaking of the Assembler, how do I know if there is a failure? Like someone typing a syntax error into the ASM code? People will make mistakes, and it'd be good if we could report them clearly. Right now I don't see any error returns possible, so I currently re-disassemble the binary as a check.


I neglected to mention that 1.1.14 is up, which fixes the crashes with Dragon Age: Inquisition, so the 1.1 branch works there now too, and includes DarkStarSword's fix for Witcher 3 hangs on shader decompile errors.

Latest version:


https://github.com/bo3b/3Dmigoto/releases/download/0.99.50-alpha/3Dmigoto-1.1.14.zip

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

Posted 05/31/2015 07:57 PM   
As you've noticed there is little error handling, as it is not based around a normal parser.

It is very strict about syntax and is not very good at handling humans.

I test the decompiled ASM files and binaries to see if I can generate a binary matching the original. It has been tested against most DX11 games I own and some DX10 games.
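
That round-trip test is roughly the following (Assemble()/Disassemble() standing in for the assembler's real entry points):

#include <string>
#include <vector>

std::string Disassemble(const std::vector<char> &bin);   // hypothetical
std::vector<char> Assemble(const std::string &asmText);  // hypothetical

// Disassemble the original binary, reassemble the text, and verify the
// result matches the original byte for byte.
bool RoundTripMatches(const std::vector<char> &original) {
    return Assemble(Disassemble(original)) == original;
}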

It's kind of backwards, as I don't know what the correct output for a text instruction should be unless I've seen it in binary code.

Usually the fix is a lot smaller than the rest of the code so it could be handled without error handling.

I'm not saying it wouldn't be good. You could even do assembly/disassembly instruction by instruction. I still don't know how Microsoft handles bad binary, so that could be a bad idea.

Edit: It is hard to detect errors, as it could be that the error generates valid code.
Just generating the correct binary code is hard by itself. When working with computer-generated code I took shortcuts compared to a user-friendly assembler, which starts with a parser.

Thanks to everybody using my assembler it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?

donations: ulfjalmbrant@hotmail.com

Posted 05/31/2015 08:31 PM   
said: As you've noticed there is little error handling, as it is not based around a normal parser.

It is very strict about syntax and is not very good at handling humans.

I test the decompiled ASM files and binaries to see if I can generate a binary matching the original. It has been tested against most DX11 games I own and some DX10 games.

It's kind of backwards, as I don't know what the correct output for a text instruction should be unless I've seen it in binary code.

Usually the fix is a lot smaller than the rest of the code so it could be handled without error handling.

I'm not saying it wouldn't be good. You could even do assembly/disassembly instruction by instruction. I still don't know how Microsoft handles bad binary, so that could be a bad idea.

Edit: It is hard to detect errors, as it could be that the error generates valid code.
Just generating the correct binary code is hard by itself. When working with computer-generated code I took shortcuts compared to a user-friendly assembler, which starts with a parser.

OK, good to know.

I'd like to suggest one change that I think would help a lot for catching errors: you parse the instruction text and look for a match, right? If there is no match, it would be good if it could bail out with a null bytecode. That would handle any missing mystery instructions as well as catch the most common syntax errors.
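
In sketch form, assuming a mnemonic-to-opcode table inside the assembler (all names here are hypothetical, not the assembler's actual code):

#include <cstdint>
#include <map>
#include <string>
#include <vector>

static const std::map<std::string, uint32_t> opcodeTable = {
    { "add", 0 }, { "mov", 1 }, { "mul", 2 },  // etc.
};

std::vector<char> EncodeInstruction(uint32_t opcode, const std::string &line);  // hypothetical

// Bail out with a null (empty) bytecode when the mnemonic isn't known,
// so the caller can report the bad line instead of emitting garbage.
std::vector<char> AssembleInstruction(const std::string &line) {
    std::string mnemonic = line.substr(0, line.find_first_of(" \t"));
    auto it = opcodeTable.find(mnemonic);
    if (it == opcodeTable.end())
        return {};  // unknown instruction or syntax error
    return EncodeInstruction(it->second, line);
}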

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

Posted 06/02/2015 03:56 AM   
It should be quite easy to throw an exception if an opcode does not exist.

The same goes for variables, but that might be trickier.

Making sure add has three variables, etc.
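
Roughly like this (the expected-operand-count table and ValidateInstruction are just to illustrate the idea):

#include <cstddef>
#include <map>
#include <stdexcept>
#include <string>
#include <vector>

// Throw on an unknown opcode or a wrong operand count, e.g. making
// sure add has exactly three operands.
void ValidateInstruction(const std::string &mnemonic,
                         const std::vector<std::string> &operands) {
    static const std::map<std::string, std::size_t> expected = {
        { "mov", 2 }, { "add", 3 }, { "mul", 3 }, { "mad", 4 },  // etc.
    };
    auto it = expected.find(mnemonic);
    if (it == expected.end())
        throw std::runtime_error("unknown opcode: " + mnemonic);
    if (operands.size() != it->second)
        throw std::runtime_error(mnemonic + ": wrong operand count");
}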

It's harder to account for mistakes that result in valid code.

Doing a recompile test can make sure the original code decompiles as it should, but that really doesn't cover manually changed code.

But to be honest the code would have to be pretty much rewritten if it needs to handle human error.

Thanks to everybody using my assembler it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?

donations: ulfjalmbrant@hotmail.com

Posted 06/02/2015 09:57 AM   
I have a 3Dmigoto issue. I'm taking a look at a DX11 UE3 game and I need "AllowNvidiaStereo3d" enabled. With the flag set to "True", the game runs OK without the 3Dmigoto files. But when I add the files, the game crashes almost immediately. I can only get 3Dmigoto to work if I set "AllowNvidiaStereo3d" to "false". I'm using version 1.1.1. Log attached.
Attachments

d3d11_log.txt.jpg

Temporary Account

Posted 06/08/2015 11:01 AM   
1.1.1 isn't worth using. Always use either the latest from the 1.1.x series (currently 1.1.16) or the old 1.0.1 stable version.

I fixed a couple of crashing bugs in ARK (a UE4 game) the other day, which are in 1.1.16. Give that version a go and see if the crashes still occur:


https://github.com/bo3b/3Dmigoto/releases/download/0.99.50-alpha/3Dmigoto-1.1.16.zip


Which game is it? I've only come across one UE3 game that officially supported DX11 (Eleusis), but the devs later patched the DX11 support out. There is a standard command line parameter to UE3 games to select DX9 or DX11, but in most games I've tried DX11 mode just crashes straight away.

2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit

Alienware M17x R4 w/ built in 3D, Intel i7 3740QM, GTX 680m 2GB, 16GB DDR3 1600MHz RAM, Win7 64bit, 1TB SSD, 1TB HDD, 750GB HDD

Pre-release 3D fixes, shadertool.py and other goodies: http://github.com/DarkStarSword/3d-fixes
Support me on Patreon: https://www.patreon.com/DarkStarSword or PayPal: https://www.paypal.me/DarkStarSword

Posted 06/08/2015 11:31 AM   
Thanks DarkStar! 1.1.16 works. I mistakenly thought that 1.0.1 was the most recent release (I meant to write 1.0.1 in my previous post).

I'm looking at D4. Enabling AllowNvidiaStereo3d fixed the crashing issues I was getting whenever I Alt-Tab out of the game. I'm still not sure if I can fix this game, though. Some issues only occur during cutscenes. Also, pausing the game makes the shaders for the scene go inactive (preventing me from cycling through them).

Off the top of my head, Mortal Kombat X & Killing Floor 2 are also Dx11 UE3.

Temporary Account

Posted 06/08/2015 01:37 PM   
Hello there, 3D enthusiast community.

I have a question. I'm using an older-DX-to-DX11 wrapper (dgVoodoo) to enable 3D Vision for the classic Tomb Raider series (DX5/DX6 games). It's working great, except the HUD is at the wrong depth, so much so that it goes off screen.


Is it possible to catch the HUD and return it to screen depth some way?

I tried to fiddle with the NVIDIA game profile, giving "RHWgreateratscreen" different values, because that's what used to fix it with the legacy stereo drivers back when that setting was in the registry rather than in 3D profiles, but this time it seems to make no change.

Posted 06/08/2015 10:25 PM   
innuendo1231b said: Is it possible to catch the HUD and return it to screen depth some way?
Try adding it to the Tomb Raider: Anniversary profile as mentioned by Kingping1 in this thread:

https://forums.geforce.com/default/topic/828843/3d-vision/old-dx7-and-below-games-3d-vision/
Posted 06/08/2015 11:05 PM   
4everAwake said: I'm still not sure if I can fix this game, though. Some issues only occur during cutscenes. Also, pausing the game makes the shaders for the scene go inactive (preventing me from cycling through them).

Ah, then you might be interested in a brand new feature I just implemented in 1.1.17:

https://github.com/bo3b/3Dmigoto/releases/download/0.99.50-alpha/3Dmigoto-1.1.17.zip

Open the d3dx.ini and find this line:

;analyse_frame=VK_F8


Uncomment it, then when you see a broken shader in a cutscene hit F8 (hunting mode must be enabled). The game will hang for several minutes and your hard drive light will go on - this is normal. You will also need a *lot* of disk space every time you press this (at the moment it's consuming over 16GB every time it is used in The Witcher 3, and this is why the feature is not enabled by default).

Once it's finished you will probably have a few thousand files like this:

...\FrameAnalysis-2015-06-09-010253\001073-0-9d2db31d7bb1bd59-f522a33026aa87af.jps

The filename is FrameAnalysis-date-time\draw#-rendertarget-vertexshader-pixelshader. The JPS files are easier to work with, but may be missing information found in the DDS files (e.g. alpha channel), and are not present for every render target (in particular, they are not generated for the depth targets in The Witcher 3).
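
If you want to script over a dump, splitting a filename back into those parts is straightforward; a tiny illustrative helper (not part of 3Dmigoto):

#include <sstream>
#include <string>
#include <vector>

// Split e.g. "001073-0-9d2db31d7bb1bd59-f522a33026aa87af.jps" into
// {draw, rendertarget, vertexshader, pixelshader}.
std::vector<std::string> ParseFrameAnalysisName(std::string name) {
    name = name.substr(0, name.rfind('.'));  // drop the .jps/.dds extension
    std::vector<std::string> parts;
    std::istringstream ss(name);
    for (std::string part; std::getline(ss, part, '-'); )
        parts.push_back(part);
    return parts;
}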

They are the contents of each and every Texture2D render target & depth target (D) after every single draw call made in a single frame. You can open them to see how the frame was constructed and identify which shaders were used for the broken effects. Note that some of the render targets may not be cleared at the start of each frame, so they may still contain the image from the previous frame making it hard to see when they draw each object - I plan to add a feature to allow 3Dmigoto to optionally clear them instead of the game to make this easier.

This is still a pretty new feature... feel free to give me feedback on it or suggestions of ways it could be improved.

2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit

Alienware M17x R4 w/ built in 3D, Intel i7 3740QM, GTX 680m 2GB, 16GB DDR3 1600MHz RAM, Win7 64bit, 1TB SSD, 1TB HDD, 750GB HDD

Pre-release 3D fixes, shadertool.py and other goodies: http://github.com/DarkStarSword/3d-fixes
Support me on Patreon: https://www.patreon.com/DarkStarSword or PayPal: https://www.paypal.me/DarkStarSword

Posted 06/09/2015 02:00 AM   