[quote="it_"]I found simply solution to launch game in 720p. Need to set desktop resolution in Windows 1280x720. No any tweak needed.[/quote]
Does The Witcher 3 run in 720p with 3D TV Play for you? For me it only works when I set the game to borderless fullscreen; in true fullscreen the resolution jumps back to 1280x768. The game does switch to 3D then, but I only get a 2D image. Can you confirm this, or is it a problem with the Alpha 2 fix?
[quote="helifax"]@bo3b: Agreed!
Also, you can always revisit a fix later on and improve it;)) (Like I did with Wolfie: The Old Order and others). But as long as 90% is working properly for the rest you can turn a blind eye (in this case literally ^_^).
Can you check if the shaders are compiled at runtime? I think this would help us very much in this case as we should have access to the full source code of the original shaders thus we can see exactly what is what;))[/quote]
That was a particularly good idea, so I made a test branch with compiler_47. It's hooking and logging, and within the constraints of a short test, it unfortunately doesn't show any shader compilation, only Blob creation.
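For anyone wanting to run the same check themselves, here is a minimal sketch of the kind of pass-through logging hook such a test could use (illustrative only, not the actual test branch): a proxy d3dcompiler_47.dll that forwards D3DCompile to the real system DLL and logs every call. If nothing is ever logged while the game still creates shader blobs, the shaders ship pre-compiled.
[code]// Illustrative sketch only - not the actual test branch. Built as a proxy
// d3dcompiler_47.dll whose .def file exports this function as "D3DCompile",
// forwarding each call to the real system DLL after logging it.
#include <windows.h>
#include <d3dcompiler.h>
#include <cstdio>

typedef HRESULT (WINAPI *PFN_D3DCompile)(LPCVOID, SIZE_T, LPCSTR,
    const D3D_SHADER_MACRO*, ID3DInclude*, LPCSTR, LPCSTR, UINT, UINT,
    ID3DBlob**, ID3DBlob**);

static PFN_D3DCompile GetRealCompile()
{
    static PFN_D3DCompile real = nullptr;
    if (!real) {
        HMODULE h = LoadLibraryA("C:\\Windows\\System32\\d3dcompiler_47.dll");
        real = (PFN_D3DCompile)GetProcAddress(h, "D3DCompile");
    }
    return real;
}

// If this never logs anything while blobs are still being created,
// no runtime shader compilation is happening.
extern "C" HRESULT WINAPI Hooked_D3DCompile(LPCVOID src, SIZE_T len,
    LPCSTR name, const D3D_SHADER_MACRO *defines, ID3DInclude *include,
    LPCSTR entry, LPCSTR target, UINT flags1, UINT flags2,
    ID3DBlob **code, ID3DBlob **errors)
{
    fprintf(stderr, "D3DCompile: source=%s entry=%s target=%s\n",
            name ? name : "<null>", entry ? entry : "<null>",
            target ? target : "<null>");
    return GetRealCompile()(src, len, name, defines, include, entry,
                            target, flags1, flags2, code, errors);
}[/code]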
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
[quote="bo3b"][quote="helifax"]@bo3b: Agreed!
Also, you can always revisit a fix later on and improve it;)) (Like I did with Wolfie: The Old Order and others). But as long as 90% is working properly for the rest you can turn a blind eye (in this case literally ^_^).
Can you check if the shaders are compiled at runtime? I think this would help us very much in this case as we should have access to the full source code of the original shaders thus we can see exactly what is what;))[/quote]
That was a particularly good idea, so I made a test branch with compiler_47. It's hooking and logging, and within the constraints of a short test, it unfortunately doesn't show any shader compilation, only Blob creation. [/quote]
Crap :( I wish it were that easy :P :)) But it was a good attempt, and at least now we know the game ships with pre-compiled shader binaries :( Any idea how we might get the missing information for that shader?
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
One feature I've been thinking about adding to 3Dmigoto for a while is tracking constant buffer usage, like we already do for shaders, textures, render targets and depth targets (turn on dump_usage, dump something then look at the ShaderUsage.txt that is produced).
That would allow us to see which constant buffers are shared between shaders, so if one shader has headers for a given constant buffer we would be able to use that to fill in the missing information in other shaders.
What do you think?
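To make the idea concrete, here is a minimal sketch of the bookkeeping such a feature could use; the class and method names here are hypothetical, not 3Dmigoto's actual code. The draw-call path records which shaders bind each constant buffer hash, and the sharing set can then be consulted when filling in missing headers.
[code]// Illustrative sketch only - not 3Dmigoto's actual code or naming.
#include <cstdint>
#include <map>
#include <set>

class CBUsageTracker {
    // constant buffer hash -> set of shader hashes seen binding it
    std::map<uint64_t, std::set<uint64_t>> usage;

public:
    // Called from the draw-call path whenever shaderHash binds cbHash.
    void RecordBinding(uint64_t cbHash, uint64_t shaderHash)
    {
        usage[cbHash].insert(shaderHash);
    }

    // All shaders sharing this buffer: if any one of them still has its
    // headers, those could fill in the missing names/types for the others.
    std::set<uint64_t> SharedWith(uint64_t cbHash) const
    {
        auto it = usage.find(cbHash);
        return it != usage.end() ? it->second : std::set<uint64_t>{};
    }
};[/code]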
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
Regarding ints vs. floats - I don't know if that is the issue with this shadow shader or not, but it concerns me that we are fundamentally handling these wrong. In most cases it probably won't matter in practice, since for it to matter the number needs to be greater than pow(2,24) with low-order bits that contribute significantly to the output, but it is something to keep in mind at the very least.
fxc definitely does not produce the same code when using asint() and other reinterprets as it does when casting, but I'm also not certain it is doing the right thing either. Explicitly using an int type is probably the more natural approach for HLSL, but may be difficult in practice, as registers may be reused with different types in non-trivial ways.
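To make the pow(2,24) threshold concrete, here is a small standalone check. The number comes from float's 24-bit significand: integers above it can no longer all be represented exactly.
[code]// Demonstrates why the concern only bites above 2^24: floats have a
// 24-bit significand, so larger integers can lose their low-order bits.
#include <cstdint>
#include <cstdio>

int main()
{
    int32_t ok  = 1 << 24;        // 16777216: exactly representable
    int32_t bad = (1 << 24) + 1;  // 16777217: rounds to 16777216 as a float

    printf("%d -> %d\n", ok,  (int32_t)(float)ok);   // 16777216 -> 16777216
    printf("%d -> %d\n", bad, (int32_t)(float)bad);  // 16777217 -> 16777216
    return 0;
}[/code]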
[quote="steffen00"]Hello,
first many thanks for the work.
I played it some hours.
I just noticed that the shadows are a little bit to dark outside after enable the fixes.[/quote]
Hi - yeah I caught this and fixed it. Next release will sort it out :-)
I need some assistance: I've got every file placed correctly, but I have white fog all around me in game and everything else is messy too. Ctrl+Alt+F11 does nothing and Ctrl+T just freezes the picture. Is there anything else I need to install except for the unzipped "Flugan Witcher 3" files?
\GOG Games\The Witcher 3 Wild Hunt\bin\ShaderFixes
\GOG Games\The Witcher 3 Wild Hunt\bin\x64\ShaderFixes
\GOG Games\The Witcher 3 Wild Hunt\bin\x64\d3dx.ini
\GOG Games\The Witcher 3 Wild Hunt\bin\x64\D3D11.dll
Edit: okay nvm, works great after pc reboot, thanks.
[quote="DarkStarSword"]Regarding the ints vs. floats - I don't know if that is the issue with this shadow shader or not, but it concerns me that we are fundamentally handling these wrong. Probably in most cases it won't matter in practice, since for it to matter it needs a number that is greater than pow(2,24) and has low order bits that have a significant contribution to the output, but it is something to keep in mind at the very least.
fxc definitely does not produce the same code when using asint() and other reinterprets as it does when casting, but I'm also not certain if it is doing the right thing either. Explicitly using an int type is probably the more natural approach for HLSL, but may be difficult in practice as registers may be reused with different types in non-trivial ways.[/quote]
Are we talking about the same thing?
[url]https://msdn.microsoft.com/en-us/library/windows/desktop/bb509631%28v=vs.85%29.aspx#Binary_Casts[/url]
That documentation says it's a cast operation. Every experiment I've made using asint() gives me the same fxc output as a normal cast. It's easy enough to change the output to use asint(), but I haven't seen it make a difference.
"Binary casts may also be performed using Intrinsic Functions (DirectX HLSL), which reinterpret the bit representation of a number into the target data type."
"Binary casts may also be performed using Intrinsic Functions (DirectX HLSL), which reinterpret the bit representation of a number into the target data type."
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
Small example:
[code]r0.w = (int)0 < (int)cb13[35].x;[/code]
results in assembly that includes a float-to-int conversion:
[code]ftoi r0.z, cb13[35].x
ilt r0.w, l(0), r0.z[/code]
Changing it to use asint():
[code]r0.w = (int)0 < asint(cb13[35].x);[/code]
does not have the conversion in the resulting assembly:
[code]ilt r0.z, l(0), cb13[35].x[/code]
When I die in Witcher 3 and try to reload, the game always crashes, like everyone else here reports.
However, when I die I now go back to the Main Menu and then load the save I want, never choosing Continue. This way I can get back into the game quickly with no crash. If I choose anything other than the Main Menu, the game always crashes.
Hope this helps other people.
Ah Ha!
So it turns out that my "small example" was exactly the problem with the shadows in caves that Helifax reported. Since we are missing the headers, we assume everything in the constant buffers is a float, but that is not true: we end up treating the bit pattern of an integer as a float, then casting that (garbage) value to an integer, leading the shader to make the wrong decisions.
Here's the change that fixes it by adding asint() around affected loads:
https://github.com/bo3b/3Dmigoto/commit/50536be419b55e8f7ed1411a170cc932b2485c9f
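The failure mode is easy to reproduce outside a shader. A minimal sketch of what the headerless decompilation effectively did; the stored value is made up for illustration:
[code]// The bug in miniature: the game stores an integer in the constant
// buffer, the headerless decompilation reads it as a float and then
// value-casts it, instead of reinterpreting the bits.
#include <bit>      // std::bit_cast, C++20
#include <cstdint>
#include <cstdio>

int main()
{
    int32_t original = 3;                        // what the game wrote into the cb
    float reg = std::bit_cast<float>(original);  // register read as float: ~4.2e-45 (a denormal)

    int32_t wrong = (int32_t)reg;                // "(int)cb13[35].x" -> 0, garbage
    int32_t right = std::bit_cast<int32_t>(reg); // "asint(cb13[35].x)" -> 3, correct

    printf("cast: %d  asint-style: %d\n", wrong, right);
    return 0;
}[/code]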
[quote="amourfouu"]When I die in Witcher 3, if I try to reload the game always crashes like everyone else here.
However, when I die, I now select to go back to the Main Menu, then I load the save I want, never choose Continue. This way I can quickly get back into the game with no crash. If I choose anything other than the Main Menu I always crash the game.
Hope this helps other people.[/quote]
Yeah, I found that too, but I wasn't sure. If I don't choose the first option the game takes significantly longer to load, but in that case the video initializes properly and no crash occurs.
ASUS Prime z370-A, i5 8600K, ASUS GTX 1080 Ti Strix, 16 GB DDR4, Corsair AX860i PSU, ASUS VG278HR, 3D Vision 2, Sound Blaster Z, Astro A50 Headphones, Windows 10 64-bit Home
[quote="Seregin"][quote="amourfouu"]When I die in Witcher 3, if I try to reload the game always crashes like everyone else here.
However, when I die, I now select to go back to the Main Menu, then I load the save I want, never choose Continue. This way I can quickly get back into the game with no crash. If I choose anything other than the Main Menu I always crash the game.
Hope this helps other people.[/quote]
Yeah, I found that too, however, wasn't sure. If I not choosing first option, game takes significantly longer to load, however, in this case video initialized properly and no crash occur.[/quote]
Thanks for sharing. This is one of my favorite games ever so I am cutting it some slack, but I was becoming frustrated with the crashing once every three deaths on load. I am playing on Death March difficulty, so it would crash every half hour or so! I'll give this a try.