Hello 3DMigoto-Team,
I don't really know if this is the right thread for it (if not, please excuse me :-)),
but I have an idea (or feature request) for 3DMigoto. Nowadays there are many passive 3D TVs with 4K resolution out there, and as you surely know it is possible with some EDID "hacking" to force the "Optimized For 3D Vision" mode. The problem is that the interlaced mode only works at the TV's native resolution (4K, or 2x full HD with 3D on), which can cause extreme performance drops in many games. My suggestion (if it is possible at all and you guys have time for it) is to add something like SSAA with a factor lower than 1x. A similar technique is used in GTA V, for example: one can set a 4K resolution in game and then choose a 0.5x render scale. It gives noticeably better quality than side-by-side or top-and-bottom, without the performance issues.
Another possibility would be simple upscaling to a user-defined resolution (maybe via a custom render target, just like 3DMigoto already does with the new CustomShader technique).
P.S. Thank you guys for all the work you do for the community.
[quote="Oomek"]Bug?
3DM probably skips the shaders with warnings. I was unable to dump it by cycling through vertex shaders.[/quote]Hmm, that's odd - warnings shouldn't cause any issues. Are there any issues in the version you got from ShaderCache, like missing instructions or declarations?
[quote="ColAngel"]Hello 3DMigoto-Team,
i dont really know if this the right thread for it (if not then please excuse me :-))[/quote]Can you add this as a feature request here: https://github.com/bo3b/3Dmigoto/issues
[quote]but i have an idea (or feature request for the 3dmigoto). The thing is nowdays there are many passive 3d tvs with 4k resolution out there and as you surely know it is possible with some EDID "hacking" to force "Optimized For 3D Vision" mode . The problem is the interlacing mode works only with native tv resolution (4k or 2xHD with 3D on). This can cause exteme performance drops on many games. My suggetion is (of cause if it is possible at all and you guys have time for it) to add something like SSAA with factor lesser than x1. Similar thechnique is used in GTA V for example. One can set 4K resolution in game then choose factor x0.5. It has noticeably better quality than side-by-side or top-bottom without performance issues.
Another possibility is to add something like simple upscaling to a user defined resolution. (Maybe custom RT, just like 3DMigoto already does with the new CustomShader technique)[/quote]I think I understand what you are after. It's theoretically doable, but it will need some more support in 3DMigoto to make it work - we would need to give the game a resource to use instead of the back buffer that is the resolution the game expects, then upscale that to the real back buffer on the present call. Plus we may need to lie to the game anywhere it queries the resolution, though we do already have support to force a given resolution.
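To give a rough idea of the present-time half of that, here's a minimal sketch - none of these names are real 3DMigoto symbols, it assumes the wrapper already has the game rendering into a lower-resolution intermediate texture, and a real hook would also have to save and restore the game's pipeline state around the draw:
[code]
// Sketch only: stretch a lower-resolution intermediate texture into the real
// swap chain back buffer just before Present. Placeholder names throughout.
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstring>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "d3dcompiler.lib")

static const char *kUpscaleHLSL = R"(
Texture2D    gSrc  : register(t0);
SamplerState gSamp : register(s0);

// Fullscreen triangle generated from SV_VertexID - no vertex buffer needed.
void VS(uint id : SV_VertexID, out float4 pos : SV_Position, out float2 uv : TEXCOORD0)
{
    uv  = float2((id << 1) & 2, id & 2);
    pos = float4(uv * float2(2, -2) + float2(-1, 1), 0, 1);
}

float4 PS(float4 pos : SV_Position, float2 uv : TEXCOORD0) : SV_Target
{
    return gSrc.Sample(gSamp, uv); // bilinear stretch
}
)";

struct Upscaler {
    ID3D11VertexShader *vs = nullptr;
    ID3D11PixelShader  *ps = nullptr;
    ID3D11SamplerState *samp = nullptr;

    bool Init(ID3D11Device *dev)
    {
        ID3DBlob *vsb = nullptr, *psb = nullptr;
        if (FAILED(D3DCompile(kUpscaleHLSL, strlen(kUpscaleHLSL), nullptr, nullptr,
                              nullptr, "VS", "vs_5_0", 0, 0, &vsb, nullptr)) ||
            FAILED(D3DCompile(kUpscaleHLSL, strlen(kUpscaleHLSL), nullptr, nullptr,
                              nullptr, "PS", "ps_5_0", 0, 0, &psb, nullptr)))
            return false;
        dev->CreateVertexShader(vsb->GetBufferPointer(), vsb->GetBufferSize(), nullptr, &vs);
        dev->CreatePixelShader(psb->GetBufferPointer(), psb->GetBufferSize(), nullptr, &ps);
        vsb->Release(); psb->Release();

        D3D11_SAMPLER_DESC sd = {};
        sd.Filter = D3D11_FILTER_MIN_MAG_MIP_LINEAR;
        sd.AddressU = sd.AddressV = sd.AddressW = D3D11_TEXTURE_ADDRESS_CLAMP;
        return SUCCEEDED(dev->CreateSamplerState(&sd, &samp));
    }

    // Called from the Present hook. fakeSRV views the intermediate texture the
    // game has been rendering into; realRTV is the true back buffer. A real
    // implementation must save/restore the game's render state around this.
    void Stretch(ID3D11DeviceContext *ctx, ID3D11ShaderResourceView *fakeSRV,
                 ID3D11RenderTargetView *realRTV, UINT realW, UINT realH)
    {
        D3D11_VIEWPORT vp = { 0.0f, 0.0f, (float)realW, (float)realH, 0.0f, 1.0f };
        ctx->OMSetRenderTargets(1, &realRTV, nullptr);
        ctx->RSSetViewports(1, &vp);
        ctx->IASetInputLayout(nullptr);
        ctx->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
        ctx->VSSetShader(vs, nullptr, 0);
        ctx->GSSetShader(nullptr, nullptr, 0);
        ctx->PSSetShader(ps, nullptr, 0);
        ctx->PSSetShaderResources(0, 1, &fakeSRV);
        ctx->PSSetSamplers(0, 1, &samp);
        ctx->Draw(3, 0);
    }
};
[/code]
The other half - handing the game the intermediate texture in place of the real back buffer and lying about the resolution wherever it asks - is the part that needs the new plumbing.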
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
[quote="DarkStarSword"][quote="Oomek"]Bug?
3DM probably skips the shaders with warnings. I was unable to dump it by cycling through vertex shaders.[/quote]Hmm, that's odd - warnings shouldn't cause any issues. Are there any issues in the version you got from ShaderCache, like missing instructions or declarations?[/quote]
No, there is nothing missing, just the usual o0-related warnings. What's more, some shaders become unavailable when cycling after my edits. So far it's random for me, as I haven't yet found any pattern to the occurrence of that problem.
OK, I've occasionally observed some behavior that makes me suspect we have a bug somewhere in the shader management code (things like 3DMigoto losing track of the original shader, or dumping out a shader with StereoParams declared twice), but it seems random and rare and I haven't managed to work out a pattern for it either. My hunch is that the bug is in an error path somewhere, but until I can reliably reproduce it I can't be sure.
[quote="DarkStarSword"]ok, I've occasionally observed some behavior that makes me suspect we have a bug somewhere in the shader management code (things like 3DMigoto losing track of the original shader, or dumping out a shader with StereoParams declared twice), but it seems random and rare and I haven't managed to work out a pattern for it either. My haunch is that the bug is in an error path somewhere, but until I can reliably reproduce it I can't be sure.[/quote]
I'll keep an eye on it and let you know if I find some consistent behaviour.
[quote="DarkStarSword"][quote="Oomek"]Bug?
3DM probably skips the shaders with warnings. I was unable to dump it by cycling through vertex shaders.[/quote]Hmm, that's odd - warnings shouldn't cause any issues. Are there any issues in the version you got from ShaderCache, like missing instructions or declarations?
[quote="ColAngel"]Hello 3DMigoto-Team,
I don't really know if this is the right thread for it (if not, please excuse me :-))[/quote]Can you add this as a feature request here: https://github.com/bo3b/3Dmigoto/issues
[quote]but I have an idea (or feature request) for 3DMigoto. Nowadays there are many passive 3D TVs with 4K resolution out there, and as you surely know it is possible with some EDID "hacking" to force the "Optimized For 3D Vision" mode. The problem is that the interlaced mode only works at the TV's native resolution (4K, or 2x full HD with 3D on), which can cause extreme performance drops in many games. My suggestion (if it is possible at all and you guys have time for it) is to add something like SSAA with a factor lower than 1x. A similar technique is used in GTA V, for example: one can set a 4K resolution in game and then choose a 0.5x render scale. It gives noticeably better quality than side-by-side or top-and-bottom, without the performance issues.
Another possibility would be simple upscaling to a user-defined resolution (maybe via a custom render target, just like 3DMigoto already does with the new CustomShader technique).[/quote]I think I understand what you are after. It's theoretically doable, but it will need some more support in 3DMigoto to make it work - we would need to give the game a resource to use instead of the back buffer, at the resolution the game expects, then upscale that to the real back buffer on the present call. Plus we may need to lie to the game anywhere it queries the resolution, though we do already have support to force a given resolution.
[/quote]
Thanks for the quick response. Yes, that is basically what I mean.
Using the 3Dmigoto fix for ROTTR caused a significant performance drop compared to CM or the official game patch. Installing the game on an SSD reduced the loading time for the thousands of fixed shaders but didn't improve the performance within the levels.
But I would like to use the 3Dmigoto fix instead of the other solutions, so I have some questions about 3Dmigoto and performance:
- does 3Dmigoto put more stress on the CPU or the GPU?
- is there a way to disable/enable 3Dmigoto while the game is running? This would let me switch temporarily to CM in situations where the game is no longer playable, without restarting the game.
Just to make sure: I'm not complaining that 3Dmigoto eats too much performance; I know it is the fault of my low-end hardware. I only want to find out what I could do to improve performance (OC, etc.) or what could be a workaround...
My original display name is 3d4dd - for some reason Nvidia changed it..?!
[quote="DarkStarSword"]ok, I've occasionally observed some behavior that makes me suspect we have a bug somewhere in the shader management code (things like 3DMigoto losing track of the original shader, or dumping out a shader with StereoParams declared twice), but it seems random and rare and I haven't managed to work out a pattern for it either. My haunch is that the bug is in an error path somewhere, but until I can reliably reproduce it I can't be sure.[/quote]
DSS i found a pattern for the StereoParams declared twice and other strange behavior.
- Go in to the game with a shader fixed
- While the game is running delete that shader and press F10 to see the effect back to original
- Don't quit the game and hunt again that shader and dump
- Now you should ear the boop sound, because the StereoParams is declared twice, also happens a few times that the shader dumped in the previous point is other different shader with the StereoParams declared twice.
Hope this can help
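For illustration only - this is not 3DMigoto's actual code, just a sketch of making the injection idempotent; the declarations follow the usual StereoParams/IniParams register convention (if I'm remembering the registers right), everything else is made up:
[code]
#include <string>

// Hypothetical idempotent injection: only prepend the stereo declarations if
// the decompiled source doesn't already contain them. Dumping a shader whose
// loaded replacement was already patched would otherwise end up with
// StereoParams declared twice, which is the compile error (and beep) above.
static void InjectStereoDecls(std::string &hlsl)
{
    static const char decls[] =
        "Texture1D<float4> StereoParams : register(t125);\n"
        "Texture1D<float4> IniParams : register(t120);\n";

    if (hlsl.find("StereoParams") == std::string::npos)
        hlsl.insert(0, decls);
}
[/code]
That only hides the symptom, of course - the real fix is making sure the dump starts from the original shader in the first place.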
I mentioned that once, but the issue is still there: 3DM is dumping cached shaders instead of the original ones (see the sketch after the steps below).
1. Dump a shader.
2. Modify it.
3. Press F10.
4. Delete the shader.
5. Press F10 again.
6. Dump the same shader (it will contain the modifications you made before).
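What I'd expect instead is something along these lines - a made-up sketch, nothing to do with 3DMigoto's real data structures: capture the original bytecode the game hands to the Create*Shader calls, keyed by hash, and always dump from that copy rather than from whatever is currently loaded from ShaderFixes or the cache:
[code]
#include <cstddef>
#include <cstdint>
#include <map>
#include <vector>

// Hypothetical bookkeeping: filled in the Create*Shader wrappers before any
// replacement is applied. emplace() keeps the first (original) copy even if
// the same hash is seen again later with modified bytecode.
static std::map<uint64_t, std::vector<uint8_t>> g_original_bytecode;

void RecordOriginal(uint64_t hash, const void *bytecode, size_t size)
{
    const uint8_t *p = static_cast<const uint8_t *>(bytecode);
    g_original_bytecode.emplace(hash, std::vector<uint8_t>(p, p + size));
}

// Dumping (marking a shader while hunting) starts from this, never from a cache.
const std::vector<uint8_t> *GetOriginalForDump(uint64_t hash)
{
    auto it = g_original_bytecode.find(hash);
    return it == g_original_bytecode.end() ? nullptr : &it->second;
}
[/code]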
[quote="3d4dd"]Using the 3Dmigoto fix for ROTTR caused a significant performance drop compared to CM or official game patch. Installing the game on SSD reduced the loading time for the thousands of fixed shaders but didn't improve the performance within the levels.
But I would like to use the 3Dmigoto fix instead of the other solutions. So I have some questions about 3Dmigoto and performance:
- does 3Dmigoto put more stress on CPU or GPU?
- is there a way to disable/enable 3Dmigoto while the game is running? This would allow me to switch temporarily to CM in situations when the game is not playable any more - without restarting the game.
Just to make sure: I'm not at all complaining that 3Dmigoto eats too much performance, etc. I know that it is the fault of my low level hardware. I only want to find out what I could do to improve the performance (OC, etc.) or what could be a workaround... [/quote]
Taking a quick look here- this is probably happening because the fix is using so many ASM shaders, and that section of code in 3Dmigoto is not complete. It looks to me like we are generating an _reasm.txt and a _reasm.bin file on every file that is loaded. And secondarily, we don't create a .bin cache for ASM shaders, nor do we read the _reasm.bin which is roughly the same thing.
So... Unless I've misread this, we are not only not caching the ASM results, we are regenerating them for every new shader seen. Which with this game using 1000s and 1000s is clearly a problem.
Unless DarkStarSword gets to this earlier, I'll likely fix this this weekend. (In our good friend: HackerDevice::ReplaceShader)
As a general idea, I've put in a lot of effort to minimize the 3Dmigoto impact, so it should only be using 0.8% of the CPU or thereabouts. So, any big impact on frame rates will very likely be a bug like this.
Edit: BTW, you can tell us if this is the problem, by enabling the logging with calls=1, and see if you get repeated "Assembling replacement code" messages.
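To make the caching idea concrete, the pattern would be roughly this - a sketch only, not the actual ReplaceShader code, and AssembleAsmText is a made-up stand-in for the bundled assembler:
[code]
#include <cstdint>
#include <filesystem>
#include <fstream>
#include <iterator>
#include <vector>

namespace fs = std::filesystem;

// Stand-in for whatever turns shader assembly text into DXBC bytecode
// (in 3Dmigoto that's the bundled assembler); the signature is made up.
std::vector<uint8_t> AssembleAsmText(const std::vector<char> &asm_text);

// Load the bytecode for an ASM replacement, using a .bin next to the .txt as
// a cache so the slow assembly step only runs when the .txt has changed.
std::vector<uint8_t> LoadAsmReplacement(const fs::path &txt_path)
{
    fs::path bin_path = txt_path;
    bin_path.replace_extension(".bin");

    // Cache hit: the .bin exists and is at least as new as the .txt.
    if (fs::exists(bin_path) &&
        fs::last_write_time(bin_path) >= fs::last_write_time(txt_path)) {
        std::ifstream f(bin_path, std::ios::binary);
        return std::vector<uint8_t>{std::istreambuf_iterator<char>(f),
                                    std::istreambuf_iterator<char>()};
    }

    // Cache miss: assemble once and write the .bin next to the source.
    std::ifstream f(txt_path, std::ios::binary);
    std::vector<char> text{std::istreambuf_iterator<char>(f),
                           std::istreambuf_iterator<char>()};
    std::vector<uint8_t> bytecode = AssembleAsmText(text);

    std::ofstream out(bin_path, std::ios::binary);
    out.write(reinterpret_cast<const char *>(bytecode.data()),
              (std::streamsize)bytecode.size());
    return bytecode;
}
[/code]
Keying the cache on the .txt timestamp means an edited fix still gets reassembled on the next run, while untouched fixes load straight from the .bin - which should get rid of the thousands of "Assembling replacement" hits.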
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
I have enabled logging and indeed, after loading a level, I got "Assembling replacement ASM code" 5000 times in the log.
But it is obvious that the performance drop with the 3Dmigoto fix compared to CM and the official patch is not caused by 3Dmigoto alone. It is also (and predominantly) due to using the Nvidia driver's 3D routines, whereas CM and the official patch are based on other methods. So even without 3Dmigoto, with only "real 3D" enabled, I get a significant performance hit. Nevertheless it would be nice if the performance of the fix could be improved, because on weak hardware like mine every single FPS can make the difference between playable and too laggy ;)
May I repeat my question: is there a way to disable or enable the 3Dmigoto fixes while the game is running (without a restart)?
[quote="3d4dd"]
May I repeat my question: is there a way to disable or enable the 3Dmigoto fixes while the game is running (without a restart)?[/quote]
You can press F9 while hunting.
[quote="Oomek"]
You can press F9 while hunting.[/quote]
Thank you! I will try that.
Update: I tried it, but it doesn't help. 3Dmigoto is still active and causes a performance drop compared to starting the game without 3Dmigoto. Besides that, I would have to hold F9 all the time, and the overlay is displayed.
Is there another way to disable/enable 3Dmigoto without restarting the game every time?
[quote="3d4dd"][quote="Oomek"]
You can press F9 while hunting.[/quote]Thank you! I will try that.
Update: I tried it, but it doesn't help. 3Dmigoto is still active and causes a performance drop compared to starting the game without 3Dmigoto. Besides that, I would have to hold F9 all the time, and the overlay is displayed.
Is there another way to disable/enable 3Dmigoto without restarting the game every time?[/quote]
I don't understand - why would you want to do that?