You can fire off the custom shader from multiple regular shaders as needed to get everything. I'm not sure what we can do to simplify that - there's currently no way to fire one off on every draw call without listing all the relevant shaders, and even if we added that it would mean it was being run for things unrelated to the scene that you don't want (shadow maps, HUD, reflections, etc), so I see little value in adding that without some additional logic for 3DMigoto to determine whether or not the custom shader should run. Maybe if I ever get around to adding the conditional logic I've been talking about to the command lists we could find something, but for now just identifying all the relevant vertex shaders will be easier (hint: use the frame analysis log and/or render target dumps).
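For reference, the d3dx.ini wiring I mean looks something like this (the section names and hashes are placeholders - substitute the ones you find via frame analysis):
[code]; Run the same custom shader from each relevant vertex shader:
[ShaderOverrideSceneGeometry1]
hash = 0123456789abcdef
run = CustomShaderMyEffect

[ShaderOverrideSceneGeometry2]
hash = fedcba9876543210
run = CustomShaderMyEffect

[CustomShaderMyEffect]
vs = ShaderFixes/my_effect_vs.hlsl
ps = ShaderFixes/my_effect_ps.hlsl
; Re-issue the original draw call with our shaders bound:
draw = from_caller[/code]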
Performance might also be a bit of a concern - this adds state backups, changes and restores, plus additional draw calls, to every draw call, and that would be tricky to optimise (a game developer might minimise the state changes by rendering all the geometry normally in one pass and the back side of all geometry in a second, rather than switching state on every draw call, but that is not really an option for us).
What are you using this for? Is there a particular effect that needs this?
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
Nothing crucial, just loose thoughts. With access to a thickness buffer, combined with a binary search, you can optimize SSR significantly. Even with a high stride you can get very satisfying results.
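Roughly what I mean, as an HLSL sketch (the texture bindings are assumptions, and depthTex is assumed to already hold linear depth - none of this is anything 3DMigoto provides): march with a big stride until the ray crosses the depth buffer, then binary-search within the last stride, using the thickness buffer to reject false hits behind thin geometry:
[code]Texture2D<float> depthTex     : register(t110); // linear scene depth (assumed)
Texture2D<float> thicknessTex : register(t111); // per-pixel thickness (assumed)
SamplerState     samp         : register(s0);

// Refine a coarse SSR hit by binary search. pos = first sample found
// past the surface (uv in xy, linear depth in z); step = last coarse stride.
float3 RefineHit(float3 pos, float3 step)
{
    [unroll]
    for (int i = 0; i < 6; i++) {
        step *= 0.5;
        float sceneZ = depthTex.SampleLevel(samp, pos.xy, 0);
        float thick  = thicknessTex.SampleLevel(samp, pos.xy, 0);
        if (pos.z > sceneZ && pos.z < sceneZ + thick)
            pos -= step; // inside the surface slab: back up
        else
            pos += step; // in front of it (or behind a thin object): advance
    }
    return pos;
}[/code]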
As I don't have much experience with 3DMigoto, I stumbled across a strange issue when I tried to modify this fix for TESO: http://helixmod.blogspot.de/2014/05/the-elder-scrolls-online-3d-vision-fix.html. As there are some issues with specular glow, I want to offer an option to reduce the intensity of this effect to different degrees. To do this I want to assign different values to a variable via a hotkey and use this variable in the modified shader - a simple task which has worked in other (hot)fixes I have made so far:
I declare the key and the variable in d3dx.ini:
[code]
[Constants]
;Specular Glow
x = 0.7
[Key3]
Key = u
x = 0.7, 0.5, 0.3, 0, 1.0
type = cycle[/code]
and in the shaders I include this code:
[code]// Load IniParams slot 0 (x/y/z/w as set from the d3dx.ini)
float4 iniParam = IniParams.Load(0);
// Scale the specular glow by the hotkey-controlled value
r5.w = iniParam.x * r5.w;[/code]
This code works in my modified TESO fix - but ONLY with variable x! When I use variable z instead (which is not used in the original fix) with the same code (z = 0.7, 0.5, ...; iniParam.z), the hotkey doesn't work. Only when the game is running and I enable hunting mode and reload the fix does the hotkey suddenly start working with z. Besides being inconvenient, the workaround of reloading the fix also breaks the shadows.
So my question: Why does only x work for the variable? Do I have to declare additional parameters for the hotkey and variable I want to use?
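For clarity, this is the z variant I'm describing - the same declarations as above with z substituted for x:
[code][Constants]
;Specular Glow
z = 0.7

[Key3]
Key = u
z = 0.7, 0.5, 0.3, 0, 1.0
type = cycle[/code]
and in the shader:
[code]float4 iniParam = IniParams.Load(0);
r5.w = iniParam.z * r5.w;[/code]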
My original display name is 3d4dd - for some reason Nvidia changed it..?!
In the face of your expertise the issue got scared and fled ;) The same code that didn't work 2 days ago works perfectly now. Obviously TESO is trying to make a fool out of me: first loading the wrong version of the fix, then using wrong shadow settings, and now this... Nevertheless, thank you for trying to help me :)
I would like to come back to the issue of the resource not clearing with the custom shader when MSAA is enabled. Is there something I could do to overcome it, or would it require changes to the 3DMigoto code?
Hello, I would like to know whether it's a known issue that using CustomShader3DVision2SBS causes an FPS drop, e.g. from 50 to 40 FPS in The Witcher 3 at 1440p on SLI GTX 1080s.
Maybe I'm missing some parameter?
Also, when using CustomShader3DVision2SBS in Fallout 4, SLI doesn't kick in at 1440p, only at 1080p.
I'm using this parameter to play on my projector; in regular 3D Vision mode (PC monitor) SLI works fine.
Thanks in advance for your reply.
@kalhohan not surprising - the SBS shader (or more specifically the reverse stereo blit it depends on) requires a hell of a lot of bandwidth over the SLI bridge, which becomes a problem at anything above 1920x1080. The only way to overcome this is to downscale the image on each GPU first, which is possible by adding another custom shader before it to scale to either 50% width or 50% height depending on which mode is in use, then do the reverse blit on that downscaled image instead of the back buffer. At some point I want to add some conditional logic to 3DMigoto so it can select the correct downscaling option automatically, but that's not there yet.
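If anyone wants to experiment in the meantime, the downscale pass itself is simple enough. A rough sketch of the half-width variant for side-by-side (untested - the t100 binding is a placeholder, and the d3dx.ini plumbing to bind the back buffer and run this before the SBS shader is up to you):
[code]// Hypothetical half-width downscale: each output pixel averages two
// horizontally adjacent back buffer pixels, halving the amount of data
// the reverse stereo blit has to push over the SLI bridge.
Texture2D<float4> backBuffer : register(t100);

float4 main(float4 pos : SV_Position) : SV_Target
{
    int2 src = int2(pos.xy) * int2(2, 1); // map output pixel to source pair
    return (backBuffer.Load(int3(src, 0)) +
            backBuffer.Load(int3(src + int2(1, 0), 0))) * 0.5;
}[/code]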
@oomek - which issue is this?
[center][color="orange"][size="XL"]3Dmigoto 1.2.64[/size][/color][/center][center][color="green"]https://github.com/bo3b/3Dmigoto/releases[/color][/center]
[.]Entirely new ini parser, which is approximately 200x faster than the old one. The upcoming DX11 Dreamfall Chapters mod was taking close to two minutes to parse the d3dx.ini with the old parser, and now takes under one second with the new parser.[/.]
[.]#include now works properly in custom shaders and will search the same directory as the custom shader for any included shaders (e.g. you should use #include "hud.hlsl" instead of #include "ShaderFixes/hud.hlsl" - see the snippet after this list). Regular shaders already had this fix.[/.]
[.]Log messages for [re]creating resource caches and substantiating resources are clearer, including the line from the d3dx.ini which triggered them[/.]
[.]Some other misc bug fixes / logging issues in the ini parsing and presets[/.]
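A quick illustration of the #include change (file names are just examples):
[code]// ShaderFixes/hud_adjust_ps.hlsl
#include "hud.hlsl"                // now resolves to ShaderFixes/hud.hlsl
// #include "ShaderFixes/hud.hlsl" // old workaround, no longer needed[/code]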
I'm releasing this now because I judge the new ini parser as "slightly risky but should probably be fine", and I also judge some of the upcoming changes as "just a tad more risky". If something does go wrong I'd like a release between these two that we can use to test with so we can easily work out which one is the culprit.
[quote="DarkStarSword"]@kalhohan .... At some point I want to add some conditional logic to 3DMigoto so it can select the correct downscaling option automatically, but that's not there yet.
[/quote]
My hero ;), I'll be patient and enjoy my aliasing for now on the projector :D
And I'll try to find a way for Fallout 4. Maybe an SLI bit. I'll search the forums.
Thanks.
If I can add more to my initial statement: the Fallout 4 engine is just nonsense. After trying on my 3D Vision 2 monitor, SLI does kick in at 1440p, with corresponding FPS and correct scaling.
The SBS shader is activated, and F11 does work.
But as soon as I try on my projector through HDMI, which displays 1080p60 (and 1440p60 via a custom resolution), SLI doesn't scale at all at 1440p, and scales strangely at 1080p: it scales more when there is less to draw, 100% scaling when I look at the sky or ground, 10% when I look at a city/settlement.
Of the games I tried, it's the only one with this behavior. Witcher 3, NieR: Automata and RoTB scale OK on the projector.
So that's probably not coming from the shader or 3DMigoto. I'll maybe try downgrading the driver and see if I have better luck.
Also, on a side note, CustomShaderUpscale displays only a portion of the game. The game itself is projected correctly to 1080p (with the 1440p resolution), but the menus, HUD and quest markers are rendered at 1440p scale, so 1/3 is cut off - for example, the initial menu is missing 1/3 on the right and bottom.