Bo3b, I have a very weird question;)
In this CS (pasted above)
https://pastebin.com/XJBshjip
If I simply add this:
[code]
dcl_temps 60
// 3DMigoto StereoParams:
dcl_resource_texture1d (float,float,float,float) t120
ld_indexable(texture1d)(float,float,float,float) r41.xyzw, l(0, 0, 0, 0), t120.xyzw
[/code]
The whole shader FREAKS out. All the tiles start to go wrong! r41 is free and never used! t120 is likewise never used! I even changed it from t120 to t210 and other values (both here and in the ini file) and it still doesn't work!
I am confused. It only does this weirdness in these tiled lighting shaders. In all other shaders it works perfectly fine!
Could this be a hint that something is going wrong here? Maybe the driver is doing something wrong? I have no idea how to debug this further...
Any ideas?
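For reference, the injected assembly above corresponds to roughly this HLSL (just a sketch; the t120 slot must match whatever the ini file assigns, and the component layout is as I understand 3DMigoto's StereoParams):
[code]
// Rough HLSL equivalent of the injected asm (sketch only):
Texture1D<float4> StereoParams : register(t120);  // slot per the ini file

float4 stereo = StereoParams.Load(0);
// stereo.x = separation, stereo.y = convergence,
// stereo.z = eye sign (flips between left and right render)
[/code]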
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
[quote="helifax"]Bo3b, I have a very weird question;)
In this CS (pasted above)
https://pastebin.com/XJBshjip
If I simply add this:
[code]
dcl_temps 60
// 3DMigoto StereoParams:
dcl_resource_texture1d (float,float,float,float) t120
ld_indexable(texture1d)(float,float,float,float) r41.xyzw, l(0, 0, 0, 0), t120.xyzw
[/code]
The whole shader FREAKS out. All the tiles start to go wrong! r41 is free and never used! t120 is likewise never used! I even changed it from t120 to t210 and other values (both here and in the ini file) and it still doesn't work!
I am confused. It only does this weirdness in these tiled lighting shaders. In all other shaders it works perfectly fine!
Could this be a hint that something is going wrong here? Maybe the driver is doing something wrong? I have no idea how to debug this further...
Any ideas?[/quote]
Most likely this is a problem with the Assembler. It's super fragile about code that is slightly off, and might have generated bad code.
Try running with Debug enabled in the log, and see if you get any errors when the shader is loaded. The Assembler has essentially no error checking either, but there are some consistency checks that wrap it.
dcl_temps 60 seems high. It looks like it starts at 11. I don't see anything wrong with the code there, however. Worth trying to narrow it down by removing it all, then adding it back one piece at a time.
I've seen the timing of the good-to-bad CS switch in ME:A, and that's really interesting. I'm going to take a look at the shaders in Catalyst.
I agree with you that it goes from fine to bad, but your internal clock speed is different from mine, as I see it happen in maybe 0.25 seconds. :->
Here's a snagged image right after it was good (screen depth, no fix or 3Dmigoto here.)
[img]http://sg.bo3b.net/mea/MassEffectAndromeda04_85.jps[/img]
After reading their SIGGRAPH papers on compute shaders, this definitely seems like a culling or cache problem that only happens because the eye location is different. Something in the code is trying to speed things up. If we can kill that, it will work.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
Hi Bo3b,
Thanks! I will try to do that! It definitely looks like a caching / internal-mechanism problem!
And here is why I also believe this:
Dump the Shader that I posted above.
Load the game.
Now comment the last "store_structured u1.x, r0.w, l(0), r1.x" instruction.
Build the shader!
You will see the effect only in one EYE! The other eye is not affected!
However, if you disable (via console) all the lights and re-enable them, the effect becomes persistent in both eyes! Which tells me that the left eye runs from some sort of cache or something else!
Awesome find! Hope we can find it and KILL it! ^_^
[quote="helifax"]Hi Bo3b,
Thanks! I will try to do that! It definitely looks like a caching / internal-mechanism problem!
And here is why I also believe this:
Dump the Shader that I posted above.
Load the game.
Now comment the last "store_structured u1.x, r0.w, l(0), r1.x" instruction.
Build the shader!
You will see the effect only in one EYE! The other eye is not affected!
However, if you Disable (via console) all the lights and re-enable them back, the effect will be persistent in both eyes! Which tells me that left eye runs from some sort of cache or something else!
Awesome find! Hope we can find it and KILL it! ^_^[/quote]
I'll give you a 50 € donation if you can make it :-)
Like my work? Donations can be made via PayPal to: rauti@inetmx.de
Hey guys, I'm working on adding native 3D Vision support in Unity using 3DMigoto. I'm not hacking any shaders or using any of the 3D Vision Automatic code, but I am using the StereoParams texture to detect which eye is being rendered. In Unity, in a shader, I'm loading the StereoParams via:
[code]
Texture2D<float4> StereoParams : register(t121);
[/code]
then:
[code]
float4 stereo = StereoParams.Load(0);
if(stereo.z>0) col = tex2D(leftTex, i.uv); //render left eye tex
else col = tex2D(rightTex, i.uv); // render right eye tex
[/code]
This usually works, but I've noticed that different versions of Unity sometimes use up those texture registers, which breaks things. Is there a better way to figure out which register to use than just trial and error, or a better way to load the stereo params?
Like my work? You can send a donation via Paypal to sgs.rules@gmail.com
Windows 7 Pro 64x - Nvidia Driver 398.82 - EVGA 980Ti SC - Optoma HD26 with Edid override - 3D Vision 2 - i7-8700K CPU at 5.0Ghz - ASROCK Z370 Ext 4 Motherboard - 32 GB RAM Corsair Vengeance - 512 GB Samsung SSD 850 Pro - Creative Sound Blaster Z
[quote="sgsrules"]Hey guys, I'm working on adding native 3D Vision support in Unity using 3DMigoto. I'm not hacking any shaders or using any of the 3D Vision Automatic code, but I am using the StereoParams texture to detect which eye is being rendered. In Unity, in a shader, I'm loading the StereoParams via:
[code]
Texture2D<float4> StereoParams : register(t121);
[/code]
then:
[code]
float4 stereo = StereoParams.Load(0);
if(stereo.z>0) col = tex2D(leftTex, i.uv); //render left eye tex
else col = tex2D(rightTex, i.uv); // render right eye tex
[/code]
This usually works, but I've noticed that different versions of Unity sometimes use up those texture registers, which breaks things. Is there a better way to figure out which register to use than just trial and error, or a better way to load the stereo params?[/quote]
You can specify the tex slot to use in 3DMigoto's ini file. Normally it's t120, I believe.
You can set it to t210, for example. Then all the HLSL or ASM code needs to use this new texture ;)
I can't remember the exact key off the top of my head, but if you look in the ini file you will spot it easily ;)
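If memory serves, the entry looks something like the sketch below. The section and key names here are from memory and are an assumption, so check your own d3dx.ini for the exact spelling:
[code]
; d3dx.ini -- sketch from memory; verify the exact section/key in your copy
[Rendering]
stereo_params = 210   ; moves the StereoParams texture from its default slot to t210
[/code]
After changing it, every shader (HLSL or ASM) that reads StereoParams must declare the matching register, e.g. t210 instead of t120.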
Another question: Is it possible to disable 3D Vision Automatic in 3DMigoto? I'm rendering the scene twice with an offset view matrix and an off-center projection matrix, so I don't need any of the 3D Vision Automatic stuff. I'm only using 3DMigoto to create the stereo texture with the necessary 3D Vision stereo header.
[quote="sgsrules"]Another question: Is it possible to disable 3d vision automatic in 3dmigoto? I'm rendering the scene twice with a offset view matrix and a and a off centerprojection matrix so i don't need any of the 3dvision automatic stuff. I'm only using 3dmigoto to create the stereo texture with the necessary 3d vision stereo header.[/quote]
No. You can use the Nvidia profiles, though: in the profile for the game, make sure StereoTexture is set to 0. This will enable the 3D Vision hardware, but you need to handle the whole rendering pipeline in stereo yourself!
Other than that, there is no way to simply disable the automatic stereo injection 3D Vision does.
You can always try to manually un-correct what 3D Vision does per shader, though ;)
[quote="helifax"][quote="sgsrules"]Another question: Is it possible to disable 3D Vision Automatic in 3DMigoto? I'm rendering the scene twice with an offset view matrix and an off-center projection matrix, so I don't need any of the 3D Vision Automatic stuff. I'm only using 3DMigoto to create the stereo texture with the necessary 3D Vision stereo header.[/quote]
No,
you can use the Nvidia profiles. In the profile for the game, make sure StereoTexture is set to 0. This will enable 3D Vision hardware but you need to handle the whole rendering pipeline in stereo!
Other than this there is no way to simply disable the automatic stereo injection 3D Vision does.
You can always try to manually uncorrect what 3D Vision does per shader though;)[/quote]
So if I set the flag in the profile, I wouldn't need to use 3DMigoto at all?
Would I set the flag using Nvidia Inspector?
How do I force a Unity game to use exclusive fullscreen mode?
How would I detect which eye is being rendered?
The reason I'm asking all this is because I've come up with a way to inject a camera script and shader into Unity that does all the stereo rendering in Unity. It's a pretty simple process: it works in my custom app and also worked in the game INSIDE without having to modify any shaders. I can currently do an SBS render. The only pieces of the puzzle I'm missing are getting 3D Vision to kick in, getting exclusive fullscreen on, and detecting which eye is rendering in the shader. So I was going to use 3DMigoto to do all this, which would work except that it also modifies the shaders. If I could get this working, we could EASILY fix any Unity game without having to customize every shader.
Like my work? You can send a donation via Paypal to sgs.rules@gmail.com
Windows 7 Pro 64x - Nvidia Driver 398.82 - EVGA 980Ti SC - Optoma HD26 with Edid override - 3D Vision 2 - i7-8700K CPU at 5.0Ghz - ASROCK Z370 Ext 4 Motherboard - 32 GB RAM Corsair Vengeance - 512 GB Samsung SSD 850 Pro - Creative Sound Blaster Z
[quote="The_Nephilim"]Hey Guys,
I have a quick question.. I made a fix for Black Desert and 1 user who uses it has SLI now he says the fix is not working and he sees shadows still.. Now I was wondering would SLI cause that issue as it works fine for me and if so what can we do to fix SLI issues with Migoto??
Thank you
Nephilim[/quote]
Hey Guys,
Friendly bump on this issue, and an update: I now have 2 users on SLI having issues. So I would like to know, what do we do about SLI fixes with 3DMigoto?
[quote="The_Nephilim"][quote="The_Nephilim"]Hey Guys,
I have a quick question.. I made a fix for Black Desert and 1 user who uses it has SLI now he says the fix is not working and he sees shadows still.. Now I was wondering would SLI cause that issue as it works fine for me and if so what can we do to fix SLI issues with Migoto??
Thank you
Nephilim[/quote]
Hey Guys,
Friendly bump on this issue, and an update: I now have 2 users on SLI having issues. So I would like to know, what do we do about SLI fixes with 3DMigoto?[/quote]
There isn't anything to fix in 3DMigoto (not "Migoto") for SLI issues.
That is a pure SLI problem. Try experimenting and finding a different SLI profile/bits value that works with the fix and 3D Vision. (Do you think I'm crazy when I post different game profiles for 3D Vision fixes?!? ^_^)
OK, that covers that, but if someone disables SLI, shouldn't that rule out SLI being the issue? What is happening is that the fix for Black Desert works fine for me, but 2 other users say they still see shadows in-game.
Now, I had been having some issues, but I think I have them sorted out now; whenever I go in-game the fix seems fine for me, but I don't use SLI and those 2 users do.
So I was wondering: if they disabled SLI and still saw the shadows, what could be causing that, given we use the same fix from the HelixMod site?
[quote="The_Nephilim"]OK that fixes that but if someone disables SLI should that fix SLI Being the issue.. What is happening the fix for Black Desert works fine for me but 2 other users are saying they still see shadows ingame..
Now I been having some issues but I think I have the sorted out now but whenever I go ingame the fix seems to be fine for me but I don't use SLI and those 2 users do..
So I was wondering if they disabled SLI and still saw the shadows what can be causing that as we use the same fix from HelixMod site??[/quote]
If SLI is disabled and they still get wrong shadows:
Then it means the game generated different shaders with different CRCs on that machine... I know in "the old days" a lot of DX9 games used to do this (it has to do with runtime shader generation and compilation). If this is the case, I am not the person to ask ;)
Another "common" mistake is when you fix something with SOME IN-GAME SETTINGS and the other person uses DIFFERENT settings.
Ex:
You use the shadows HIGH setting.
They use the shadows LOW setting.
In this case you get different shaders ;) You need to find ALL OF THEM and fix them ;)
Tell them to use the EXACT SAME SETTINGS AS YOU and see if it works or still fails! (Debugging and testing is key, my friend ^_^)
Yes, we have been testing and debugging; I was just trying to find out what fix we can use for SLI issues.
The one user said he had the exact same settings I use and still got the shadows. Besides possibly different drivers, we had the same settings, yet he was getting shadows and I can't see why, other than maybe SLI causing weird issues; that is why I am here asking about SLI.
I will load the game up and try to go through all the shaders again on different settings.
My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com
(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)
OMG!! Frostbite 3 is teetering... :))) ... I encourage you guys ;)
i7 4970k@4.5Ghz, SLI GTX1080Ti Aorus Gigabyte Xtreme, 16GB G Skill 2400hrz, 3*PG258Q in 3D surround.
then:
This ussually works but I've noticed that different versions of unity sometimes use up those texture registers which breaks things. Is there a better way to figure out which register you use than just trial and error or is there a better way to load stereo params?
Like my work? You can send a donation via Paypal to sgs.rules@gmail.com
Windows 7 Pro 64x - Nvidia Driver 398.82 - EVGA 980Ti SC - Optoma HD26 with Edid override - 3D Vision 2 - i7-8700K CPU at 5.0Ghz - ASROCK Z370 Ext 4 Motherboard - 32 GB RAM Corsair Vengeance - 512 GB Samsung SSD 850 Pro - Creative Sound Blaster Z
You can specify the texture slot to use in the ini file of 3DMigoto. Normally it's t120, I believe.
You can set it to t210, for example. Then in all the HLSL or ASM code you need to use this new texture;)
I can't remember off the top of my head, but if you look in the ini file you will spot it easily;)
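For reference, the relevant d3dx.ini entries look roughly like this — the section and key names here are quoted from memory, so treat them as an assumption and double-check them against the comments in your own ini:

```ini
; d3dx.ini (sketch -- verify key names in your copy of 3DMigoto)
[Rendering]
; Texture slot the StereoParams Texture1D is bound to.
; If the game already uses this slot, move it (e.g. stereo_params=210)
; and update every shader that declares the texture to match.
stereo_params=125
; Slot for 3DMigoto's IniParams texture (often t120).
ini_params=120
```

Whatever slot you pick here has to match the `register(tNNN)` / `dcl_resource_texture1d ... tNNN` declarations in every fixed shader, otherwise the load reads garbage.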
No,
you can use the Nvidia profiles. In the profile for the game, make sure StereoTexture is set to 0. This will enable the 3D Vision hardware, but you need to handle the whole rendering pipeline in stereo yourself!
Other than this, there is no way to simply disable the automatic stereo injection that 3D Vision does.
You can always try to manually uncorrect what 3D Vision does, per shader, though;)
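As a sketch of what "manually uncorrecting" means: the usual per-shader stereo pattern in 3DMigoto HLSL looks like the following, and undoing the driver's injection is the same formula with the sign flipped. The register number and component layout are the common community convention, not taken from this particular game — verify both against your d3dx.ini:

```hlsl
// Common 3DMigoto convention: StereoParams bound as a 1D texture.
// .x = separation (eye-signed), .y = convergence -- assumption, check
// the slot number and layout against your d3dx.ini.
Texture1D<float4> StereoParams : register(t125);

float4 stereo_correct(float4 pos)   // pos = clip-space position
{
    float4 s = StereoParams.Load(0);
    // Standard stereo shift: X moves by separation * (depth - convergence)
    pos.x += s.x * (pos.w - s.y);
    return pos;
}

float4 stereo_uncorrect(float4 pos) // undo what the driver injected
{
    float4 s = StereoParams.Load(0);
    pos.x -= s.x * (pos.w - s.y);
    return pos;
}
```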
So if I set the flag in the profile, I wouldn't need to use 3DMigoto at all?
Would I set the flag using Nvidia Inspector?
How do I force a Unity game to use exclusive fullscreen mode?
How would I detect which eye is being rendered?
The reason I'm asking all this is because I've come up with a way to inject a camera script and shader into Unity that does all the stereo rendering in Unity. It's a pretty simple process: it works in my custom app and also worked in the game INSIDE without having to modify any shaders. I can currently do an SBS render. The only pieces of the puzzle I'm missing are getting 3D Vision to kick in, getting exclusive fullscreen to come on, and detecting which eye is rendering in the shader. So I was going to use 3DMigoto to do all this, which would work except that it also modifies the shaders. If I could get this to work, we could EASILY fix any Unity game without having to customize every shader.
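On the eye-detection question: with 3DMigoto's StereoParams texture, the separation value is commonly filled in with a per-eye sign, so something like the sketch below can work. The exact component layout and register slot are assumptions on my part — check the StereoParams comments in your d3dx.ini and shader headers:

```hlsl
Texture1D<float4> StereoParams : register(t125); // slot is configurable in d3dx.ini

// Returns -1 or +1 depending on which eye is currently rendering,
// assuming separation (.x) flips sign per eye; returns 0 if stereo
// is inactive (separation forced to zero).
float which_eye()
{
    float4 s = StereoParams.Load(0);
    return sign(s.x);
}
```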
Hey Guys,
Friendly bump on this issue, and an update. I now have 2 users on SLI who are having issues. So I would like to know: what do we do about fixing SLI with 3DMigoto??
Intel i5 7600K @ 4.8ghz / MSI Z270 SLI / Asus 1080GTX - 416.16 / Optoma HD142x Projector / 1 4'x10' Curved Screen PVC / TrackIR / HOTAS Cougar / Cougar MFD's / Track IR / NVidia 3D Vision / Win 10 64bit
There isn't anything to fix in 3DMigoto (not "Migoto") for SLI issues.
That is a pure SLI problem. Try experimenting and finding a different SLI profile/bits value that works with the fix and 3D Vision (do you think I'm crazy when I post different game profiles for 3D Vision fixes?!? ^_^).
Now I been having some issues but I think I have the sorted out now but whenever I go ingame the fix seems to be fine for me but I don't use SLI and those 2 users do..
So I was wondering if they disabled SLI and still saw the shadows what can be causing that as we use the same fix from HelixMod site??
If SLI is disabled and they still get wrong shadows:
Then it means the game generated different shaders, with different CRCs, on that machine... I know that in "the old days" a lot of DX9 games used to do this (it has to do with runtime shader generation and compilation)... If this is the case, I am not the person to ask;)
Another "common" mistake is when you fix something with CERTAIN IN-GAME SETTINGS and the other person uses DIFFERENT settings.
Ex:
You use the shadows HIGH setting.
They use shadows LOW setting.
In this case you get different shaders;) You need to find ALL OF THEM and fix them;)
Tell them to use the EXACT SAME SETTINGS AS YOU and see if it works or still fails! (Debugging and testing is key, my friend ^_^ )
The one user said he had the exact same settings I use and still got the shadows. Besides possibly different drivers, we had the same settings, but he was still getting shadows, and I can't see why he would — unless maybe SLI is causing weird issues, which is why I am here asking about SLI..
I will load the game up and try to go through all the shaders again on different settings..