So this is what I found out, which makes me believe it's a game bug rather than a driver bug.
Normal Stereo Compute (0x00004000 STEREO_COMPUTE_ENABLE) works just fine.
You start the game.
Go to character creation.
You will see the character missing the blue lights.
Open the command window (~) and type in: WorldRenderer.LightCSPathEneabled 0. This makes the lights go away.
Now do this again but enable the lights (with 1). This makes the lights STAY for around 5 seconds before they go missing in one EYE. For those 5 seconds everything is perfect (as I already fixed that shader).
So, I think after a few CS runs, the shader just starts discarding information.
Can somebody refresh my memory on how I can use Frame Analysis to see all the shaders that are called/used by this CS?
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
Isn't it possible to write this shader to a new register every... 3 seconds and read it back out of that register every 3 seconds, so that the disappearing will not happen?
Maybe another driver will help?
Like my work? Donations can be made via PayPal to: rauti@inetmx.de
Hey Bo3b,
I went back and watched your video on the Prime Directive; that was some good stuff.. It made a lot more sense now that I've been doing this for a while..
Still a little confused, as your videos are for the HelixMod and not 3D Migoto, but I see the code at the bottom now and it makes a bit more sense..
Now, looking at the code you put up there, I tried it and nothing happened. Here is what I used:
float separation = StereoParams.Load(0).x;
float convergence = StereoParams.Load(0).y;
o0.X += separation * convergence;
Should there be another line of code there, between lines 2 and 3??
Other than that, I am going to try and find the shader for the clouds that don't work and input the new code..
[quote="helifax"]
Can somebody refresh my memory on how I can use Frame Analysis to see all the shaders that are called/used by this CS?
[/quote]
What about this section in the ini file??
[code]; Specifies options for the frame analysis feature. Options can be combined by
; separating them with a space.
; log: Log draw calls and state changes (one log file per context)
; hold: Continue analysing subsequent frames while the key is held
; dump_rt_jps: Dumps out render targets as JPS files. These are the easiest to
; work with and don't take up as much space as DDS files, but they
; are not dumped for every render target and are missing some data.
; dump_rt_dds: Dumps out render targets as DDS files. WARNING: This option may
; require hundreds of gigabytes and a long time! Only use it if
; you absolutely need more information than you can get otherwise.
; Will also dump buffer type render targets & UAVs as .buf files.
; dump_rt: Dumps render targets as JPS files when possible, or DDS when not.
; clear_rt: Clears each render target the first time they are used in the
; frame. Makes it easier to see what is being drawn if the game
; doesn't clear them, but might cause some effects not to render.
; dump_depth: Dumps depth/stencil targets as DDS files
; dump_tex_jps: Dumps textures as JPS files
; dump_tex_dds: Dumps textures as DDS files (Same warning as for dump_rt_dds)
; Will also dump buffer type shader resource views as .buf files.
; dump_tex: Dumps textures as JPS files when possible, or DDS when not.
; dump_cb: Dumps constant buffers as binary .buf files
; dump_cb_txt: Decodes constant buffers as an array of float4s
; dump_vb: Dumps vertex buffers as binary .buf files
; dump_vb_txt: Decodes vertex buffers as an array of float4s
; dump_ib: Dumps index buffers as binary .buf files
; dump_ib_txt: Decodes index buffers
; filename_reg: Normally the draw number is the first part of the filename so
; that the files will be sorted in the order they were used in
; the game. Sometimes it is more desirable to examine how a
; specific output changed through the frame and this option will
; place the register number first in the filename to allow that.
; mono: Dump out mono textures instead of stereo. To dump both, specify
; 'mono stereo'. If neither are specified, defaults to stereo.
; analyse_options can also be specified in [ShaderOverride*] and
; [TextureOverride*] sections to set up triggers to change the options mid-way
; through a frame analysis, either for a single draw call (default), or
; permanently (by adding the 'persist' keyword).
analyse_options = log hold
[/code]
After that, in the WD2 fix there is some configuration for this analysis of a specific shader (I THINK ^^):
[code][ShaderOverrideAnalysis]
hash = 5f48ddf6f31ff15c
;analyse_options = persist log dump_rt clear_rt
[ShaderOverrideAnalysisSkip1]
hash=d592f5b6e1dc2478
analyse_options =
[ShaderOverrideOceanDay]
hash=80038b93c4898c8b
analyse_options =
[ShaderOverrideAnalysisSkip3]
hash=7e593a2ef69d4929
analyse_options =
[ShaderOverrideAnalysisSkip6]
hash=b56f0cd2c82267e7
analyse_options =
[ShaderOverrideAnalysisSkip7]
hash=b96895a880ee09a9
analyse_options =[/code]
Maybe this will help you with FA?
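Putting those pieces together for one compute shader, something like this might work?? The hash below is only a placeholder for the CS you want to trace, and the options are just one combination I would try, not the only one:
[code][ShaderOverrideLightCS]
; placeholder hash - substitute the hash of the CS in question
hash = 0123456789abcdef
; from this draw onward, log everything and dump render targets,
; textures and decoded constant buffers
analyse_options = persist log dump_rt dump_tex dump_cb_txt[/code]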
[quote="The_Nephilim"]So do I take all the code in the good skybox and replace the bad skyboxes with it, or is there more involved than a simple copy-paste over??
Please forgive me for not knowing, but from what you said above that seems like what I have to do; my first attempt failed, so I am retrying..[/quote]
Yep, that's the idea, copy and paste the full shaders, but there is no guarantee that it can work.
The pipeline is more than just the VS and PS, there are some earlier and later pieces that might conflict if the input and output signatures don't match.
Don't think anyone has actually tried this approach, so don't be too surprised if it doesn't work. Be sure to check the log file to see if any errors are reported.
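As a made-up example of the kind of signature mismatch that can break a straight copy: if the pixel shader sitting behind the "bad" slot reads an interpolant that the pasted-in vertex shader never writes, the pipeline won't fit together:
[code]// hypothetical "good" VS output signature - what we paste in:
out float4 o0 : SV_POSITION0, out float3 o1 : TEXCOORD0

// what the "bad" slot's pixel shader might expect as input:
float4 pos : SV_POSITION0, float3 uv : TEXCOORD0, float4 col : COLOR0
// COLOR0 is never written by the pasted VS -> undefined input, broken draw[/code]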
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
[quote="The_Nephilim"]Hey Bo3b,
I went back and watched your video on the Prime Directive; that was some good stuff.. It made a lot more sense now that I've been doing this for a while..
Still a little confused, as your videos are for the HelixMod and not 3D Migoto, but I see the code at the bottom now and it makes a bit more sense..
Now, looking at the code you put up there, I tried it and nothing happened. Here is what I used:
float separation = StereoParams.Load(0).x;
float convergence = StereoParams.Load(0).y;
o0.X += separation * convergence;
Should there be another line of code there, between lines 2 and 3??
Other than that, I am going to try and find the shader for the clouds that don't work and input the new code..[/quote]
OK, good. I was going to suggest that you review the class material again, because when you first hit something complicated like this it's hard to make sense of the class. And the second time around usually is better because you have a grasp of the jargon and then think about the concepts.
For the code, that's the basic idea that should work for skyboxes. But, skyboxes only. You wouldn't use that for anything you didn't want at infinity.
For completeness, here are the variants that Mike_ar69 gave me awhile back. You can try different ones, but if you aren't getting anything to change using the main one I gave, then something else is going wrong. Check the log. Be sure to try the zero out approach. You need to make sure that you are modifying the right shader, and that it is actually having an effect.
Also for your code there, you are using .X instead of .x. I don't think HLSL is case sensitive, but a lot of stupid languages are, so best to get in the habit of using the right case.
[code]r10.x += stereoParams.x * [0-1] or [0-inf...];
r10.x += stereoParams.x * (stereoParams.y);
r10.x += stereoParams.x * (stereoParams.y) * [0-1];
r10.x += stereoParams.x * (- r10.w + stereoParams.y + [0-1]);
[/code]
The third variant generally works best for me, but games are weird and might need something different.
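For contrast, the usual correction for geometry that should sit at its real depth (not at infinity) is the standard driver formula X' = X + separation * (W - convergence). A sketch, assuming o0.w carries the clip-space w:
[code]float4 stereo = StereoParams.Load(0);
// stereo.x = separation, stereo.y = convergence
// shift each eye by separation scaled by distance from the convergence plane
o0.x += stereo.x * (o0.w - stereo.y);[/code]
The skybox version drops the per-vertex depth term because a skybox should read as "at infinity" regardless of its actual W.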
[quote="helifax"]So this is what I found out, which makes me believe it's a game bug rather than a driver bug.
Normal Stereo Compute (0x00004000 STEREO_COMPUTE_ENABLE) works just fine.
You start the game.
Go to character creation.
You will see the character missing the blue lights.
Open the command window (~) and type in: WorldRenderer.LightCSPathEneabled 0. This makes the lights go away.
Now do this again but enable the lights (with 1). This makes the lights STAY for around 5 seconds before they go missing in one EYE. For those 5 seconds everything is perfect (as I already fixed that shader).
So, I think after a few CS runs, the shader just starts discarding information.
Can somebody refresh my memory on how I can use Frame Analysis to see all the shaders that are called/used by this CS?[/quote]
Pretty weird that it fades after 5 seconds. Like it's drifting further and further off in fp errors, maybe, until it flies off screen.
I'm not much help here; I haven't used frame analysis deeply like this, so I can't really suggest how to dig into it. We know DarkStarSword is offline, so maybe ping DHR, because he's used it to good effect.
My best guess for that would be to try to track that texture specifically, using the Texture analysis that Losti kindly posted for reference.
Something like:
[code][TextureOverrideAnalysis]
hash=d592f5b6e1dc2478
analyse_options = dump_tex stereo persist[/code]
The idea being to skip a lot of the noise generated by frame analysis, and look specifically at the texture or render target that goes wrong. Especially if it works, then goes bad, it would be interesting to see it get generated and watch it go from stereo to one-eye in dump files.
Another thought is to use DarkStarSword's on-screen display shader, where you can watch the contents of different registers and buffers in real-time as hex dumps. This assumes you know what you want to follow, but it might be possible to target the t20 to see what it looks like live.
Well, I will need to go over your classes again and see what I missed the first time ;)
Now, I input your new code into the skybox for the tank area and it worked fine.. In the plane area I injected the code and nothing.. So, to be sure I had the correct shader, I zeroed it out and it went away, so I do believe I have the correct shader file; it is just not working..
The following shader is the clouds in the plane area.. Now, there are 2 sets of clouds: one cumulus, the other a bit thinner and spread out. Those go into stereo, just not the big cumulus clouds..
I looked at the code based on the Prime Directive but it just has not fully clicked, although I do have a better understanding of what is going on, just not fully yet..
So if you can look at this shader and give me an idea of what could be wrong, or maybe prod me in the right direction, as I am out of ideas; no code I inject is working.. I even tried swapping the shader files, and all I got was a blue sky and red sky in the plane area..
So that did not work either, but it had an effect.. Well, here is the cloud shader file:
[code] // ---- Created with 3Dmigoto v1.2.56 on Sat Apr 01 15:24:24 2017
// Cumulus clouds in daytime plane area..
cbuffer cb0 : register(b0)
{
float4 cb0[5];
}
// 3Dmigoto declarations
#define cmp -
Texture1D<float4> IniParams : register(t120);
Texture2D<float4> StereoParams : register(t125);
void main(
float2 v0 : POSITION0,
out float4 o0 : SV_POSITION0,
out float3 o1 : TEXCOORD0)
{
float4 r0,r1;
uint4 bitmask, uiDest;
float4 fDest;
o0.z = cb0[4].x * 20 + cb0[4].y;
o0.xy = float2(20,20) * v0.xy;
o0.w = 20;
r0.x = cmp(0 < v0.y);
r0.y = cmp(v0.x < 0);
r1.xyz = r0.yyy ? cb0[0].xyz : cb0[1].xyz;
r0.yzw = r0.yyy ? cb0[2].xyz : cb0[3].xyz;
o1.xyz = r0.xxx ? r1.xyz : r0.yzw;
float separation = StereoParams.Load(0).x;
float convergence = StereoParams.Load(0).y;
o0.x += separation * convergence;
return;
}
/*~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
//
// Generated by Microsoft (R) D3D Shader Disassembler
//
// using 3Dmigoto v1.2.56 on Sat Apr 01 15:24:24 2017
//
//
// Input signature:
//
// Name Index Mask Register SysValue Format Used
// -------------------- ----- ------ -------- -------- ------- ------
// POSITION 0 xy 0 NONE float xy
//
//
// Output signature:
//
// Name Index Mask Register SysValue Format Used
// -------------------- ----- ------ -------- -------- ------- ------
// SV_POSITION 0 xyzw 0 POS float xyzw
// TEXCOORD 0 xyz 1 NONE float xyz
//
vs_4_0
dcl_constantbuffer cb0[5], immediateIndexed
dcl_input v0.xy
dcl_output_siv o0.xyzw, position
dcl_output o1.xyz
dcl_temps 2
mad o0.z, cb0[4].x, l(20.000000), cb0[4].y
mul o0.xy, v0.xyxx, l(20.000000, 20.000000, 0.000000, 0.000000)
mov o0.w, l(20.000000)
lt r0.x, l(0.000000), v0.y
lt r0.y, v0.x, l(0.000000)
movc r1.xyz, r0.yyyy, cb0[0].xyzx, cb0[1].xyzx
movc r0.yzw, r0.yyyy, cb0[2].xxyz, cb0[3].xxyz
movc o1.xyz, r0.xxxx, r1.xyzx, r0.yzwy
ret
// Approximately 0 instruction slots used
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~*/
[/code]
Not too sure why it is not having an effect on stereoising the clouds??
[quote="bo3b"][quote="helifax"]So this is what I found out, which makes me believe it's a game bug rather than a driver bug.
Normal Stereo Compute (0x00004000 STEREO_COMPUTE_ENABLE) works just fine.
You start the game.
Go to character creation.
You will see the character missing the blue lights.
Open the command window (~) and type in: WorldRenderer.LightCSPathEneabled 0. This makes the lights go away.
Now do this again but enable the lights (with 1). This makes the lights STAY for around 5 seconds before they go missing in one EYE. For those 5 seconds everything is perfect (as I already fixed that shader).
So, I think after a few CS runs, the shader just starts discarding information.
Can somebody refresh my memory on how I can use Frame Analysis to see all the shaders that are called/used by this CS?[/quote]
Pretty weird that it fades after 5 seconds. Like it's drifting further and further off in fp errors, maybe, until it flies off screen.
I'm not much help here; I haven't used frame analysis deeply like this, so I can't really suggest how to dig into it. We know DarkStarSword is offline, so maybe ping DHR, because he's used it to good effect.
My best guess for that would be to try to track that texture specifically, using the Texture analysis that Losti kindly posted for reference.
Something like:
[code][TextureOverrideAnalysis]
hash=d592f5b6e1dc2478
analyse_options = dump_tex stereo persist[/code]
The idea being to skip a lot of the noise generated by frame analysis, and look specifically at the texture or render target that goes wrong. Especially if it works, then goes bad, it would be interesting to see it get generated and watch it go from stereo to one-eye in dump files.
Another thought is to use DarkStarSword's on-screen display shader, where you can watch the contents of different registers and buffers in real-time as hex dumps. This assumes you know what you want to follow, but it might be possible to target the t20 to see what it looks like live.
[/quote]
Yeah, it doesn't seem to drift off-screen, unless it suddenly jumps out of the screen.
If you have any Battlefield game, you can easily check it out ;)
Either the driver bugs out or the engine CS does. I think it's the engine CS, as it only happens in FB3 games when Compute Lights are working.
I tried to use the ShaderOverride on one of the CS but couldn't get anything from it with Frame Analysis...
Yeah, I think only DSS would be able to help me out here :(
Hmm.. a very weird thing that also happens with FB3 games is that the NVIDIA overlay only appears in one eye.
Guess the eye.. Yup, the right one (where all the effects also happen).
I tried different profiles, but it is always the same. It looks like the swap_chain is generated weirdly and the 3D Vision driver doesn't hook properly...
Could this be somehow related?
[quote="helifax"]Hmm.. a very weird thing that also happens with FB3 games is that the NVIDIA overlay only appears in one eye.
Guess the eye.. Yup, the right one (where all the effects also happen).
I tried different profiles, but it is always the same. It looks like the swap_chain is generated weirdly and the 3D Vision driver doesn't hook properly...
Could this be somehow related?[/quote]
There are so many options in the ini file, maybe some of them will help? There is something strange with the game, because hunting mode will not work with = 1; it leads to an error that mentions the swap_chain.
[url=http://www.fotos-hochladen.net][img]http://www.fotos-hochladen.net/uploads/nvidiaerrorn7b4y9ocl8.jpg[/img][/url]
So maybe your guess isn't that wrong! I have played around with the [Device] parameters in the ini file; this leads to other errors.
[url=http://www.fotos-hochladen.net][img]http://img5.fotos-hochladen.net/uploads/nvidiaerrorp4dqvxb5jy.jpg[/img][/url]
So I set hunting = 2 and enabled it in the game, and it works... Have you tried disabling the stereo kick-in when starting the game and enabling it in-game? Maybe this can help???
Have you tried changing other ini parameters? I have no idea what they do, but maybe you do; and if not, maybe trial-and-error playing around with them could help???
[b]; This setting enables stereo compute shaders, which is required to fix a lot
; of "one eye" type rendering issues in many DX11 games:
;StereoFlagsDX10 = 0x00004000[/b]
; some games explicitely disable stereo, prohibiting any stereo attempts.
; settings this to 1 ignores all stereo disabling calls and also calls NvAPI_Stereo_Enable to force stereo on.
;force_stereo=1
.
.
.
; games which have their own stereo renderer disable the NVidia automatic
; stereo mode and render themselves into stereo buffers (Crysis 3 for example).
; Setting this to 1 disables the game stereo renderer and enables NVidia auto stereo mechanism.
; This also forces 'false' as a return for any request for NvAPI_Stereo_IsEnabled.
automatic_mode=0
.
.
.
; sets the global surface creation heuristic for NVidia stero driver.
; 0 = NVAPI_STEREO_SURFACECREATEMODE_AUTO - use driver registry profile settings for surface creation mode.
; 1 = NVAPI_STEREO_SURFACECREATEMODE_FORCESTEREO - Always create stereo surfaces.
; 2 = NVAPI_STEREO_SURFACECREATEMODE_FORCEMONO - Always create mono surfaces.
;surface_createmode=1
.
.
.
; overrides surface creation mode for square surfaces.
;surface_square_createmode=1
.
.
.
; Force the NvAPI_Initialize to return an error so that games think stereo and NVidia is unavailable.
force_no_nvapi=0
.
.
.
;------------------------------------------------------------------------------------------------------
; Settings for GPU manipulations.
; Render settings override
;------------------------------------------------------------------------------------------------------
[Rendering]
..
...
....
......
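For reference, the hunting setting I mean sits in its own section. A minimal sketch of what I changed (numbers from memory, check the comments in your d3dx.ini):
[code][Hunting]
; 0 = disabled, 1 = always on,
; 2 = available but off until toggled with the hunting key in-game
hunting=2[/code]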
helifax said: Hmm... a very weird thing that happens as well with FB3 games is that the NVIDIA overlay only appears in one eye.
Guess the eye... Yupp, the right one (where all the effects also happen).
I tried different profiles, but it is always the same. It looks like the swap_chain is generated weirdly and the 3D Vision driver doesn't hook properly...
Could this be somehow related?
There are so many options in the ini file, maybe some of these will help? There is something strange with the game, because hunting mode will not work with = 1; it leads to an error that mentions the swap_chain.
So maybe your guess isn't that wrong! I have played around a bit with the [Device] parameters in the ini file; that leads to other errors.
So I set hunting = 2 and enabled it in the game, and it works... Have you tried disabling the stereo kick-in when starting the game and enabling it in-game? Maybe this can help?
Have you tried changing other ini parameters? I have no idea what they do, but maybe you do, and if not it could help to play around with them by trial and error.
; This setting enables stereo compute shaders, which is required to fix a lot
; of "one eye" type rendering issues in many DX11 games:
;StereoFlagsDX10 = 0x00004000
; some games explicitly disable stereo, prohibiting any stereo attempts.
; setting this to 1 ignores all stereo disabling calls and also calls NvAPI_Stereo_Enable to force stereo on.
;force_stereo=1
.
.
.
; games which have their own stereo renderer disable the NVidia automatic
; stereo mode and render themselves into stereo buffers (Crysis 3 for example).
; Setting this to 1 disables the game stereo renderer and enables NVidia auto stereo mechanism.
; This also forces 'false' as a return for any request for NvAPI_Stereo_IsEnabled.
automatic_mode=0
.
.
.
; sets the global surface creation heuristic for the NVidia stereo driver.
; 0 = NVAPI_STEREO_SURFACECREATEMODE_AUTO - use driver registry profile settings for surface creation mode.
; 1 = NVAPI_STEREO_SURFACECREATEMODE_FORCESTEREO - Always create stereo surfaces.
; 2 = NVAPI_STEREO_SURFACECREATEMODE_FORCEMONO - Always create mono surfaces.
;surface_createmode=1
.
.
.
; overrides surface creation mode for square surfaces.
;surface_square_createmode=1
.
.
.
; Force the NvAPI_Initialize to return an error so that games think stereo and NVidia is unavailable.
force_no_nvapi=0
Trust me, I've configured 3DMigoto and the nvidia driver to work at its best.
I don't get crashes or anything like that.
I am able to hunt/dump/change shaders and so on ;)
Trust me, I even played with the Stereo Texture and how the driver creates the stereo textures. I wasn't able to make the game display the same thing in both eyes.
The biggest problem is how Frostbite 3 games work. They use compute shaders HEAVILY for lighting.
I am not sure if there is a bug in the Nvidia 3D Vision driver related to compute shaders, but I doubt it, as I haven't seen this behaviour in any other game.
We've fixed plenty of FB3 games before, but we never fixed the CS in any of them; they were just disabled. The problem here is we can't, as we lose almost all the lights.
Thnx Bo3b. I'll try to do that on a CS and see if I get anything (instead of a texture override).
Yes, it's very strange that it "pops out" of existence. It doesn't actually move from its location.
The game dumped 1060 CS files in the menu alone... You can imagine, going forward, how many it will dump ;)
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
@helifax: 1060 CS? My oh my. Programmers. Why make something simple, when you can complicate the beejesus out of it instead?
Looks like the trend of game devs is to stuff everything into CS now, because the CPUs have stalled out and Moore is no-Moore. That's going to be bad for us in general.
For this specific problem, it seems like a game bug or, more likely, a driver bug with 3D Vision Automatic. You might report it as a problem.
The right eye is the left side of a back-buffer, and hence will get drawn in cases where it has lost the idea that it should be drawing a stereo buffer. You might try to see if there is an interim buffer that is somehow not being stereo and should be. That would be like what DarkStarSword found, where an in-between texture/buffer needed to be set stereo. Can be done using the TextureOverride feature.
One last thought for trying to dig into the guts of this. If you have motivation, NVidia has a graphics debugging tool NSight, which is pretty killer for inspecting the full pipeline of who creates what and passes it on to where. It only really works on Win10 and needs new APIs.
Not sure it will work here, because some games prevent any debuggers from hooking in.
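As I understand it, the TextureOverride approach bo3b describes is just a d3dx.ini section keyed on the resource's hash. A minimal sketch (the hash and section name below are made-up placeholders - the real hash has to come from hunting - and StereoMode maps onto the same NVAPI surface-create modes quoted from the ini comments earlier in the thread):

```ini
; Hypothetical example - replace the hash with the one found via hunting.
[TextureOverrideLightBuffer]
hash = 0123456789abcdef
; Force this surface to be created stereo (1 = FORCESTEREO, 2 = FORCEMONO),
; same values as the NVAPI_STEREO_SURFACECREATEMODE settings above:
StereoMode = 1
```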
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
@Bo3b,
I was looking through the log and found the following error, and this shader is the one with the clouds I cannot get stereoized, so this could be why they do not go into stereo with the code I injected:
[code] HackerDXGISwapChain::GetFrameStatistics(class HackerDXGISwapChain@000000000C1D4890) called
returns result = 0
overriding NVAPI wrapper failed.
> reloading *_replace.txt fixes from ShaderFixes
>Replacement shader found. Re-Loading replacement HLSL code from b0b697a3c3331096-vs_replace.txt
Reload source code loaded. Size = 2215
compiling replacement HLSL code with shader model vs_4_0
compile result for replacement HLSL shader: 80004005
--------------------------------------------- BEGIN ---------------------------------------------
C:\WarThunder\win64\ShaderFixes\b0b697a3c3331096-vs_replace.txt(37,9-20): error X3004: undeclared identifier 'stereoParams'
---------------------------------------------- END ----------------------------------------------
> FAILED to reload shaders from ShaderFixes
Reloading d3dx.ini (EXPERIMENTAL)...
[/code]
It states there is an undeclared identifier "stereoParams". Could this be something, or am I using the wrong code?
I ask because I am a bit confused about how to use the code variants you showed me. Here is what I used:
[code] float separation = StereoParams.Load(0).x;
float convergence = StereoParams.Load(0).y;
o0.x += separation * convergence; [/code]
Now, with the variants you showed me, do I add to the code, or remove the 3rd line in the code above and replace it with one of the variants?
Here are the variants you showed me:
[code] o0.x += stereoParams.x * [0-1];
o0.x += stereoParams.x * (stereoParams.y);
o0.x += stereoParams.x * (stereoParams.y) * [0-1];
o0.x += stereoParams.x * (- r10.w + stereoParams.y + [0-1]); [/code]
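Not authoritative, but that compile error usually just means the name isn't declared anywhere: the lowercase `stereoParams` in the variants is shorthand for values loaded from the injected `StereoParams` texture (.x = separation, .y = convergence). A sketch of how the pieces are normally wired together in a 3DMigoto `_replace` shader (t125 is the slot 3DMigoto conventionally uses for the stereo texture - check your fix template):

```hlsl
// Declared once near the top of the *_replace.txt shader body
// (3DMigoto binds the 1x1 stereo parameter texture at t125 by convention):
Texture2D<float4> StereoParams : register(t125);

// ...then, at the end of main(), after o0 has been written:
float4 stereo = StereoParams.Load(0);  // stereo.x = separation, stereo.y = convergence
// Basic skybox-style correction (line 3 of the snippet above):
o0.x += stereo.x * stereo.y;
// Variant 3 adds a hand-tuned constant in [0, 1] to scale the push:
// o0.x += stereo.x * stereo.y * 0.5;
```

So the variants replace the third line of your snippet, not the first two; the `[0-1]` in them is a constant you pick by eye.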
I appreciate your help, as I still cannot get the clouds in the Aircraft section stereoized.
[quote="bo3b"]@helifax: 1060 CS? My oh my. Programmers. Why make something simple, when you can complicate the beejesus out of it instead?
Looks like the trend of game devs is to stuff everything into CS now, because the CPUs have stalled out and Moore is no-Moore. That's going to be bad for us in general.
For this specific problem, it seems like a game-bug or more likely, a driver bug with 3D Vision Automatic. You might report it as a problem.
The right eye is the left side of a back-buffer, and hence will get drawn in cases where it has lost the idea that it should be drawing a stereo buffer. You might try to see if there is an interim buffer that is somehow not being stereo and should be. That would be like what DarkStarSword found, where an in-between texture/buffer needed to be set stereo. Can be done using the TextureOverride feature.
One last thought for trying to dig into the guts of this. If you have motivation, NVidia has a graphics debugging tool NSight, which is pretty killer for inspecting the full pipeline of what creates who and passes on to where. Only really works on Win10, needs new APIs.
Not sure it will work here, because some games prevent any debuggers from hooking in.[/quote]
Yeah... well DICE had this BRIGHT idea where everything is done in CS. Who the HELL computes the WHOLE lighting position in CS? They don't use the CS to calculate the tiles! NO! THEY ACTUALLY DO THE DRAWING! The PS is just a simple "BLIT"... (Not to mention the Frostbite 3 engine is so buggy, constantly crashing with DX11 errors. Probably the only freaking engine TO CONSTANTLY CRASH! All their games have this issue - to date! No other engine does this. Oh, and that "Memory Leak" from ages past? It's still there! Nope, not fixed.)
We were very lucky with DA:I, as they shipped both variants (lights in CS and PS) - and GUESS WHAT!!! Performance was exactly the same! So this whole "optimisation fiasco" is - to put it mildly - their lack of understanding of how CS work and where they should be used! But we get "papers" from them on how their engine is "superior in reproducing photo-realism"... well, guess what: it's good, but I've seen way better engines! Not to mention the constant crashing problem, as NVIDIA knows! Every time a new game under Frostbite is released, new hacks in the drivers are required... sigh.
Anyway, I've stopped looking into this :( Spent 2 more days without getting anywhere. I don't know if it is a bug in the Nvidia driver or not... might be, but as this is the ONLY engine doing this, DICE should fix it, not the other way around... then again, they don't support 3D Vision in their engine... soo..... :(
Yeah, I'm using NSight pretty regularly. Unfortunately, I couldn't see anything that might lead to this problem. :(
Guess we should avoid any "Frostbite 3"-based games in the future!
Maybe another driver will help?
Like my work? Donations can be made via PayPal to: rauti@inetmx.de
I went back and watched your video on the Prime Directive; that was some good stuff. It made a lot more sense now that I've been doing this for a while.
Still a little confused, as your videos are for HelixMod and not 3DMigoto, but I see the code at the bottom now and it makes a bit more sense.
Now, looking at the code you put up there, I tried it and nothing happened. Here is what I used:
float separation = StereoParams.Load(0).x;
float convergence = StereoParams.Load(0).y;
o0.X += separation * convergence;
Should there be another line of code there, between lines 2 and 3?
Other than that, I am going to try to find the shader for the clouds that don't work and input the new code.
Intel i5 7600K @ 4.8ghz / MSI Z270 SLI / Asus 1080GTX - 416.16 / Optoma HD142x Projector / 1 4'x10' Curved Screen PVC / TrackIR / HOTAS Cougar / Cougar MFD's / Track IR / NVidia 3D Vision / Win 10 64bit
What about this section in the ini file?
After that, in the WD2 fix there is some specification for this analysis of a special shader (I THINK ^^).
Maybe this will help you with FA?
Yep, that's the idea: copy and paste the full shaders, but there is no guarantee that it will work.
The pipeline is more than just the VS and PS, there are some earlier and later pieces that might conflict if the input and output signatures don't match.
Don't think anyone has actually tried this approach, so don't be too surprised if it doesn't work. Be sure to check the log file to see if any errors are reported.
OK, good. I was going to suggest that you review the class material again, because when you first hit something complicated like this it's hard to make sense of the class. The second time around is usually better because you have a grasp of the jargon and can think about the concepts.
For the code, that's the basic idea that should work for skyboxes. But, skyboxes only. You wouldn't use that for anything you didn't want at infinity.
For completeness, here are the variants that Mike_ar69 gave me a while back. You can try different ones, but if you aren't getting anything to change using the main one I gave, then something else is going wrong. Check the log. Be sure to try the zero-out approach: you need to make sure that you are modifying the right shader, and that it is actually having an effect.
Also, in your code there, you are using .X instead of .x. I don't think HLSL is case sensitive, but a lot of stupid languages are, so it's best to get in the habit of using the right case.
The third variant generally works best for me, but games are weird and might need something different.
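For context on why the skybox formula has no depth term: the usual 3D Vision correction (the one most fixes here apply) shifts clip-space x in proportion to the distance from the convergence plane, so at skybox-infinity the depth-dependent part is dropped and the object is simply pushed to a fixed depth. Roughly, with the variable names being mine:

```hlsl
// o0 is the clip-space output position; o0.w is (roughly) view depth.
float4 stereo = StereoParams.Load(0);  // .x = separation, .y = convergence
// General geometry: zero shift exactly at the convergence plane.
o0.x += stereo.x * (o0.w - stereo.y);
// Skybox special case: depth is effectively infinite, so drop the depth
// term and push to a constant depth instead:
// o0.x += stereo.x * stereo.y;
```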
Pretty weird that it fades after 5 seconds. Like some sort of accumulating fp error, maybe, until it flies off screen.
I'm not much help here; I haven't used frame analysis deeply like this, so I can't really suggest how to dig into it. We know DarkStarSword is offline, so maybe ping DHR, because he's used it to good effect.
My best guess for that would be to try to track that texture specifically, using the Texture analysis that Losti kindly posted for reference.
Something like:
The idea being to skip a lot of the noise generated by frame analysis, and look specifically at the texture or render target that goes wrong. Especially if it works, then goes bad, it would be interesting to see it get generated and watch it go from stereo to one-eye in dump files.
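A guess at the kind of snippet bo3b means - a hypothetical TextureOverride with 3DMigoto's frame-analysis options attached (the hash and section name are placeholders; dumps land in the FrameAnalysis folder next to the game when the analysis key is pressed):

```ini
; In [Hunting], an analysis key must be bound, e.g.:
; analyse_frame = VK_F8

; Hypothetical override for the buffer that loses stereo - replace the hash:
[TextureOverrideSuspectBuffer]
hash = 0123456789abcdef
; Dump this resource every time it is bound while analysis is active,
; both as a render target and as a shader resource:
analyse_options = dump_rt dump_tex
```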
Another thought is to use DarkStarSword's on-screen display shader, where you can watch the contents of different registers and buffers in real-time as hex dumps. This assumes you know what you want to follow, but it might be possible to target the t20 to see what it looks like live.
Now, I input your new code into the skybox for the tank area and it worked fine. Then in the plane area I injected the code and nothing. So, to be sure I had the correct shader, I zeroed it out and it went away, so I do believe I have the correct shader file; it is just not working.
The following shader is the clouds in the plane area. Note there are 2 sets of clouds: one cumulus, the other a bit thinner and spread out. Those go into stereo, just not the big cumulus clouds.
I looked at the code based on the Prime Directive, but it just has not fully clicked, although I do have a better understanding of what is going on, just not fully yet.
So if you can look at this shader and give me an idea what could be wrong, or maybe prod me in a direction, as I am out of ideas; no code I inject is working. I even tried to change the shader files; all I got was a blue sky and a red sky in the plane area.
So that did not work either, but it had an effect. Well, here is the cloud shader file:
Not too sure why it is not having an effect on stereoizing the clouds??
Yeah, it doesn't seem to go off-screen, unless it suddenly jumps out of view.
If you have any Battlefield game, you can easily check it out ;)
Either the driver "bugs out" or the engine CS "bugs out". I think it's the engine CS, as it only happens in FB3 games when compute lights are active.
I tried to use a ShaderOverride on one of the CS but couldn't get anything from it with Frame Analysis.
Yeah, I think only DSS would be able to help me out here :(
I don't crashes or any other things.
I am able to hunt/dump/change shaders and so on;)
Trust me I even played with the Stereo Texture and how the driver makes the stereo textures. Wasn't able to make the game display same thing in both eyes.
The biggest problem is how Frostbite 3 games work. They use HEAVILY compute shaders for lighting.
I am not sure if there is a bug in the Nvidia 3D Vision driver related to computes, but I doubt it as I haven't seen this behaviour in any other game.
We've fixed plenty of FB3 games before, but we never ever fixed the CS in any of them. They are just disabled. Problem here is we can't as we lose almost all the lights.
Thnx Bo3b. I try to do what on a CS and see if I get anything (instead of a texture override).
Yes, is very strange that it "pops-out" of existence. It doesn't actually move from location.
The game dumped 1060 CS files only in the menu... You can imagine going forward, how many it will dump;)
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com
(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)
Looks like the trend of game devs is to stuff everything into CS now, because the CPUs have stalled out and Moore is no-Moore. That's going to bad for us in general.
For this specific problem, it seems like a game-bug or more likely, a driver bug with 3D Vision Automatic. You might report it as a problem.
The right eye is the left side of a back-buffer, and hence will get drawn in cases where it has lost the idea that it should be drawing a stereo buffer. You might try to see if there is an interim buffer that is somehow not being stereo and should be. That would be like what DarkStarSword found, where an in-between texture/buffer needed to be set stereo. Can be done using the TextureOverride feature.
One last thought for trying to dig into the guts of this. If you have motivation, NVidia has a graphics debugging tool NSight, which is pretty killer for inspecting the full pipeline of what creates who and passes on to where. Only really works on Win10, needs new APIs.
Not sure it will work here, because some games prevent any debuggers from hooking in.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
I was looking through the log and found the following error and this shader is the one with the clouds I can not get Stereoized. so this could be something as to why they do not go into Stereo with the code I Injected:
IT states there is an undeclared identifier "StereoParams" could this be something or am I using the wrong code..
I ask because the above variants in code you showed me, I am a bit confused on how to use it.. here is what I used:
Now with the Variants you showed to I add to the code or remove the 3rd line in the code above and replace it with the variants you showed me??
here are the variants you showed me:
I appreciate your help as I still can not get the clouds in the Aircraft section get Stereoized
Intel i5 7600K @ 4.8ghz / MSI Z270 SLI / Asus 1080GTX - 416.16 / Optoma HD142x Projector / 1 4'x10' Curved Screen PVC / TrackIR / HOTAS Cougar / Cougar MFD's / Track IR / NVidia 3D Vision / Win 10 64bit
Yeah... well DICE believes it has this BRIGHT idea where everything is done in CS. Who the HELL computes the WHOLE lighting position in CS? They don't use the CS just to calculate the tiles! NO! THEY ACTUALLY DO THE DRAWING! The PS is just a simple "blit"... (Not to mention the Frostbite3 engine is so buggy, constantly crashing with DX11 errors. Probably the only freaking engine TO CONSTANTLY CRASH! All their games have this issue - to date! No other engine does this. Oh, and that "memory leak" from ages past? It's still there! Nope, not fixed.)
We were very lucky with DA:I, as they shipped both variants (lights in CS and in PS - and GUESS WHAT!!! Performance was exactly the same!). So this whole "optimisation fiasco" is - to put it mildly - their lack of understanding of how CS works and where it should be used! But we get "papers" from them on how their engine is "superior in reproducing photo-realism"... well, guess what: it's good, but I've seen way better engines! Not to mention the constant crashing problem, as NVIDIA knows! Every time a new game under Frostbite is released, new hacks in the drivers are required... sigh.
Anyway, I've stopped looking into this :( Spent 2 more days without getting anywhere. I don't know if it is a bug in the Nvidia driver or not... might be, but this is the ONLY engine doing this, so DICE should fix it, not the other way around... then again, they don't support 3D Vision in their engine... soo..... :(
Yeah, I'm using NSight pretty regularly. Unfortunately, I couldn't see anything that might lead to this problem. :(
Guess we should avoid any "Frostbite3"-based games in the future!
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com
(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)