[quote="bo3b"][quote="helifax"]Right, so I have this CS Shader here:
https://pastebin.com/XJBshjip
(too long to paste here)
Does anyone know or have any idea how to fix the tiles in this shader? If we could solve this one;) I could easily solve the mystery of the Frostbite3 engine;)[/quote]
Is there any way to get variable names? As a general idea, the biggest advantage of the HLSL decompile is the insertion of variable names into the code flow. That won't work well here, because CS almost never decompiles properly, and this shader has no header information.
Is there any other CS that is similar that has header information, or some way to cross-correlate the use of cb0, t2, t3, g1, g3 and so on?
I think that in the absence of named variables this is going to require a lot of active comment out, test, type tweaking to understand different parts of the code.
The little bit I studied of the CS for tile lights indicated that the entire shader was used for a single tile. So it wouldn't divide the screen into tiles, that was done at a higher level, and it used the thread groups to decide which tile was being worked upon. So for example, an 8x8 tile had 64 CS threads in parallel, all running the same code in a thread group.[/quote]
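That thread-group-to-tile mapping can be sketched in a few lines of Python (the tile size and resolution here are illustrative assumptions, not values taken from the game):

```python
# Sketch of how a tiled-lighting dispatch maps thread groups to screen tiles.
# TILE and the resolution are assumptions for illustration only.
TILE = 8                      # an 8x8-pixel tile -> 64 threads per group
WIDTH, HEIGHT = 1920, 1080

def dispatch_dims(width, height, tile):
    """One thread group per tile: ceil(width/tile) x ceil(height/tile)."""
    return (-(-width // tile), -(-height // tile))

def tile_origin(group_x, group_y, tile):
    """Top-left pixel of the tile a given thread group works on."""
    return (group_x * tile, group_y * tile)

print(dispatch_dims(WIDTH, HEIGHT, TILE))  # (240, 135) thread groups
print(tile_origin(3, 2, TILE))             # (24, 16): that tile's first pixel
```

In HLSL terms, SV_GroupID plays the role of (group_x, group_y), which is why the shader itself never needs to split the screen up.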
Hi Bob,
Thank you very much for replying;)
Regarding this shader:
- There are always 60-ish CS per scene.
- I already fixed the CS that deal with the actual drawing of the lights and they are spot on;)
- Problem now: I can see the tiles missing in one of the eyes (as the drawing is now done outside of the mono tiles).
- When I select the above shader and skip it, the tiles STOP computing. Rotating the camera 90 degrees, I see the tiles computed for the scene before I skipped it. "Releasing" the Compute Shader makes it calculate the tiles again for the new scene.
- This is not the only one in the scene. I was able to find another one exactly the same.
- I know the headers are stripped... damn them, but I believe I can use Battlefield 1 to get my shader where the headers are intact;) Let me try it out. So far all the compute shaders seem to be identical to BF1's, as well as to other previous Frostbite3-based games;)
Ah man, the game looks absolutely fantastic now (if I get lucky) and the tiles are computed properly, since the lighting is fixed. If we could only fix this problem... ^_^ (we could have a proper fix in place ^_^)
And finally another question regarding 3DMigoto: is it possible to dump only the shaders that are active in a scene? Just the ~20 PS/VS/CS that I see, instead of the whole suite of billions of shaders that are sent to the GPU? ^_^
Edit:
I managed to find one CS shader that is 95% the same! from BF1 Frostbite3 engine. Here all the buffers are defined;)
I made a paste here as it is huge:
https://pastebin.com/WirK72GU
Manually trying to "hack it" by commenting stuff out didn't work for me:( This one really needs a precise fix;)
Big thank you again for looking into it!
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
....really? Helifax....you have...ALMOST made it 3D Vision Ready???....impressive is an understatement..you are the BOSS :)
PS: Sorry for going off-topic.
PPS: Thanks bo3b for the help ;))
Another question lol, related to Frame Analyser this time:
I have this CS:
[code]
[ShaderOverrideCS1]
Hash=14f8f4febc50582f
analyse_options = log dump_rt_jps clear_rt
[/code]
I press F8 and I get the log. In the log I can only find one line related to this CS hash:
[code]
000892 CSSetShader(pComputeShader:0x00000000413599F8, ppClassInstances:0x0000000000000000, NumClassInstances:0) hash=14f8f4febc50582f
000892 CSSetUnorderedAccessViews(StartSlot:0, NumUAVs:1, ppUnorderedAccessViews:0x0000000032405C40, pUAVInitialCounts:0x0000000000000000)
0: view=0x00000002029A40F8 resource=0x0000000202355550
000892 CSSetUnorderedAccessViews(StartSlot:1, NumUAVs:1, ppUnorderedAccessViews:0x0000000032405C40, pUAVInitialCounts:0x0000000000000000)
0: view=0x00000001ACD6D7F8 resource=0x0000000202366BD0
000892 CSSetUnorderedAccessViews(StartSlot:2, NumUAVs:1, ppUnorderedAccessViews:0x0000000032405C40, pUAVInitialCounts:0x0000000000000000)
0: view=0x00000000E7EB51B8 resource=0x0000000202367E50
000892 Map(pResource:0x00000000FEDBB3D0, Subresource:0, MapType:4, MapFlags:0, pMappedResource:0x0000000032405A58)
000892 Unmap(pResource:0x00000000FEDBB3D0, Subresource:0)
000892 CSSetConstantBuffers(StartSlot:0, NumBuffers:1, ppConstantBuffers:0x0000000032405C20)
0: resource=0x00000000FEDBB3D0
000892 Dispatch(ThreadGroupCountX:80, ThreadGroupCountY:45, ThreadGroupCountZ:1)
000892 analyse_options (one-shot): 0020000a
000892 Begin(pAsync:0x00000001F8AB4228) type=performance
[/code]
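Side note: the Dispatch(80, 45, 1) counts alone hint at the tile layout. Assuming one thread group per screen tile, the tile size falls out of the render resolution; the log does not record the resolution, so the candidates below are guesses:

```python
# Infer a plausible tile size from the logged Dispatch(80, 45, 1),
# assuming one thread group per screen tile. The candidate render
# resolutions are assumptions; the frame-analysis log does not state one.
groups_x, groups_y = 80, 45
for width, height in [(1280, 720), (1920, 1080)]:
    tile_w, tile_h = width // groups_x, height // groups_y
    print(f"{width}x{height} -> {tile_w}x{tile_h} px tiles")
# 1280x720 -> 16x16 px tiles
# 1920x1080 -> 24x24 px tiles
```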
I was under the impression that I could see what "feeds" this CS (what other CS feeds this one, I mean).
Anyone know where I can find the info?
Thank you;)
Guyssss, help here for Bayonetta. I'm using the debug helixmod dll (I need to use "UseExtInterfaceOnly = true" because of d3d9ex), and I can navigate shaders, but I can't dump them! Even "DumpAll=true" doesn't work!
In fact, I see over 200 shaders in the loading screens, which makes me think "UseRenderedShaders = true" isn't working either. I have also encountered this dumping problem in Resident Evil 4 (which I forgot to report).
The log has content, and clearly it's reading the DX9Settings.ini file because "UseExtInterfaceOnly = true" worked to enable shader hunting.
Edit: I'm on Windows 10, but that problem in RE4 was already there in W7.
[quote="masterotaku"]Guyssss, help here for Bayonetta. I'm using the debug helixmod dll (I need to use "UseExtInterfaceOnly = true" because of d3d9ex), and I can navigate shaders, but I can't dump them! Even "DumpAll=true" doesn't work!
In fact, I see over 200 shaders in the loading screens, which makes me think "UseRenderedShaders = true" isn't working either. I have encountered this dumping problem also in Resident Evil 4 (which I didn't report, I forgot).
The log has content, and clearly it's reading the DX9Settings.ini file because "UseExtInterfaceOnly = true" worked to enable shader hunting.
Edit: I'm on Windows 10, but that problem in RE4 was already there in W7.[/quote]
Ah yes, I've seen this problem before. I was able to navigate the shaders, but I wasn't able to stick to only the rendered ones. It would always pick up the whole chain. I think this is a problem with the d3d9ex variant. Sadly, I don't know a way to work around it... :(
Oh, try a different version of the DLL. There are different builds without proper version numbers. So, try all of them until you get one that works:( I know it sux, but...
[quote="helifax"]
Oh, try a different version of the DLL. There are different versions without a proper version. So, try all of them until you get one that works:( I know it sux, but...[/quote]
Thanks for the advice. I have downloaded the debug DLLs from most recent to oldest from here: http://wiki.bo3b.net/index.php?title=HelixMod_Feature_List
First and second load the hunting OSD, but nothing can be dumped. The third one doesn't display the OSD (of course no dump either), and starting from the fourth, the game can't boot and Windows shows a d3d9ex error.
I have kept the DX9Settings.ini to the simplest form and still nothing :(. Shit, is this unfixable?
Maybe look at the fix by mike_ar69 for Metal Gear Rising
It might be the same engine, since it's the same developer.
https://forums.geforce.com/default/topic/668043/3d-vision/metal-gear-rising-3d-vision/1/
"Game is just about playable out of the box, but there are significant haloing effects on just about everything (for shadows, fog, smoke, and fire)"
Sound familiar?
http://helixmod.blogspot.com/2014/01/metal-gear-rising-revengeance-3d-vision.html
[quote="D-Man11"]Maybe look at the fix by mike_ar69 for Metal Gear Rising[/quote]
That worked! I'm using the latest d3d9.dll but with this configuration:
[code][General]
OverrideMethod = 2
UseEndScene = false
DefPSViewSizeConst = 210
GetCurDirAtLoad = true
DefPSSampler = 15
PresetsKeysList = 1;
DumpAll = true
UseExtInterfaceOnly = true[/code]
Something about that Metal Gear Rising configuration makes dumping work. Off to working on the fix then!
[quote="masterotaku"][quote="D-Man11"]Maybe look at the fix by mike_ar69 for Metal Gear Rising[/quote]
That worked! I'm using the latest d3d9.dll but with this configuration:
[code][General]
OverrideMethod = 2
UseEndScene = false
DefPSViewSizeConst = 210
GetCurDirAtLoad = true
DefPSSampler = 15
PresetsKeysList = 1;
DumpAll = true
UseExtInterfaceOnly = true[/code]
Something about that Metal Gear Rising configuration makes dumping work. Off to working on the fix then!
[/quote]
Awesome stuff! And great work D-Man11 for pointing that one out! ;)
[quote="helifax"]Right, so I have this CS Shader here:
https://pastebin.com/XJBshjip
[...]
Big thank you again for looking into it![/quote]
Bump ^_^
[quote="helifax"]Another question lol, related to Frame Analyser this time:
[...]
I was under the impression that I can see who "feeds" this CS? (What other CS feeds this one I mean).
Anyone know where I can find the info?
Thank you;)
[/quote]
Bump 2 ^_^
Hi helifax!
When you use Frame Analysis and press F8, a folder is created. Inside that folder there is always a shaderusage.txt that shows the relationships of the shaders analyzed.
[quote="DHR"]Hi helifax!
When you use Frame Analysis and press F8, a folder is created. Inside that folder there is always a shaderusage.txt that shows the relationships of the shaders analyzed.
[/quote]
Yes, but I only found the relations between the PS and VS, and nothing about Compute Shaders :-s, unless I am missing something in the configuration file?
@helifax: I'm unlikely to be able to add much here, but sometimes just talking through ideas helps the person with better knowledge (you) come up with some possibilities. I can also add a few details to questions you asked before.
For the one-eye problem here with ME:A, it's probably something related to them cheating on the graphics, so that the driver does not understand what is happening. Examples of that are drawing into a 2D texture that DX11 did not create, so that the driver never sees it. If the driver never sees it, it cannot double it for the other eye.
Since they've gone all-in on drawing everything with Compute Shaders, it's going to be a tough job for Automatic to see anything. The counter-argument is that it does that wink-out effect, where it works for a bit, then cuts out. That suggests a different problem, but could happen if they recycle buffers, or any number of other things.
As a starting spot for thinking, maybe try to narrow down the texture/render target, and try to see where it gets created. If you can narrow it down to a specific texture hash, we can jack with it multiple ways.
[quote]I know 3DMigoto has the ability to specify for a shader by hash, the handling:
"handling=mono"
I was wondering: what does it actually do? Will it make the shader run just one time, or will it disable all the 3D Vision Auto heuristics that try to stereorize the shader?
Would it work on Compute Shaders?
(I am looking at ME:A that is Frostbite 3 engine and all the lights are "rendered" from CS, but some shaders look like they are stereorized when they shouldn't be)[/quote]
This is telling the NVidia Automatic that a given shader or hash should either be mono or stereo, or let it be automatic (0, 1, 2 settings). If something is misfiring in particular, this can swap the mode for a specific shader.
I don't think this will help here, but maybe. Generally this takes something from screen depth, and makes it stereo, or vice-versa. It can't really solve one-eye problems, without other changes.
I think that the shader always runs twice, the only difference is whether it will draw the image at screen depth or not. Doubled up, just at the exact same .x.
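For reference, a minimal sketch of such an override, modeled on the [ShaderOverrideCS1] section quoted earlier in the thread. The section name is made up, and the hash is just the one from that example; substitute the shader you actually want to target:

```ini
; Hypothetical d3dx.ini section forcing one shader to mono for Automatic.
; Section name and hash are placeholders, not from a working fix.
[ShaderOverrideTileLights]
Hash = 14f8f4febc50582f
handling = mono
```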
One question: how does it look with no fix applied to the shader? I read through a bunch of stuff DarkStarSword posted, and he had actually introduced a one-eye problem via bad stereo-correction.
The basic starting point here is to try to determine exactly why it's one eye. One eye could fly off screen for example, or could simply be skipped because Automatic doesn't know about it, or could be a timing problem if the 2nd eye comes in late, or, or...
For a timing test, set the affinity flag in the d3dx.ini, and see if it affects the 5 second wink-out timing. This solved a problem DarkStarSword had with flickering textures from CS.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
[quote]On a side note, when using Frame Analyser I get a lot of ".buf" files. Any idea on how I can read/see them?
I have no idea what I'm supposed to open them with:([/quote]
Those are just straight binary files, nothing special about them. It's just a given constant buffer like cb0, dumped out as bytes. Only way to look at them would be as hex.
If you want to see them in a more sensible form, you can set the "analyse_options = log dump_cb_txt" which will dump the buffer to the log as decoded Float4.
This is how DarkStarSword has been able to decode a ViewProjMatrix with no headers, by looking at the raw floats, and deducing from known ViewProjMatrix formats which is which. (like the diagonal values.)
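If you'd rather inspect .buf files you already dumped, they decode the same way. A minimal Python sketch (the file name in the usage comment is hypothetical):

```python
import struct

def decode_buf_float4(raw: bytes):
    """Interpret a raw constant-buffer dump (.buf) as rows of float4.

    Each cb register is 16 bytes: four little-endian 32-bit floats.
    Trailing bytes that don't fill a whole register are ignored.
    """
    return [struct.unpack_from('<4f', raw, off)
            for off in range(0, len(raw) - len(raw) % 16, 16)]

# Usage (file name is hypothetical):
#   for i, row in enumerate(decode_buf_float4(open('000042-cb0.buf', 'rb').read())):
#       print('cb0[%d] = %10.4f %10.4f %10.4f %10.4f' % ((i,) + row))
```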
[quote]I was under the impression that I could see what "feeds" this CS (what other shader feeds this one, I mean).
Does anyone know where I can find that info?[/quote]
For this one, I also thought it was part of ShaderUsage, but it doesn't look like that happens. I don't know if that code was updated with CS in mind.
Barring that, the only way to tell is from the log.txt that FrameAnalysis itself produces. You'll see the CS being bound to the pipeline, and can follow the sequence of who gets bound when, and thus when it would be executed.
For a given CS, you can probably use the [ShaderOverride*] feature to frame analyse just that single CS, dump out its cb0, and possibly find the texture in question. If you can narrow down the render target, we can jack with it using the texture or shader manipulations.
For example, if it's some raw memory that wasn't created with CreateTexture2D, and thus not doubled, we can replace their texture with a real one using the manipulation language.
Since it fades out or disappears though, it seems like it is sort of working; just some later check fails and cuts it out. We might be able to fix something like this by clearing a render target in the pipeline using the manipulation language, or by unbinding something after a frame to make it get created every time.
We can basically jack with anything in the frame with current functionality; it's just hard to narrow down what to hit.
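As a sketch of that per-shader frame analysis (the section name and hash are hypothetical, and the exact analyse_options keywords should be checked against the comments in your d3dx.ini):

```ini
; Sketch - replace the hash with the real CS hash from shader hunting.
; With frame analysis (F8) active, only this shader's dispatches get
; the extra dump options, keeping the output folder manageable.
[ShaderOverrideTileCullingCS]
hash = 1234567890abcdef
analyse_options = dump_cb txt
```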
Ah man, the game looks absolutely fantastic now (when I get lucky and the tiles are computed properly), since the lighting is fixed. If we could only fix this problem, we could have a proper fix in place ^_^
And finally, another question regarding 3DMigoto: is it possible to dump only the shaders that are active in a scene? Just the ~20 PS/VS/CS that I see, instead of the whole suite of billions of shaders that are sent to the GPU? ^_^
Edit:
I managed to find one CS shader from BF1 (Frostbite 3 engine) that is 95% the same! Here all the buffers are defined ;)
I made a paste here as it is huge:
https://pastebin.com/WirK72GU
Manually trying to "hack" it by commenting stuff out didn't work for me :( This one really needs a precise fix ;)
Big thank you again for looking into it!
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com
(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)
PS: Sorry for the offtopic.
PS2: Thanks bo3b for the help ;))
i7 4970k@4.5Ghz, SLI GTX1080Ti Aorus Gigabyte Xtreme, 16GB G Skill 2400hrz, 3*PG258Q in 3D surround.
I have this CS:
I press F8 and I get the log. In the log I can only find one line related to this CS hash:
I was under the impression that I could see what "feeds" this CS (what other shader feeds this one, I mean).
Does anyone know where I can find that info?
Thank you;)
In fact, I see over 200 shaders in the loading screens, which makes me think "UseRenderedShaders = true" isn't working either. I also ran into this dumping problem in Resident Evil 4 (which I forgot to report).
The log has content, and clearly it's reading the DX9Settings.ini file because "UseExtInterfaceOnly = true" worked to enable shader hunting.
Edit: I'm on Windows 10, but that problem in RE4 was already there in W7.
CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: MSI GeForce RTX 2080Ti Gaming X Trio
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com
Ah yes, I've seen this problem before. I was able to navigate the shaders, but I couldn't stick to only the rendered ones; it would always pick up the whole chain. I think this is a problem with the dx9ex variant. Sadly, I don't know a workaround... :(
Oh, try a different version of the DLL. There are several builds without proper version numbers, so try all of them until you get one that works :( I know it sucks, but...
Thanks for the advice. I have downloaded the debug DLLs from most recent to oldest from here: http://wiki.bo3b.net/index.php?title=HelixMod_Feature_List
The first and second ones load the hunting OSD, but nothing can be dumped. The third one doesn't display the OSD (so no dump either), and starting from the fourth, the game can't boot and Windows shows a d3d9ex error.
I have kept the DX9Settings.ini in its simplest form and still nothing :(. Shit, is this unfixable?
It might be the same engine; it's the same developer.
https://forums.geforce.com/default/topic/668043/3d-vision/metal-gear-rising-3d-vision/1/
"Game is just about playable out of the box, but there are significant haloing effects on just about everything (for shadows, fog, smoke, and fire)"
Sound familiar?
http://helixmod.blogspot.com/2014/01/metal-gear-rising-revengeance-3d-vision.html
That worked! I'm using the latest d3d9.dll, but with this configuration:
Something about that Metal Gear Rising configuration makes dumping work. Off to work on the fix, then!
Awesome stuff! And great work D-Man11 for pointing that one out! ;)
Bump ^_^
Bump 2 ^_^
When you use Frame Analysis and press F8, a folder is created. Inside that folder there is always a ShaderUsage.txt that shows the relations between the analysed shaders.
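When ShaderUsage.txt gets big, a quick scan for every line mentioning a given hash helps trace those relations. A small sketch; the sample layout in the test is only an assumption (it varies between builds), and CS entries may not appear at all:

```python
def find_hash_lines(usage_text: str, shader_hash: str):
    """Return every line of a ShaderUsage.txt dump mentioning shader_hash.

    Matching is case-insensitive, since hash casing differs between tools.
    """
    needle = shader_hash.lower()
    return [line for line in usage_text.splitlines()
            if needle in line.lower()]
```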
MY WEB
Helix Mod - Making 3D Better
My 3D Screenshot Gallery
Like my fixes? you can donate to Paypal: dhr.donation@gmail.com
Yes, but I only found relations between the PS and VS, and nothing about Compute Shaders :-s. Unless I am missing something in the configuration file?
For the one-eye problem here with ME:A, it's probably something related to them cheating on the graphics, so that the driver does not understand what is happening. An example of that is drawing into a 2D texture that DX11 did not create, so the driver never sees it; and if the driver never sees it, it cannot double it for the other eye.
Since they've gone all-in on drawing everything with Compute Shaders, it's going to be a tough job for Automatic to see anything. The counter-argument is that wink-out effect, where it works for a bit, then cuts out. That suggests a different problem, but it could happen if they recycle buffers, or any number of other things.
As a starting spot, try to narrow down the texture/render target and see where it gets created. If you can narrow it down to a specific texture hash, we can jack with it in multiple ways.