[quote="Qwinn"][quote="Volnaiskra"]Qwinn, have you tried 3dmigoto? Or have you not because you're on Windows 10?[/quote]I haven't tried it, no, because I'm on Windows 10 and older drivers don't work on 980Ti's. Is there any chance it could fix anything for me, whether compatibility mode or not? I was told the funky text issues were pretty much unavoidable in Compatibility mode.[/quote]
Nope, there's no chance this will fix it; the Win10 problem is caused by the NVIDIA driver, which we cannot fix.
If you want to play this game right now (and you should, because it's pretty much the best game fix we've made for one of the best games we've ever seen), you need to dual-boot Win7. Don't shortchange yourself by using CM; that's completely bogus technology when a true fix exists.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
Thank you very much Helifx for your patch, so far I have noticed no problems. Btw, new expansion is really good. I haven't finished it yet, but so far I had a lot of fun and laugh (especially during wedding :) )
Thank you very much Helifx for your patch, so far I have noticed no problems. Btw, new expansion is really good. I haven't finished it yet, but so far I had a lot of fun and laugh (especially during wedding :) )
I'm so happy that I managed to get Win7 Ultimate installed from my retail disc. No OEM licence for me - I like being able to move it from computer to computer. It was really problematic, though: the first 5 times I tried, the installation failed.
Thanks to everybody using my assembler - it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?
[quote="Volnaiskra"]Does the 3dmigoto fix place HUD markers (eg. NPC names, enemy health bars, etc.) at the correct depth on a per-object basis? Or are they tied to the rest of the UI?
The reason I ask is because I've fooled around (unsuccessfully) with trying to get the 3D fix to work on Windows 10. I'm now considering rolling back to Windows 7. But imperfect UI is a pet peeve of mine, so I'm not sure if it's worth it for me personally.[/quote]
Bump
Sorry to be so insistent and bump this question, but I'd like to know this. I'm on Win 10, and I've attempted dual-booting Win 7 but without success (my win 7 CD didn't want to install on any of my available hard disks - couldn't figure out why; it's annoying, because I've got about 4 of the fuckers). I can keep trying, though I'd like to know what I'm shooting for.
I have 980ti SLI now by the way - not sure if that makes a difference in terms of drivers. Ironically, I bought them partly so I could enjoy 3D in Witcher 3. Meanwhile, my Win7 installation shat itself, so I clean installed Windows 10, thinking I'd have a perfect new OS to play Witcher 3 on...but then I learned of the Win 10 problems :(
Playing this game in 2D is obviously lukewarm, graphically speaking. I really wish I could have 3D. But at least I'm enjoying silky 60fps and pure unadulterated graphics. Is the 3D experience without meaningful compromises, or does it involve some glitches (imprecise name markers, bad godrays, etc.) and/or screwing around?
It's tied to the rest of the HUD, really. But there are various depth presets you can set. I don't think it's an issue in this case - the fix is practically 3D-ready. The trick is to use convergence and depth properly with your HUD depth setting. Most games don't deal with this correctly. (The only one I can really think of that does is GW2, complete with a 3D cursor.)
I couldn't bear to play this game in 2D. After a short time playing you won't even think about the UI.
i7-4790K CPU 4.8Ghz stable overclock.
16 GB RAM Corsair
ASUS Turbo 2080TI
Samsung SSD 840Pro
ASUS Z97-WS
3D Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)
Obutto R3volution.
Windows 10 Pro x64 (Windows 7 dual boot)
There is also some experimental support to automatically adjust the UI in this game based on the depth buffer, which can be enabled by editing the d3dx.ini - uncomment the section of UI shaders and set z=1 or z=2. It doesn't work very well however - with z=1 UI elements can get split up and some of them use the wrong part of the depth buffer for their depth, and z=2 looks a lot like the UI does in CM.
Some of the features I have planned for 3DMigoto have the potential to be able to improve this situation, but I don't want to get anyone's hopes up - the HUD in this game is particularly tricky.
I'd recommend just using the ~ key to cycle UI depth as needed.
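For anyone who wants to experiment with that, here's a rough sketch of what the edit might look like - assuming the fix ships those UI [ShaderOverride] sections commented out in its d3dx.ini; the hash below is just a placeholder, not a real shader from the fix:
[code]
; Hypothetical example only - in practice you would uncomment one of the
; UI shader sections already present in the fix's d3dx.ini rather than
; adding a new one.
[ShaderOverrideUIExample]
Hash = 0123456789abcdef
; 1 = take UI element depth from the depth buffer, 2 = CM-style
z = 1
[/code]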
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
Thanks for the answers, guys. I think I might stick to 2D for now. The 2D HUD markers in Shadow of Mordor actually bothered me throughout the entire playthrough, so I fear they might do the same here.
Normally it'd be worth a shot anyway, of course. But in this case it doesn't look like I'd manage to even get Windows 7 installed without a lot of troubleshooting. So I think I'll save myself that headache.
It's a shame, because I can tell by the extensive list of things that you have fixed that your shaderhacker magic is as impressive as ever.
[quote="Volnaiskra"]Thanks for the answers, guys. I think I might stick to 2D for now. The 2D HUD markers in Shadow of Mordor actually bothered me throughout the entire playthrough, so I fear they might do the same here.
Normally it'd be worth a shot anyway, of course. But in this case it doesn't look like I'd manage to even get Windows 7 installed without a lot of troubleshooting. So I think I'll save myself that headache.
It's a shame, because I can tell by the extensive list of things that you have fixed that your shaderhacker magic is as impressive as ever.
[/quote]
I can understand why the HUD/UI might have infuriated you (like it did me) in Shadow of Mordor. I took a look at the UI as well (in Mordor) and it is very hard to get a "normal" behaviour out of it.
HOWEVER, this is not the case here. I actually had to start the game just now to look at the UI ;)) I couldn't find anything as obtrusive as it was in Mordor ;)
I would defo give Witcher 3 a SHOT in 3D Vision (it is marvellous).
Regarding W7: I wouldn't EVEN BOTHER with DUAL BOOT!
I would just disable the Win10 HDD in the BIOS, leave the other HDD in place, boot the Win7 DVD and install it on that HDD, then re-enable the Win10 HDD in the BIOS. Use the BIOS boot selector (normally F8 while POST-ing) and select the Win7 HDD whenever you want to boot into it ;)
1x Palit RTX 2080Ti Pro Gaming OC (watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
The Witcher3 3D fix is very likely our best fix of all time. It fixes more and makes the game truly stunning.
You guys not being able to play because of Win10 makes me... sad.
[quote="helifax"]
Regarding W7: I wouldn't EVEN BOTHER with DUAL BOOT!
I would just disable the Win10 HDD in the BIOS, leave the other HDD in place, boot the Win7 DVD and install it on that HDD, then re-enable the Win10 HDD in the BIOS. Use the BIOS boot selector (normally F8 while POST-ing) and select the Win7 HDD whenever you want to boot into it ;)
[/quote]
That's exactly what I tried. But when it came time to select which drive to install Windows 7 on, the Windows installer refused to install on any of the ones I had available. All it gave me was some vague error message. I looked up the error code online, but all I could find was that it was some sort of "general error" message. :/
The sad irony is that the reason I installed Windows 10 is because I couldn't play Witcher 3 on Windows 7!
My Win 7 installation shat itself last week. I was getting various crashes, and many games wouldn't start, including Witcher 2 and Witcher 3. So I figured it was time to reinstall Windows... and I thought I might as well give Win 10 a try. :/
If it's not like Shadow of Mordor then that's different.
One last question: does the fix tend to give any of you instability? I'm already getting a few crashes in 2D.
I wonder if I should go back and study the problem that StereoFlagsDX10 solves in more detail and see if there's an alternate solution that would work on Win10... I had a working theory when we originally hit this in FC4, but didn't have enough information to be able to solve it at the time. Nowadays we have frame analysis, which might be able to give me the info I need...
However, this is not going to be a high priority for me since we have a workable solution - my backlog is too full as it is.
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
Totally understand that Win10 shouldn't be a priority for a working fix. Though I guess you guys are going to have to bite the Windows 10 bullet sooner or later.
After much screwing around in my BIOS, I finally got win7 to install. So I'm going to try the fix tonight.
[quote="DarkStarSword"]I wonder if I should go back and study the problem that StereoFlagsDX10 solves in more detail and see if there's an alternate solution that would work on Win10... I had a working theory when we originally hit this in FC4, but didn't have enough information to be able to solve it at the time. Nowadays we have frame analysis, which might be able to give me the info I need...
However, this is not going to be a high priority for me since we have a workable solution - my backlog is too full as it is.[/quote]
That is actually an interesting approach and I thought about it myself :) I have never really used frame analysis, though I remember trying it one time ;) I had no idea what to do after it dumped all the 30GB of data :))
What I think the StereoFlagsDX10 flag does is control which surfaces/textures are stereoized, and probably which framebuffers ;)
For example, in some games we see stuff rendered in one eye but not the other. Changing this flag will make things render in both eyes. Logically, you would assume it tells the driver that those textures/FBOs need to be rendered TWICE (once per eye) rather than ONCE (before the frame swap):
- 2D Rendering : Draw, Swap/Present and get ready for next frame
- 3D Rendering : Draw Left, Draw Right, Swap/Present and get ready for next frame
This is the reason we always see them ONLY in the right eye (never in the left eye only).
The same with Witcher 3 :) Left eye => white, right eye => correct. If you look closely, in the left eye you can kinda see a "variance" of the depth buffer rather than the final colour FBO being drawn. So I expect it skips everything for the left eye and only draws it in the right?
In any case, Nvidia is now aware, and there is a bug in their official DB regarding this issue. When/IF it will get fixed... I have no idea ;) But we both now believe it has something to do with WDDM 2.0, as it works perfectly fine in WDDM 1.3 (I guess this wasn't thoroughly tested when the Win10 drivers were made).
Still, DarkStarSword, if you are interested in pursuing this I am more than happy to help ;)
1x Palit RTX 2080Ti Pro Gaming OC (watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
[quote="helifax"]That is actually an interesting approach and I thought about it myself:) I have never used the Frame Analyses, but I remember trying it one time;) I had no idea what to do after it dumped all all the 30Gb of data:))[/quote]Throw away 95% of it and examine the rest to find any that show a broken or mono effect :-p
I added the ability to use shaders as triggers (put analyse_options in a [ShaderOverride] with the persist keyword) to change the options mid-frame, so you can focus on a specific shader, or find one that runs just before the interesting part of the frame. Finding one that runs at the start of post-processing is useful, as it means you can skip dumping the opaque draw calls, which take up the vast majority of the frame and are almost always fine. I usually do this by doing one pass with dump_rt or dump_rt_jps so I don't have to wait too long, then on the next pass I'll have the triggers set up to dump dds files or whatever I need.
For analysing depth targets you will need to dump them out as .dds files (saving as .jps doesn't work for depth targets yet), and find something that can view them (I've got my own dds-info.py script on github that can convert some of them to .png).
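To make that concrete, here's roughly what such a trigger could look like in d3dx.ini (placeholder hash, and assuming hunting/frame analysis is already enabled in the fix's d3dx.ini):
[code]
; Hypothetical second-pass setup: once this shader runs, frame analysis
; persistently switches to dumping textures and render targets as .dds for
; the rest of the frame. The hash is a placeholder - it would be whatever
; shader you found running at the start of post-processing.
[ShaderOverridePostProcessingStart]
Hash = 0123456789abcdef
analyse_options = dump_tex_dds dump_rt_dds persist
[/code]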
[quote]What I think the StereoFlagsDX10 flag does is control which surfaces/textures are stereoized, and probably which framebuffers ;)[/quote]
I think there's a bit more to it than that - from the docs nvidia sent us we know that StereoTextureEnable is the primary setting responsible for that, but we never got docs for StereoFlagsDX10 so we don't know exactly what it does. You might be thinking along the right lines - at a complete guess I suspect it sets heuristics for how to treat resources based on the flags structs (e.g. bind flags, usage, cpu access flags, mip levels, array size, misc flags, etc) in the various CreateXXX() routines that didn't exist in DX9. It would not surprise me if the specific setting that helps tells it to create an intermediate resource (e.g. one with no bind flags, or a staging texture) in stereo, as we would not see these types of resources used in a shader directly, which could explain why I never found the resource in ShaderUsage.txt.
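Purely as background for anyone following along (this is standard D3D11, not anything from the driver docs), the creation-time flags in question look something like the hypothetical snippet below. An intermediate staging texture like this one is never bound to a shader stage, which would be consistent with such a resource never showing up in ShaderUsage.txt:
[code]
// Illustrative D3D11 sketch, not taken from the game or the driver: the
// creation-time fields a driver heuristic could key off live in structs
// like D3D11_TEXTURE2D_DESC passed to ID3D11Device::CreateTexture2D().
#include <d3d11.h>

D3D11_TEXTURE2D_DESC desc = {};
desc.Width              = 1920;
desc.Height             = 1080;
desc.MipLevels          = 1;
desc.ArraySize          = 1;
desc.Format             = DXGI_FORMAT_R24G8_TYPELESS;
desc.SampleDesc.Count   = 1;
desc.SampleDesc.Quality = 0;
desc.Usage              = D3D11_USAGE_STAGING;    // CPU-readable intermediate
desc.BindFlags          = 0;                      // never bound to any shader stage
desc.CPUAccessFlags     = D3D11_CPU_ACCESS_READ;
desc.MiscFlags          = 0;

// device->CreateTexture2D(&desc, nullptr, &stagingTex);
[/code]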
[quote]For example, in some games we see stuff rendered in one eye but not the other. Changing this flag will make things render in both eyes. Logically, you would assume it tells the driver that those textures/FBOs need to be rendered TWICE (once per eye) rather than ONCE (before the frame swap):
- 2D Rendering : Draw, Swap/Present and get ready for next frame
- 3D Rendering : Draw Left, Draw Right, Swap/Present and get ready for next frame
This is the reason we always see them ONLY in the right eye (never in the left eye only).
The same with Witcher 3 :) Left eye => white, right eye => correct. If you look closely, in the left eye you can kinda see a "variance" of the depth buffer rather than the final colour FBO being drawn. So I expect it skips everything for the left eye and only draws it in the right?[/quote]
So, we know that there are a few different things that can happen which result in a mono effect:
- A resource (texture, render target, etc) was created mono that should have been stereo. Once identified through frame analysis or ShaderUsage.txt it is easily fixed with a [TextureOverride] section (see the sketch after this list).
- A shader was only run once instead of twice, so regardless of whether its output resources are stereo or mono it will only ever render one perspective (e.g. the shader that resolves anti-aliased hairworks in Witcher 3). [ShaderOverride]fake_o0=1 may be able to solve these (depending on the heuristic).
- A shader is run twice and the output is stereo, but the stereo correction formula is not applied (typical of UI elements, not relevant here)
- A shader is run twice and the output is stereo, but the stereo correction formula is applied inconsistently (e.g. based on the depth buffer being inconsistently enabled, or position overlapping the edge of the screen)
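For illustration, hypothetical d3dx.ini sketches of the first two fixes mentioned in that list - both hashes are made-up placeholders, not hashes from this game. The TextureOverride route forces the surface to stereo via StereoMode, and the ShaderOverride route uses the fake_o0 setting mentioned above:
[code]
; Force a resource that was created mono to be created in stereo instead
[TextureOverrideMonoBuffer]
Hash = 01234567
StereoMode = 2

; Nudge the driver heuristic so a shader that only ran once gets run per-eye
[ShaderOverrideMonoEffect]
Hash = 0123456789abcdef
fake_o0 = 1
[/code]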
[quote]In any case, Nvidia is now aware, and there is a bug in their official DB regarding this issue. When/IF it will get fixed... I have no idea ;) But we both now believe it has something to do with WDDM 2.0, as it works perfectly fine in WDDM 1.3 (I guess this wasn't thoroughly tested when the Win10 drivers were made).[/quote]
Good to know :)
[quote]Still, DarkStarSword, if you are interested in pursuing this I am more than happy to help ;)[/quote]
So, in FC4 we had a fairly typical scenario (note that the problem in FC4 was related to the depth buffers - it's not clear that is the case in Witcher 3):
1. Opaque objects drawn with depth buffer enabled. Depth buffer was stereo.
2. The depth buffer was transferred to a shader resource (AKA texture) for transparent effects and postprocessing effects. It was somewhere in this process that the stereo information from the depth buffer was lost. The details of exactly how this copy occurred were fuzzy and ShaderUsage.txt was confusing - was it copied using CopyResource()? Rendered to a render target using a dedicated shader? Unbound from the depth buffer and rebound elsewhere? Mapped onto the CPU and copied there? Maybe one of the NDA nvapi calls that can do this was used?
3. Transparent effects used the depth information in the new shader resource to calculate opacity. The depth information was mono by this stage, only containing information for one eye (right IIRC), which caused opacity to be incorrect in one eye.
The obvious solution of forcing the final depth texture to stereo did not work - without doing this the depth information from the right eye was used in the left eye, but with this the depth information was missing from the left eye instead. This suggests that the information for the second eye had already been lost, either before it was copied to this resource, or during the copy (like the problem with anti-aliased hairworks in Witcher 3).
Frame analysis should make it very obvious where the stereo information was lost - it uses the nvapi reverse stereo blit, which allows it to get the information for both eyes. This means if we can find the copy in the dump using "analyse_options = dump_tex_dds dump_rt_dds" we should be able to see the input depth buffer in stereo, with two different perspectives, and the output in mono - either with the same perspective in both eyes (resource was probably created mono), or one eye blank (resource was probably created stereo, but missing one eye). I've already used this approach to work out what was wrong with anti-aliased hairworks in Witcher 3 and the post-processed bloom in Mad Max and some other things, so it can definitely work. Not all the scenarios I mentioned in point 2 are currently analysed - it will catch it if a shader is used for the copy, but will miss it if CopyResource() or one of the other possibilities was used - I may need to extend the feature if that is the case, but that should be relatively straightforward.
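In case anyone wants to try this themselves, those options can also be set globally rather than via a trigger shader - a minimal sketch, assuming hunting is enabled in the fix's d3dx.ini:
[code]
; Default frame analysis options go in the [Hunting] section; the dump is
; then triggered in-game with the frame analysis key while hunting is on.
[Hunting]
hunting = 1
analyse_options = dump_tex_dds dump_rt_dds
[/code]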
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
I need to double check a few things, but frame analysis seems to indicate that the issue in Witcher 3 with StereoFlagsDX10=0 first occurs when compute shader 4e7b1b2f8c99e97a is run. It's writing to UAV c77776b9, which is stereo, so I'm guessing the issue is that compute shaders are only run for one eye without that flag, and I can't think of any way to influence that heuristic outside of the setting.
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit