[quote="mindw0rk"]This game has style that fits great for 3d but I couldnt turn 3d on. Resolution choice only allows 60hz. How do I use 3D-Hubplay? [/quote]
You must assign the game exe to the 3D-Hub Player profile using NVIDIA Inspector.
[quote="mindw0rk"]I did but this didnt work for me. 3D didnt activate[/quote]
I had the same problem before assigning the game exe to the 3D-Hub Player profile. The reason it works for me is probably my EDID override to the Samsung UN55HU9000; I've found that what works fine for me doesn't always work for everyone.
The game uses the Unity engine, which always has full-screen problems.
Gigabyte Z370 Gaming 7 32GB Ram i9-9900K GigaByte Aorus Extreme Gaming 2080TI (single) Game Blaster Z Windows 10 X64 build #17763.195 Define R6 Blackout Case Corsair H110i GTX Sandisk 1TB (OS) SanDisk 2TB SSD (Games) Seagate EXOs 8 and 12 TB drives Samsung UN46c7000 HD TV Samsung UN55HU9000 UHD TV. Currently using ACER PASSIVE EDID override on 3D TVs. LG 55
mindw0rk - I couldn't get 3D to activate, as the Unity engine wouldn't go into fullscreen. Adding the command line option "-window-mode exclusive" to the Steam launch options fixed this ;) Game looks great in 3D.
[quote="TonyBirt"]mindw0rk - I Couldn't get 3D to activate as Unity engine wouldn't go into fullscreen. Adding command line "-window-mode exclusive" to Steam launch options fixed this ;) Game looks great in 3D.[/quote]
Thanks, that got it working for me.
The game actually has broken lighting effects. I tried running DarkStarSword's Unity fix template on the game, and it surprisingly didn't help, but it looks like it's only a couple of broken shaders, so I might do a manual fix on the game.
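For context, a manual fix for broken lighting in these games usually means applying (or undoing) the standard 3D Vision stereo correction on the position a shader uses for its lighting lookups. A minimal sketch of that formula in Python — the function names and sample values are mine, not from this game's shaders:

```python
def stereo_correct(x, w, separation, convergence):
    """Standard 3D Vision stereo correction: shift the clip-space X
    coordinate by separation * (depth - convergence), where depth is
    the W component of the clip-space position."""
    return x + separation * (w - convergence)

def stereo_uncorrect(x, w, separation, convergence):
    """Undoing the driver's correction (what a lighting fix typically
    does before a world-space reconstruction) flips the sign."""
    return x - separation * (w - convergence)

print(stereo_correct(0.5, 10.0, 0.02, 1.5))  # approximately 0.67
```

The real fix reads separation and convergence from the driver's stereo parameter texture rather than hardcoding them; the arithmetic is the same.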
3D Gaming Rig: CPU: i7 7700K @ 4.9Ghz | Mobo: Asus Maximus Hero VIII | RAM: Corsair Dominator 16GB | GPU: 2 x GTX 1080 Ti SLI | 3xSSDs for OS and Apps, 2 x HDD's for 11GB storage | PSU: Seasonic X-1250 M2| Case: Corsair C70 | Cooling: Corsair H115i Hydro cooler | Displays: Asus PG278QR, BenQ XL2420TX & BenQ HT1075 | OS: Windows 10 Pro + Windows 7 dual boot
I started having a look at this one, but it uses custom shaders that we can neither decompile nor assemble with our current tools - kind of the excuse I was looking for to start work on rewriting the assembler, even though that approach isn't going to be the fastest way to fix this.
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
What are the HLSL glitches? If it's something that might be straightforward I can prioritize a look at them, especially if it might come up again in other Unity games.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
These may not be the shaders we actually need to fix - the code doesn't look right for lighting so I need to confirm if they are the right ones with frame analysis, but they were the likely prospects from hunting and were giving me trouble.
Thanks for those examples.
The Atomic_Or is easy to fix, and a bit of an odd oversight since I already added a bunch of other atomic ops.
For the VS with StructuredBuffers, that is problematic: with no header information, I can't find out what the structured buffer (usually a struct) looks like, or how big it is.
In this case t0 and t1 are basically undefined, so making a general solution there is not clear to me at the moment.
I'll take a look at hand-fixing the VS to start, to see if there is something reasonable that might generalize.
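The atomic op handling described above essentially comes down to a lookup from SM5 assembly opcodes to HLSL Interlocked* intrinsics. A rough sketch of such a table in Python — the exact opcode set and operand rewriting in the real decompiler will differ; this only illustrates the name mapping, including the missing atomic_or case:

```python
# Sketch: mapping Shader Model 5 atomic opcodes to HLSL intrinsics.
# The real decompiler also rewrites operands (destination UAV, value
# register); this table only covers the opcode-to-intrinsic step.
ATOMIC_INTRINSICS = {
    "atomic_and":  "InterlockedAnd",
    "atomic_or":   "InterlockedOr",   # the oversight mentioned above
    "atomic_xor":  "InterlockedXor",
    "atomic_iadd": "InterlockedAdd",
    "atomic_imax": "InterlockedMax",
    "atomic_imin": "InterlockedMin",
    "atomic_umax": "InterlockedMax",
    "atomic_umin": "InterlockedMin",
}

def translate_atomic(opcode: str) -> str:
    try:
        return ATOMIC_INTRINSICS[opcode]
    except KeyError:
        raise ValueError(f"unhandled atomic opcode: {opcode}")

print(translate_atomic("atomic_or"))  # InterlockedOr
```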
Another approach for Unity might be to recreate the RDEF section in the binary shader from the Unity headers before running them through the decompiler - that is something I could potentially do in my scripts (and I have the RDEF format partially decoded, but not complete yet), but I haven't needed that so far because the scripts directly use the Unity headers and for the most part are now working on the assembly versions of the shaders.
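For reference, chunks in a DXBC shader container are just a four-character tag followed by a little-endian uint32 size and the payload, so splicing a rebuilt RDEF chunk into a binary shader is mostly byte bookkeeping. A hedged Python sketch of wrapping a payload as a chunk — the RDEF payload itself (the part that's only partially decoded) is left as an opaque placeholder blob here:

```python
import struct

def make_chunk(fourcc: bytes, payload: bytes) -> bytes:
    """Wrap a payload as a DXBC chunk: 4-byte tag, uint32 LE size, data."""
    assert len(fourcc) == 4
    return fourcc + struct.pack("<I", len(payload)) + payload

# Placeholder: a real RDEF payload would hold the resource definitions
# reconstructed from the Unity headers, which is the hard part.
rdef_payload = b"\x00" * 32
chunk = make_chunk(b"RDEF", rdef_payload)
print(len(chunk))  # 40 = 4 (tag) + 4 (size) + 32 (payload)
```

A full implementation would also need to update the container header's chunk count and offsets and recompute the DXBC checksum after inserting the chunk.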
Re-read your comment - yeah, the t0 and t1 buffers would be tricky to come up with a general solution for. In this case I don't believe they are structured though, even though the compiler produced code that suggests that - s_Indices is clearly a typed buffer of integers, and s_Vertices is clearly a typed buffer of float4s... but there's no easy way to determine that programmatically, as you need to look at the instructions to see how each is accessed.
The vertex shader might be ok with the assembler though - I think it was only the pixel shader giving me trouble there. I don't think it's worth spending time fixing up the vertex shader unless we are fairly sure we need to patch it (e.g. the auto halo fix on it made zero difference).
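One crude way to make that guess programmatically is to scan the disassembly for how each t# register is actually accessed: ld_structured implies a structured buffer, while a plain ld implies a typed Buffer. A toy Python sketch — the instruction spellings are simplified samples and the regex ignores operand swizzles, so treat this as an illustration of the heuristic, not a parser:

```python
import re

def guess_buffer_kinds(asm: str) -> dict:
    """Classify each tN slot by the load instruction used on it."""
    kinds = {}
    for line in asm.splitlines():
        # Match the load opcode and the last tN operand on the line.
        m = re.search(r"\b(ld_structured|ld)\S*\s.*\bt(\d+)", line)
        if m:
            op, slot = m.group(1), int(m.group(2))
            kinds.setdefault(slot, "structured" if op == "ld_structured" else "typed")
    return kinds

asm = """
ld_structured_indexable(structured_buffer, stride=4)(mixed,mixed,mixed,mixed) r0.x, v0.x, l(0), t0.xxxx
ld_indexable(buffer)(float,float,float,float) r1.xyzw, r0.xxxx, t1.xyzw
"""
print(guess_buffer_kinds(asm))  # {0: 'structured', 1: 'typed'}
```

This matches the observation above: the access pattern, not the declaration, is what reveals whether a headerless buffer is typed or structured.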
[quote="DarkStarSword"]Re-read your comment - yeah, the t0 and t1 buffers would be tricky to come up with a general solution. In this case I don't believe they are structured though, even though the compiler produced code that suggests that - s_Indices is clearly a typed buffer of integers, and s_Vertices is clearly a typed buffer of float4s.... but there's no easy way to determine that programmatically as you need to look at the instructions to see how it is accessed.
The vertex shader might be ok with the assembler though - I think it was only the pixel shader giving me trouble there. I don't think it's worth spending time fixing up the vertex shader unless we are fairly sure we need to patch it (e.g. the auto halo fix on it made zero difference).[/quote]
The latest version is shipped as 1.2.42, and I added all the atomic_* opcodes, with manual fixes possibly still expected. The shaders should at least generate HLSL code now. I added the most likely matching intrinsic, but I'm not sure it's right.
For the VS, I didn't see any good way to generalize that, but if we run into this more, I'll take a deeper look. (Also, we can always special-case a fix for something as common as Unity 5 shaders.)
http://store.steampowered.com/app/460700/
i5 2500K/16gb/GTX 970/Asus VG278H + Sony HMZ-T1
The pixel shader fails to decompile:
The assembler produces garbage output from the dcl_input_ps_siv line and the use of signed integers, which causes DirectX to hang.
Vertex shader needs some manual fixups:
Alienware M17x R4 w/ built in 3D, Intel i7 3740QM, GTX 680m 2GB, 16GB DDR3 1600MHz RAM, Win7 64bit, 1TB SSD, 1TB HDD, 750GB HDD
Pre-release 3D fixes, shadertool.py and other goodies: http://github.com/DarkStarSword/3d-fixes
Support me on Patreon: https://www.patreon.com/DarkStarSword or PayPal: https://www.paypal.me/DarkStarSword