There's a modding tool available in the Unity Store; I'm not sure if it will be useful for anything.
https://forum.unity3d.com/threads/now-released-umod-2-0-modding-support-made-easy.386818/
http://umod.trivialinteractive.co.uk/
https://forums.geforce.com/default/topic/949459/3d-vision/inside/post/5046602/#5046602
The game can be forced to run in DX9, so it should be possible to make a HelixMod fix.
I'm not sure if 3Dmigoto works.
But making a fix takes a lot of time, and I'm still learning, so I won't be able to make one in the next few days.
Requesting a fix is easy; making one is hard. So be patient.
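For anyone who wants to experiment with the HelixMod route in the meantime: Unity standalone players of that era generally accept a command-line switch to force the DX9 renderer, either as a Steam launch option or directly on the exe (I haven't verified that INSIDE honours it, so treat this as a thing to try):

```
INSIDE.exe -force-d3d9
```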
Yeah, I'd wait to see what sgsrules comes up with. Last I spoke with him, he was planning to release his earlier DX11 fix, which is not perfect, but definitely playable. Probably in the next month or so.
Wouldn't hurt to look at the DX9 variant using HelixMod if you were motivated. No idea what the visual differences would be.
Our DirectMode approach was not working well on this game because of changes to the Unity plugin architecture, so it will probably only work for newer games.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
Thank you, sgsrules! You rule :p.
It doesn't seem to be forcing 0 separation for me. In fact, every time I boot the game I have to manually reduce it (the Nvidia one), and also tweak convergence a bit to correct its small issues. [b]Edit:[/b] booting the game with admin rights enabled your automatic 0 separation, so it was on my end. Even with that, tweaking the Nvidia convergence a bit helped at the settings I use.
Also, it looks like this fix reduces the internal fps by half (30fps per eye when RTSS reports 60fps). I solved it by forcing vsync off in the game profile and limiting fps to 120 (thankfully this isn't a 60fps capped game). This is probably using double the GPU performance it should (50-55% usage on my GTX 1080 at 1440p), something I feared when you said this was using your 3D on top of 3D Vision Automatic (something that sort of happens in Dolphin by default).
But anyway, the effects themselves are as perfect as they can be, judging by the first 40 seconds of gameplay I could try before going to bed :).
[quote="masterotaku"]
Also, it looks like this fix reduces the internal fps by half (30fps per eye when RTSS reports 60fps). I solved it by forcing vsync off in the game profile and limiting fps to 120 (thankfully this isn't a 60fps capped game). This is probably using double the GPU performance it should (50-55% usage on my GTX 1080 at 1440p), something I feared when you said this was using your 3D on top of 3D Vision Automatic (something that sort of happens in Dolphin by default).
But anyway, the effects themselves are as perfect as they can be, judging by the first 40 seconds of gameplay I could try before going to bed :).[/quote]
The fix works by making the engine update at 120Hz instead of 60Hz. It sequentially renders each eye to a separate buffer, and time is paused between the two eye renders to keep everything in sync. So the GPU has to do twice as much work, as does the CPU. There's also the added overhead of Nvidia's 3D Vision Automatic mode doing its thing, so it's using more resources than it should need.
There shouldn't be a need to cap fps since the engine is doing that already. Also, vsync has to be disabled; otherwise it'll update at 60Hz, which makes it run at 30fps, like you mentioned.
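To make the timing clearer, here's a toy model of the approach described above (the names are mine for illustration, not from the actual fix): the engine steps at 120 updates/s, but game time only advances once per stereo pair, so both eyes of a pair see the same simulation state.

```python
def run_frames(display_frames, dt=1.0 / 60.0):
    """Simulate `display_frames` stereo frames at a 60fps game rate.

    The engine renders 120 times per second (two eye passes per frame),
    but game time is frozen for the right-eye pass so both eyes stay
    in sync.
    """
    game_time = 0.0
    renders = []  # (eye, game_time) pairs, in render order
    for _ in range(display_frames):
        game_time += dt                    # advance simulation once per pair
        renders.append(("L", game_time))   # left eye -> buffer A
        renders.append(("R", game_time))   # right eye -> buffer B, time paused
    return game_time, renders

total, renders = run_frames(60)  # one second of 60fps gameplay
```

One second of gameplay costs 120 engine renders (hence roughly double the GPU and CPU load), while every left/right pair shares a single timestamp.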
Thanks for the added input, masterotaku. I've updated the fix to v1.1, which just adds a few changes to the profile section and sets the convergence.
I'm hoping that I can ditch 3D Vision Automatic and use 3D Vision Direct for future versions of this method. I wanted to do it for this game, but INSIDE is running on Unity 5.0.4 and its native plugin API is a bit of a mess. Using 3D Vision Direct will be a lot more efficient.
Like my work? You can send a donation via Paypal to sgs.rules@gmail.com
Windows 7 Pro 64x - Nvidia Driver 398.82 - EVGA 980Ti SC - Optoma HD26 with Edid override - 3D Vision 2 - i7-8700K CPU at 5.0Ghz - ASROCK Z370 Ext 4 Motherboard - 32 GB RAM Corsair Vengeance - 512 GB Samsung SSD 850 Pro - Creative Sound Blaster Z
Thank you for all of your efforts with this sgsrules.
Running with an EDID override on a 60Hz passive display, I always have to pull the depth to zero when I start, which isn't really a hassle - just noting for anyone else who runs it the same way. I'm not sure my GPU would run it at 120 Hz.
What convergence is recommended in the Nvidia settings? Adjusting with the hotkeys works fine for me, but I also find that, depending on where the Nvidia convergence is set, there can be a light shimmering effect on the outlines of the player character.
[quote="ummester"]Thank you for all of your efforts with this sgsrules.
Running with an EDID override on a 60Hz passive display, I always have to pull the depth to zero when I start, which isn't really a hassle - just noting for anyone else who runs it the same way. I'm not sure my GPU would run it at 120 Hz.
What convergence is recommended in the Nvidia settings? Adjusting with the hotkeys works fine for me, but I also find that, depending on where the Nvidia convergence is set, there can be a light shimmering effect on the outlines of the player character.[/quote]
I used a value of 10.589 for convergence. You shouldn't have to set it, since I added it to the profile section of the .ini.
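If anyone wants to change that value by hand, the relevant bit of the fix's d3dx.ini should look something like this (a sketch based on 3Dmigoto's profile section; check the exact key name against the file that shipped with the fix):

```ini
; Driver profile settings 3Dmigoto applies when the game launches
[Profile]
StereoConvergence = 10.589
```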
Desktop-PC
i7 870 @ 4.0GHz + MSI GTX1070 Gaming X + 16GB RAM + Win10 64Bit Home + AW2310+3D-Vision
CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: MSI GeForce RTX 2080Ti Gaming X Trio
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com
i7 2600k @4,2ghz
GTX 1070 G1 GAMING
DDR3 12gb
TV LG UF8500
Windows 10 64bits