No Man's Sky
  2 / 10    
Can't help it...
I am still trying to decide... either to laugh or cry...

http://www.metacritic.com/game/pc/no-mans-sky
(Score currently: 2.4 going down it seems...:( )

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

#16
Posted 08/12/2016 11:07 PM   
Just managed to start it with your tool, and I have pretty much the same experience as with Doom. I can see 3D when I'm moving sideways, but as far as I remember it wasn't the real thing - more like an illusion. But it looks nice :)




Edit:
Looks like that.
SKAUT said:Just managed to start it with your tool, and I have pretty much the same experience as with Doom. I can see 3D when I'm moving sideways, but as far as I remember it wasn't the real thing - more like an illusion. But it looks nice :)

Edit:
Looks like that.


There is definitely some depth there;))
(Moving sideways and getting a "fake 3D" effect means there is some "horizontal parallax". That happens if the raw FPS is not 120, sadly:( We can try to improve that a bit later on though;) But the best improvement is having 120 FPS raw, so 60 per eye;) )
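To make the "120 FPS raw, 60 per eye" point concrete, here is a rough back-of-the-envelope sketch in plain Python (my own illustration with assumed numbers, not anything from the wrapper): with active-shutter 3D the display alternates eyes at the refresh rate, so each eye only gets a fresh image when the game actually renders a new frame.

```python
# Rough illustration of why raw FPS matters for alternate-eye 3D output.
def per_eye_fps(raw_fps: float, refresh_hz: float = 120.0) -> float:
    """New images each eye sees per second, assuming alternate-eye output."""
    # At most `refresh_hz` eye-frames can be presented per second,
    # split evenly between the two eyes.
    presented = min(raw_fps, refresh_hz)
    return presented / 2.0

# Ideal case: 120 FPS raw on a 120Hz display -> 60 fresh frames per eye.
print(per_eye_fps(120.0))  # 60.0
# Below that, eyes start re-seeing stale frames, so perceived depth comes
# mostly from motion parallax - the "fake 3D" effect described above.
print(per_eye_fps(60.0))   # 30.0
```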

Which version of the wrapper are you using? The latest one?
If so, open the ini file and find this line

VertexStereoString1 = "gl_Position.x -= g_eye * g_eye_separation * (gl_Position.w - g_convergence);\n"


and replace it with
VertexStereoString1 = "gl_Position.x += g_eye * g_eye_separation * (gl_Position.w - g_convergence);\n"


(The only difference is the "+=" instead of "-=").
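For anyone curious what the sign flip actually changes, here is a quick sketch in plain Python (not the wrapper itself; the sample values are made up) of the stereo formula from that ini line. Flipping "-=" to "+=" mirrors the per-eye offsets, i.e. it swaps which way each eye's image is shifted:

```python
# Stereo offset applied to clip-space x, per the injected GLSL line.
def stereo_x(x: float, w: float, eye: float,
             separation: float, convergence: float, sign: int) -> float:
    """eye is -1.0 (left) or +1.0 (right); sign is -1 for '-=', +1 for '+='."""
    return x + sign * eye * separation * (w - convergence)

x, w, sep, conv = 0.0, 10.0, 0.1, 2.0
left_minus  = stereo_x(x, w, -1.0, sep, conv, sign=-1)  # "-=" variant
right_minus = stereo_x(x, w, +1.0, sep, conv, sign=-1)
left_plus   = stereo_x(x, w, -1.0, sep, conv, sign=+1)  # "+=" variant
right_plus  = stereo_x(x, w, +1.0, sep, conv, sign=+1)

# The "+=" variant produces exactly the mirrored offsets of the "-=" one,
# so if depth looks inverted with one sign, the other may fix it.
print(left_minus, right_minus)
print(left_plus, right_plus)
```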

See if you get better results;) For me the game just keeps on hanging at startup no matter what I do/try:((( So, I can't actually see it:( (Hope things will change soon ^_^)


#18
Posted 08/12/2016 11:28 PM   
I'm definitely looking forward to playing this game in 3D. Hope it goes well, and I wish you the best of luck! Also, that startup bug is annoying, but it eventually works after starting the game 20 times lol

#19
Posted 08/13/2016 12:03 AM   
Changing that line didn't make any improvement. The game is still the same. The thing is that I only get the best picture if I'm moving to the right, so it feels like some buffered frame is reproduced there, and that creates this 3D effect. Your tool generates proper convergence, so it looks good, but it doesn't do a thing if I'm standing still.
I had that with Doom, and I remember how much you had to change to make it look good, so this actually doesn't represent the quality of 3D in this game. I bet that when you get this properly sorted it will be a mess.
You have to do more than that, I think. Coincidentally, I have looked at helifax's videos today, in hopes of making 3D work in Hyperdimension Neptunia.

If what I understood is correct, you have to dump all the shaders and make them render in 3D manually. Or something like that.

I still haven't figured out shader hunting, even knowing the hotkeys. Ctrl++ crashes that game for me, but I don't even know if I'm navigating through shaders (there is no OSD).

CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: Gainward Phoenix 1080 GLH
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com

#21
Posted 08/13/2016 12:56 AM   
The "VertexStereoString" line is inserted automatically when the wrapper matches the string in "VertexInjectionPoint". Before that, you will also need to insert the required uniforms (like we do in DX11 with t120, t125). The uniforms are automatically inserted after the "UniformInjection" string.

In short, this replicates what 3D Vision Automatic does for 3D Vision;) (about 90%, as there is other "logic" in 3D Vision Automatic that I don't know). This should make the game render in 3D, but broken.

Then you need to find the individual shaders and fix them, or "un-correct" them.

Now, each OpenGL game uses different GLSL shaders, so you will have to dump the shaders, see what the "injection" points look like;) and modify the ini file accordingly;)
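The injection mechanism described above can be sketched as simple string surgery on the shader source. This is a simplified plain-Python illustration (the marker strings and helper names here are hypothetical, not the wrapper's real ini values): the uniforms are pasted after one marker, and the stereo correction after another.

```python
# Hypothetical stand-ins for the UniformInjection / VertexInjectionPoint idea.
UNIFORMS = ("uniform float g_eye, g_eye_separation, g_convergence;\n"
            "uniform float g_vertexEnabled;\n")
STEREO = ("gl_Position.x -= g_eye * g_eye_separation"
          " * (gl_Position.w - g_convergence);\n")

def inject(src: str, uniform_point: str, vertex_point: str) -> str:
    """Insert the uniform block and the stereo line right after their markers."""
    src = src.replace(uniform_point, uniform_point + UNIFORMS, 1)
    return src.replace(vertex_point, vertex_point + STEREO, 1)

shader = "#version 330\nvoid main() {\ngl_Position = vec4(0.0);\n}\n"
out = inject(shader, "#version 330\n", "gl_Position = vec4(0.0);\n")
print(out)
```

The real wrapper does this per-game via the ini file precisely because each engine's GLSL looks different, so the two markers have to be chosen by inspecting the dumped shaders.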

If you are able to dump the shaders and post one vertex shader here I can tell you what you need to modify in the ini file;)

To generate the shaders, set this in the ini file:
EnableShaderDump = true
.....
EnableDevMode = true


It should generate all the PS and VS shaders in a folder called DebugShaders;) Just open one vertex shader and paste the content here;) and I hope I will be able to help;)

I know it's not as easy as it is in DX11 with 3D Vision Automatic, as we first need to mimic that behaviour, but the fact that the game starts in 3D and you can see a mono image (while 3D Vision is running) is a HUGE step forward;) So, once we pass this milestone we are good;)


#22
Posted 08/13/2016 01:05 AM   
What if that folder doesn't appear? I'm going to bed now, but I'll post a log tomorrow. Although this isn't the appropriate thread for Hyperdimension Neptunia.


#23
Posted 08/13/2016 01:10 AM   
masterotaku said:What if that folder doesn't appear? I'm going to bed now, but I'll post a log tomorrow. Although this isn't the appropriate thread for Hyperdimension Neptunia.


Then it means one of 2 things;)

1. It's using legacy calls (try enabling Legacy Mode and see if that solves the issue with stereo 3D rendering).
2. It's using ARB shaders, and I have no idea how to handle those;( (ARB shaders were used briefly between the fixed and programmable GLSL pipelines, and were replaced by GLSL, but some games were coded using them:( If this is the case, there isn't anything I can really do:( as I don't know anything about the old ARB shader system.)


#24
Posted 08/13/2016 01:14 AM   
Allow me to chime in regarding Hyperdimension (sorry to threadjack, btw), because I've tried to get it working for quite some time, with some limited success.

- Enabling legacy mode did get 3D to kick in and pushed the screen into depth, but the image is still flat/2D when standing still; however, rotating the camera makes the game look 3D.
- Also, for some reason the framerate tanked to single digits.
- No DebugShaders folder gets created for me either.
- However, in all my attempts at this game over a long period of time, at one point (before I ever even managed to get 3D to kick in) I did actually manage to get a VS and a PS shader dumped somehow (couldn't tell you how), with next to nothing inside them; however, the PS does reference ARB. Here are the contents of each:

1_Vertex_115d091d.glsl
void main()
{
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position = gl_Vertex;
    gl_Position.x -= g_eye * g_eye_separation * (gl_Position.w - g_convergence);

    if (g_vertexEnabled < 1.0)
    {
        gl_Position = vec4(0.0);
    }
}


1_Pixel_48f26b20.glsl
#extension GL_ARB_texture_rectangle : enable
uniform sampler2DRect tex0;

void main()
{
gl_FragColor = vec4( texture2DRect(tex0, gl_TexCoord[0].st).rgb, 1.0);
}


And just for the sake of providing it, here's my log when I run the game now in its current state:
============
Welcome to OpenGL-3D Vision wrapper v.4.19.
============

Found NeptuniaReBirth1.exe with PID: 514
Setting up the Nvidia Profile Information

Nvidia Profile exe name: neptuniarebirth1.exe
3D Vision Profile value found and is already set!
Setting Successfully imported from Profile.nip
Game profile existing/updated with the 3D Vision flag !!!
Automatic Hooking Enabled !!!
Loading OpenGL Extensions....
Found max of 354 extensions
Extensions enumerated and retrieved ! (OGL 3.3 or later)
OpenGL extensions Retrieved !!!
Creating OpenGL/Direct3D bridge...

Window size: 1920, 1080
WGL NV_DX_interop extension supported on this platform
This operating system uses WDDM (it's Windows Vista or later)
Activated Direct3D 9x
Retrieved adapter display mode, format: 21
Our required formats are supported
Retrieved device capabilities
Hardware vertex processing is supported
NVAPI initialized
NVIDIA Single Screen is enabled.
NVIDIA SLI is Disabled.
Context is in Windowed Mode
Windows information for D3D device set.
Created device

Enabled OpenGL interop on device
Created NV stereo handle on device:
Activated stereo
Retrieved device back buffer
Created color texture(left)
Created color texture (right)

Retrieved color buffer from color texture (left)
Retrieved color buffer from color texture (right)
Created render target surface (stereo)
Inserted stereo header into render target surface (stereo)
Created depth/stencil surface
Created OpenGL color texture (left)
Created OpenGL color texture (right)
Created OpenGL depth/stencil texture
Bound render target surface to OpenGL texture (left)
Bound render target surface to OpenGL texture (right)
Bound depth/stencil surface to OpenGL render buffer
Created OpenGL frame buffer objects
---
3D Vision successfully initialized !
---
Starting Rendering:
---
Starting Resize Detection Thread
Starting NVAPI Thread

Disabling 3D Vision... Cleaning up and freeing the resources...
Application closed successfully !
END


So does the ARB reference mean that there's absolutely no hope? Or does anything I've described indicate otherwise?

One thing I'll mention is that Fairy Fencer F, also from InfiniFactory, is a DX-based game, and other than the UI it is flawless in 3D, so I'm kinda hopeful that if we could just get 3D to kick in properly on Neptunia it wouldn't require any actual shader fixing (and from what I've seen when rotating the camera, that looks to be the case).

3D Gaming Rig: CPU: i7 7700K @ 4.9Ghz | Mobo: Asus Maximus Hero VIII | RAM: Corsair Dominator 16GB | GPU: 2 x GTX 1080 Ti SLI | 3xSSDs for OS and Apps, 2 x HDD's for 11GB storage | PSU: Seasonic X-1250 M2| Case: Corsair C70 | Cooling: Corsair H115i Hydro cooler | Displays: Asus PG278QR, BenQ XL2420TX & BenQ HT1075 | OS: Windows 10 Pro + Windows 7 dual boot

Like my fixes? Donations can be made to: www.paypal.me/DShanz or rshannonca@gmail.com
Like electronic music? Check out: www.soundcloud.com/dj-ryan-king

#25
Posted 08/13/2016 04:05 AM   
How should I say this the easy way and avoid any confusion...

Let's see:
There are 2 main types of shaders (like in DX): assembler (ASM) and C-style source code (GLSL/HLSL).

The ASM shaders come in multiple variants:
- the ARB variant
- the CG variant (by Nvidia)

The C-style source code shaders also come in multiple variants:
- GLSL - standard OpenGL (or Core, as we call it)
- the ARB extension - an extension over the Core design that uses a different API as well.

The wrapper only intercepts the C-style source code shaders (both GLSL and the ARB extension ones); it does not deal with ASM shaders.

For the source code ones, I know how it works. Since you basically provide the source code and build the shaders at runtime, it was easy (but not trivial at all) to intercept, dump, modify, recompile, relink and re-create the shader program objects.
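The "intercept and dump" part for source-code shaders can be sketched roughly like this, in plain Python for illustration (the folder and file-naming scheme here just mimic the DebugShaders dump seen in the posts above; none of this is the wrapper's actual code): at shader-creation time you have the full GLSL text, so you can hash it and write it out before compiling.

```python
import os
import zlib

def dump_shader(src: str, kind: str, index: int,
                folder: str = "DebugShaders") -> str:
    """Write shader source to e.g. DebugShaders/1_Vertex_<crc32>.glsl."""
    os.makedirs(folder, exist_ok=True)
    # A content hash in the name lets you match a dumped file back to the
    # shader the game submitted, even across runs.
    name = f"{index}_{kind}_{zlib.crc32(src.encode()):08x}.glsl"
    path = os.path.join(folder, name)
    with open(path, "w") as f:
        f.write(src)
    return path

path = dump_shader("void main() { gl_Position = vec4(0.0); }", "Vertex", 1)
print(path)
```

This is exactly why pre-compiled ASM blobs are so much harder: there is no source text at this point to dump or edit.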

For the ASM-ARB ones I have no idea. Most likely (from what I've read around) the shaders are already in binary format, and even if I intercept the blob, I would need to de-compile it, modify it, recompile it, etc. Plus, the API for using them is different, and I can't find a clear answer on how it's supposed to work (it's been deprecated for more than 10 years and resources are scarce). Even if I found the shaders, I would still need to de-compile and re-compile them, but the problem is I don't know what compiler turned them into binaries, and there isn't exactly a written standard like there is for DX.

So, it could be doable in theory, but in practice... That's why I say there is no chance:)


#26
Posted 08/13/2016 11:19 AM   
helifax said:How should I say this the easy way and avoid any confusion...

Let's see:
There are 2 main types of shaders (like in DX): Assembler (ASM) and C Source code(GLSL/HLSL).

The ASM shaders are of multiple types:
- ARB variant
- CG variant (by Nvidia)

The C Source Code are of multiple types:
- GLSL - standard OpenGL (or Core as we call it)
- ARB extension - is an extension over the Core design and uses different API as well.

The wrapper only intercepts the C Source Code (both GLSL and ARB extension ones), but is not dealing with ASM shaders.

For the C Source Code ones I know how it works. Since you basically provide the source code and you build the shaders at runtime this was easy (but not trivial at all) to intercept/dump/modify, recompile, relink, re-create the shader program objects.

For ASM-ARB ones I have no idea. Most likely (from what I've read around) the shaders are already in bin format and even if I intercept the blob, I need to de-compile it, modify, recompile etc.. Plus the API for using them is different and I can't find a clear answer on how is supposed to work (is deprecated for like more than 10 years and resources are scarce). Even if I would find them, I would still need to de-compile them and re-compile them, but problem is I don't know what compiler made them into bins and there isn't exactly a standard written - like is for DX.

So, it could be doable in theory but in practice... That's why I say there is no chance:)

So it looks like the only hope is that they'll keep their promise and release this in VR.
Sean Murray said:Developer Hello Games chief Sean Murray took to Twitter to ask users to check their video cards' drivers and make sure their video cards are compatible with OpenGL 4.5.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#28
Posted 08/13/2016 11:58 AM   
helifax said:Q: Guess what Windows and PlayStation 4 have in common ?
A: OpenGL.

Q: How does PlayStation 4 differ from Xbox One ?
A: Xbox One uses DirectX 11 while PlayStation 4 uses OpenGL (or variant based on OpenGL).

Q: What platforms is this game released on ?
A: PC and PlayStation 4.

Bottom line:
I bet (90%) it uses OpenGL, as it's the only graphics API that is common to both platforms, and since Xbox One is missing from the list, that 99% shows DirectX 11 is out (I might be mistaken).

(Regarding a possible fix from my side: if it's OpenGL, it might happen, it might not. I don't have the game, and I'm unsure whether I plan to buy it or not.
I don't want to get any hopes up if it's OpenGL. If it's DirectX, there are quite a few modders around who might look at the game;) )

Need to wait and see:)

Edit:
If it's OpenGL, I hope it doesn't have a "stupid" artificial frame-lock at 60Hz and that it will actually follow your refresh rate. If not, a 3D fix will still be possible, but even if it succeeds 100%, the experience will not be optimal (unless you get 120Hz and 60 FPS per eye).


Street Fighter 5 is cross-play between PC and PS4 and must be running on the same engine (Unreal 4). Is there something going on in that case?

It's a shame, but from what I've seen of the user reviews, I think I can skip this one.

i7-4790K CPU 4.8Ghz stable overclock.
16 GB RAM Corsair
EVGA 1080TI SLI
Samsung SSD 840Pro
ASUS Z97-WS
3D Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)
Obutto R3volution.
Windows 10 pro 64x (Windows 7 Dual boot)

#29
Posted 08/13/2016 12:19 PM   
SKAUT said:
helifax said: How should I say this the easy way and avoid any confusion...

Let's see:
There are 2 main types of shaders (like in DX): Assembler (ASM) and C source code (GLSL/HLSL).

The ASM shaders are of multiple types:
- ARB variant
- CG variant (by Nvidia)

The C Source Code are of multiple types:
- GLSL - standard OpenGL (or Core, as we call it)
- ARB extension - an extension over the Core design that uses a different API as well.

The wrapper only intercepts the C source code shaders (both the GLSL and the ARB extension ones), but it does not deal with ASM shaders.

For the C source code ones I know how it works. Since you basically provide the source code and build the shaders at runtime, this was easy (but not trivial at all) to intercept/dump/modify, recompile, relink, and re-create the shader program objects.

For the ASM-ARB ones I have no idea. Most likely (from what I've read around) the shaders are already in binary format, and even if I intercept the blob, I need to de-compile it, modify it, recompile it, etc. Plus the API for using them is different, and I can't find a clear answer on how it is supposed to work (it has been deprecated for more than 10 years and resources are scarce). Even if I did find them, I would still need to de-compile and re-compile them, but the problem is I don't know what compiler made them into binaries, and there isn't exactly a written standard, like there is for DX.

So, it could be doable in theory but in practice... That's why I say there is no chance:)

So it looks like the only hope is that they will keep promise and release this in VR.


I didn't say this;) That wasn't related to this game; we were talking about another game.
I am trying to see what can be done about it;)
I am having massive issues with the game on my Surround setup (multi-monitor and SLI), but it seems to work on my laptop lol.

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

#30
Posted 08/13/2016 12:56 PM   