NVIDIA Inspector >>> new 2.1.3.10 :)
The only thing I've come across is in a PDF that says

"Configuration is saved to HKLM\Software\NVIDIA
Corporation\Global\Stereo3D\GameConfigs\<game exe name>
Add DWORD entry StereoTextureEnable: 1
Add DWORD entry StereoCutoff: 1
These are optimal values for games that do not render directly
into backbuffer. We will likely make them default in the future."

This was from a long time ago, but after their 3DFX acquisition, so it shows that these settings are not something carried over from 3DFX. Yet I can't find any documentation on them, unless it's in the developer SDK that requires registration and an NDA agreement.
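
For anyone who wants to try those values without hand-editing the registry, here's a minimal sketch of doing it from Python. This is my own example, not anything from NVIDIA's docs; "game.exe" is just a placeholder for the real executable name, and it needs to run elevated since the key lives under HKLM.

import winreg

game_exe = "game.exe"  # placeholder - substitute the actual game executable name
key_path = r"Software\NVIDIA Corporation\Global\Stereo3D\GameConfigs\%s" % game_exe

# Create (or open) the per-game key and write the two DWORDs from the quoted PDF.
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "StereoTextureEnable", 0, winreg.REG_DWORD, 1)
    winreg.SetValueEx(key, "StereoCutoff", 0, winreg.REG_DWORD, 1)

Whether current drivers still read these per-exe keys is another question - the PDF is old, so treat this purely as an experiment.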

#31
Posted 07/10/2016 10:27 PM   
I've no idea what this even means:

These are optimal values for games that do not render directly into backbuffer.

So if a game renders directly into the back buffer, a stereo cutoff value wouldn't be used?

I thought all games rendered into front buffers first, but I've really no idea.

edit: thinking about it, those two settings are pretty much in every profile.
ID_0x701eb457 and ID_0x709a1ddf

#32
Posted 07/10/2016 10:30 PM   
Most games render 3D Geometry simultaneously to multiple G-buffers which are later consumed by deferred rendering and post processing, and the back buffer is only updated very late in the frame. Hence, most games would need StereoTextureEnable=1 (and likely more). I believe they must have made that the default a long time ago (edit: looks like 0x23 is the default).

I believe that StereoCutoff = 1 is the default for profiles missing that setting. As far as I can tell, only the low four bits are used - I have no idea why other bits are set in the profiles though, so I might be missing something. Further to that, the profiles shipped with the driver don't make much sense at all here compared to what the driver appears to do with the values (over 1200 profiles appear to have invalid values), so take the next paragraph with a grain of salt - this probably needs to be experimentally confirmed. More damning is that this does not seem to match your observations for Tomb Raider:

Only 0, 1, 2, 4, and 8 seem to be valid values for the low nibble of StereoCutoff - any other value appears to be ignored and 1 will be used in its place. If StereoCutoff=2, StereoCutoffDepthNear is used (default is 1.0 if not specified in the profile). StereoCutoff=4 means that StereoCutoffDepthFar is used (default is 10000.0 if not specified in the profile). It does not appear possible to combine these two. I do not know what StereoCutoff=0, 1 or 8 means.
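
If that interpretation is right, the decode boils down to something like this little sketch - pure guesswork based on the observations above, with the defaults taken from the same paragraph:

def describe_stereo_cutoff(value, depth_near=1.0, depth_far=10000.0):
    # Only the low nibble appears to matter; anything outside 0/1/2/4/8
    # seems to be treated as 1.
    nibble = value & 0xF
    if nibble not in (0, 1, 2, 4, 8):
        nibble = 1
    if nibble == 2:
        return "cutoff uses StereoCutoffDepthNear = %g" % depth_near
    if nibble == 4:
        return "cutoff uses StereoCutoffDepthFar = %g" % depth_far
    return "StereoCutoff = %d (meaning unknown)" % nibble

print(describe_stereo_cutoff(0x2))  # -> uses StereoCutoffDepthNear
print(describe_stereo_cutoff(0xb))  # invalid low nibble, treated as 1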

It is worth noting that most games render their UI at depth=1.0. I would guess that StereoCutoff might be used in these games to denote that geometry at exactly that depth should not be stereoised, or to specify what depth or range of depths should be ignored for games that do not do that and have no better way to match UI elements.

It is also worth noting that this is not the only heuristic the driver will use for UI elements - e.g. if there is no depth buffer assigned when the UI is drawn, it will not stereoise it regardless of the depth value. Presumably this is controlled by another setting (maybe StereoTextureEnable = 0x80?).

2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit

Alienware M17x R4 w/ built in 3D, Intel i7 3740QM, GTX 680m 2GB, 16GB DDR3 1600MHz RAM, Win7 64bit, 1TB SSD, 1TB HDD, 750GB HDD

Pre-release 3D fixes, shadertool.py and other goodies: http://github.com/DarkStarSword/3d-fixes
Support me on Patreon: https://www.patreon.com/DarkStarSword or PayPal: https://www.paypal.me/DarkStarSword

#33
Posted 07/10/2016 11:17 PM   
Well, my observation was on The Walking Dead: Survival Instinct, using the 26 different profiles that contained the following setting:

Setting ID_0x709a1ddf = 0x4b1cd96b InternalSettingFlag=V0 // is found in 26 profiles

Out of the 26 profiles, some resulted in a 2D image and others had duplicate results. I picked the 5 most varied extremes to post pics of, to demonstrate the manipulation that some of these profiles could cause.

The Tomb Raider profile was only used in one of the pictures. The convergence was also exaggerated to highlight the effect.
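
For anyone wanting to reproduce a count like that, here's a rough sketch that scans a Geforce Profile Manager text export. "profiles.txt" and the UTF-16 encoding are assumptions on my part; the Profile/Setting/EndProfile layout is the same as in the profile posted below.

import re

setting = "ID_0x709a1ddf = 0x4b1cd96b"
current = None
matches = []

# Placeholder filename; adjust the encoding if your export differs.
with open("profiles.txt", encoding="utf-16") as f:
    for line in f:
        m = re.match(r'\s*Profile\s+"(.*)"', line)
        if m:
            current = m.group(1)           # remember which profile we're inside
        elif setting in line and current is not None:
            matches.append(current)        # this profile carries the setting/value

print(len(matches), "profiles contain", setting)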

#34
Posted 07/10/2016 11:48 PM   
This is an obvious observation, but I'm not sure of the implications.

Give it a try and see what you think; it will make more sense in your head than mine.

Extract your profiles and add the following settings in place of the existing settings for The Wolf Among Us:

Profile "The Wolf Among Us"
ShowOn GeForce
ProfileType Application
Executable "thewolfamongus.exe"
Setting ID_0x701eb457 = 0x2241ab21 InternalSettingFlag=V0 //StereoProfile (AKA The Mystery Stereo Setting)
Setting ID_0x709a1ddf = 0x4b1cd968 InternalSettingFlag=V0 //StereoCutOff
Setting ID_0x7050e011 = 0x7418eed9 InternalSettingFlag=V0 //StereoCutoffDepthNear
Setting ID_0x70edb381 = 0x24208b5c InternalSettingFlag=V0 //StereoTextureEnable makes prompt 2D
Setting ID_0x70e34a78 = 0x1945b570 InternalSettingFlag=V0 //StereoUseMatrix helps with some shadows?
Setting ID_0x70e5a749 = 0x0056b4c9 InternalSettingFlag=V0 //StereoEpsilon helps with some shadows?
EndProfile

Start the game; all you need to do is hit Start to get to the title screen, and you'll see the Sheriff/Wolf walking in and out of your screen.

Exit the game.

Now add this line to the profile that you imported previously:

Setting ID_0x708db8c5 = 0x5c1beabe InternalSettingFlag=V0 //Convergence Setting Value (edit: replaces the struck-out comment "seems to help, better to leave in?")

Now he stays inside the screen.

While this doesn't look as awesome as him walking around outside of the screen, it does fix a lot of issues within the game itself.

With the hybrid profile that I came up with, there are only a few issues: some shadows, and some camera zooms where stuff is up in your face, but it is playable in 3D - at least for the first 10 minutes that I tried. Out of the Telltale games I've played, this one has the most issues, though.

#35
Posted 07/11/2016 12:17 AM   
I'll have to take a closer look at that tomorrow. For now I've added a new section to the wiki with settings I spotted in the driver that can only be set in the registry. Some, like MonitorSizeOverride, we already discovered or they obviously correspond to options in the control panel, but I'm curious to know what settings such as "EnableAPILog", "SLIOverride", "ContrastOverride", "StereoFlywheelCycle" and the curiously named "0x1800babe" and "__S" do :-p

#36
Posted 07/11/2016 02:06 AM   
Perhaps you could also look at the issue that helifax mentioned in his sticky and report it to the author, from here: https://forums.geforce.com/default/topic/791450/3d-vision/guide-how-to-enable-and-tweak-3d-compatibility-mode-in-any-dx11-game/post/4375630/#4375630

helifax said:
DarkStarSword said: Interesting - I took a brief look at this as well, but I focused on the wrong setting (I was looking at the one the driver called "Compat", but I guess I was hoping too much for the naming to be self-explanatory).

0x709adada is named "2DDHUDSettings" in the driver. There's a few other settings with 2DD in the name that might be related - 2DDConvergence (0x709ADADB), Disable2DD (0x709ADADD), 2DD_Notes (0x709ADADC) and RHW2DDetectionMin (0x7029432B).

I put together a modified custom settings name file for use with nVidia inspector with all the stereo settings from the driver:

https://raw.githubusercontent.com/DarkStarSword/3d-fixes/master/CustomSettingNames_en-EN.xml



Really good and really awesome and interesting find;))

I want to add one important thing:

If you export a profile using these BITS with nvidia Inspector and import it back, 3D CM mode will NOT KICK IN.
This is because nvidia Inspector doesn't IMPORT correctly.

Instead of importing
Setting ID_0x709adada = 0x37f58240 InternalSettingFlag=V0


it imports

Setting ID_0x709adada = 0x37f58240 UserDefined = True


which will make the driver NOT enable CM mode. Like I said above, the InternalSettingFlag MUST exist there for CM to work ;))

However, having a profile with CM mode imported like I described above should allow changing the flags afterwards, as long as it is not imported as a .nip but rather in the NVIDIA text format.


Sorry for the orange text, but I deemed it necessary to color it.
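
Along those lines, here's a rough sanity-check sketch (the filename is a placeholder and the line format is assumed to match helifax's example above): it scans an exported text profile and flags any Setting line that has lost its InternalSettingFlag=V0 marker before you re-import it.

# "exported_profile.txt" is a placeholder; adjust the encoding to match your export.
with open("exported_profile.txt", encoding="utf-16") as f:
    for number, line in enumerate(f, 1):
        text = line.strip()
        # Per helifax, a Setting line written as "UserDefined = True" instead of
        # "InternalSettingFlag=V0" will stop the driver from enabling CM mode.
        if text.startswith("Setting ID_0x") and "InternalSettingFlag=V0" not in text:
            print("line %d may not re-import correctly: %s" % (number, text))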

#37
Posted 07/12/2016 07:21 AM   
I kind of think that ID_0x708db8c5 applies a shift to the entire image and shoves it back into the screen. I think the current value for zero depth at screen is adjusted so that what was once in eye space is now pushed past screen depth.

Maybe z times 1.0 now becomes z times 1.50 - wouldn't that move the image back in? Or perhaps it's multiplied by 0.50?

Setting ID_0x708db8c5 = 0x5c1beabe InternalSettingFlag=V0

0x5c1beabe would be the value of the multiplication?

I've no idea, it's pretty much a wild guess, since I understand so little about this stuff.

But I implore you to make the observation as outlined previously in the post above for The Wolf Among Us. I'm sure that you would understand exactly what it is doing.

#38
Posted 07/12/2016 08:01 AM   
0x708db8c5 is StereoConvergence, so I would have thought that would be self-explanatory - you can see that value change if you adjust convergence and then save a profile (though I cannot figure out what the encoding is in the profile - it's not a floating point value, so probably a fixed point value. Edit: Odd - looks like it's a floating point value in custom profiles, but not built-in ones?).

#39
Posted 07/12/2016 10:41 AM   
Hmm, I wasn't aware that that's what it is.

I just looked in the log and I see:

AppName='TheWolfAmongUs'
StereoSeparation=0.10452262 (3dd60ff4)
StereoConvergenceBias=0.00000000 (0)
StereoConvergence=1.00010562 (3f800376)
MinRHW=9999.00000000 (461c3c00) (farthest)
MaxRHW=-9999.00000000 (c61c3c00) (closest)
Texture strategy(0x13):DBBS ESMT ET
Suggested StereoConvergenceMultiplier=5.00000000 (40a00000)
Suggested Infinity RHW=0.00050273 (3a03c979)
StereoCutoff=1 (Far)

edit: yeah, I see this value change as I adjust convergence:
StereoConvergence=0.72852290 (3f3a807a), lol I thought it was a static setting :P

I should have looked it up in the wiki.
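
For what it's worth, the hex values in that log are just the raw IEEE-754 bit patterns of the floats printed next to them - quick check below. (This says nothing about how the built-in profiles encode StereoConvergence, which still looks different per the post above.)

import struct

def bits_to_float(bits):
    # Reinterpret a 32-bit pattern as a single-precision float.
    return struct.unpack("<f", struct.pack("<I", bits))[0]

print(bits_to_float(0x3f800376))  # ~1.00010562  (StereoConvergence in the log)
print(bits_to_float(0x3dd60ff4))  # ~0.10452262  (StereoSeparation)
print(bits_to_float(0x3f3a807a))  # ~0.72852290  (after adjusting convergence)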

#40
Posted 07/12/2016 11:13 AM   
I'm moving this conversation over to the shaderhacker school thread since it's getting a little off topic for the new version announcement.

https://forums.geforce.com/default/topic/766890/3d-vision/bo3bs-school-for-shaderhackers/post/4927216/#4927216

#41
Posted 07/12/2016 05:46 PM   
There's a new test build of NVIDIA Inspector which should decrypt the internal settings (no more 'Chinese characters' or settings that make no sense, making this now superior to Geforce Profile Manager): https://ci.appveyor.com/project/Orbmu2k/nvidiaprofileinspector/build/artifacts


I'm still at work - can someone try it out in the meantime and see how it goes?

#42
Posted 07/15/2016 04:21 AM   
I tried it.
No more Chinese characters indeed.
Looking in detail, I see an option to swap eyes. Was it always present?

Intel i7 8086K
Gigabyte GTX 1080Ti Aorus Extreme
DDR4 2x8GB 3200MHz CL14
TV LG OLED65E6V
Windows 10 64-bit

#43
Posted 07/15/2016 05:31 AM   
The swap-eyes option is a disabled legacy feature. AFAIK, it was only present in their viewer for consumer GPUs.

It can be seen on page 35 of their legacy PDF:

http://http.download.nvidia.com/Windows/71.84/71.84_ForceWare_3D_Stereo_Users_Guide.pdf.pdf

#44
Posted 07/16/2016 02:43 PM   
I've just sent the author a revised pull request with only some of the more well known / important stereo settings (the tool already extracts the names of the other stereo settings):

https://ci.appveyor.com/project/Orbmu2k/nvidiaprofileinspector/build/2.1.2.8/artifacts

#45
Posted 07/17/2016 02:57 AM   