[quote="helifax"][quote="Volnaiskra"]
....
4. if on a SLI system, force the game to run in single-GPU mode via the nvidia control panel.
....
[/quote]
I don't want to be off-topic or anything, but this line got me wondering... Can you explain why this is needed? I played Bioshock Infinite on a GTX 590 and now on 2x 780 Ti, and I didn't have to do that (or maybe I missed something that I'm not aware of..?)[/quote]When using Helix's fix, the water in the church at the beginning of Bioshock Infinite is very broken if you use SLI. Considering the general wow factor of that section, it'd be a shame to show someone the game in SLI if you're trying to win them over to 3D!
Volnaiskra - Very well said and put-together post, especially about the loyalty and spending on top-end hardware.
SteveK - You guys do know about Helix and the work he has done, yes???
I have never heard any comments from moderators acknowledging the work he has done and how well it works. Why can't you guys do similar things, or just hire the guy already?
Intel 5960x, Asus RVE, 16 Gb Ram
Gtx 980
Samsung 850 pro 1TB
Win 10 64
[quote="Arioch1"]My guess is there are some legal issues with modifying shaders in games that they can't do.[/quote]I could be wrong, but if I remember correctly, eqzitara debunked that theory in a similar conversation a while back. I think he said that 3Dvision already automatically alters shaders.
[quote="Volnaiskra"]Hi SteveK, I'd like to state my appreciation of the helpful and honest engagement with the community you've been providing. This new 3D Compatibility Mode sounds like it will be a great fallback to salvage those games which flat out didn't work in 3D. However, I do hope your team doesn't stop there, and that the Nvidia tech team sees 3D Compatibility Mode as a preliminary 'safety net' that lays the ground for the real work of implementing genuine 3D on a per-game basis.
3D compatibility mode (or 'fake 3D' as some uncharitably call it) isn't likely to win over many new converts to 3D in a post-Oculus Rift world. And as you know, it has only garnered a lukewarm reception at best from existing 3Dvision users.
I feel compelled to echo Conan's comments above: Nvidia's team of paid professionals have a long way to go before they match (let alone exceed) the work of a small ragtag group of modders working from the outside. We 3Dvision customers feel we deserve better.
Numerically, we may be a small section of the Nvidia customer base, but when it comes to spending, we are undoubtedly some of Nvidia's best customers. We care more about visual fidelity than most, and maintaining 120fps in 3Dvision isn't cheap; we routinely spend big on Nvidia hardware to do so. Look through some of the sigs on this forum, and you'll see that top-tier cards running in SLI is practically the norm. When I myself switched from 2D to 3D, I upgraded from a 680 to two Titans and a 650ti for PhysX - that's roughly a fivefold increase in spending directly related to 3Dvision.
As long as Nvidia continues to provide the best 3D experience, we are also some of your most loyal customers, bound as we are to Nvidia hardware. When legions of gamers recently switched to AMD's popular Rx series, enticed by low pricing, Mantle, and the promise of better next-gen console compatibility, we did not. But that will only remain so for as long as Nvidia continues to provide the best 3D solution. If 3Dvision does not remain compelling once the Oculus Rift arrives, then you'll likely lose many of us to AMD.
Finally, we are almost universally passionate about 3Dvision, and tend to evangelise it to our friends, frequently converting them to 3Dvision in the process (i.e. cementing new Nvidia customers).
In short, our loyalty to Nvidia is not merely emotional, but represents a significant monetary outlay on our part that goes far beyond just buying a 3Dvision monitor or kit. And we feel we deserve a certain level of commitment from Nvidia to uphold the technology that we are paying for. This 3D compatibility mode is a fine stop-gap, but it is not true 3D, and it is not true 3D Vision.
Conan suggested earlier that your tech guys compare Bioshock Infinite using 3D Compatibility Mode and using Helix's fix. I would second that. It'd probably be a worthwhile experiment. It's very easy to do, though the required info is scattered haphazardly over these forums, so I'll consolidate it here for your convenience. Please feel free to pass this on to your tech team.
1. [url=https://s3.amazonaws.com/-HeliX-/BI.zip]Download Helix's Bioshock Infinite fix[/url]
2. Unpack the files into the folder where BioShockInfinite.exe is located
3. Set Ambient Occlusion (SSAO) to "normal" in-game
4. If on an SLI system, force the game to run in single-GPU mode via the Nvidia Control Panel.
5. Run Bioshock Infinite.
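Incidentally, if you're setting this up on several machines, steps 1 and 2 are easy to script. Here's a rough sketch that just prints the commands for review rather than running them; the game directory passed in is an assumption, so point it at wherever your BioShockInfinite.exe actually lives:

```shell
# Sketch: print the download/unpack commands for Helix's fix so you can
# review them before running. The path argument is an assumption -- use
# the folder that actually contains BioShockInfinite.exe on your system.
helix_fix_commands() {
    game_dir="$1"
    echo "curl -L -o /tmp/BI.zip https://s3.amazonaws.com/-HeliX-/BI.zip"
    echo "unzip -o /tmp/BI.zip -d \"$game_dir\""
}

helix_fix_commands "$HOME/Games/BioShockInfinite/Binaries/Win32"
```

Paste the printed commands into a terminal once you've checked the path.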
You'll see what all the fuss is about. Please note that Bioshock Infinite is the very first DX11 game that [url="https://forums.geforce.com/member/1867809/"]Helix[/url] has managed to fix (and possibly the last - he hasn't been active for a while), and as such it is not a perfect fix. You may see small glitches such as cut-off shadows or certain reflections appearing not quite right.
So it is all the more impressive that this imperfect fix is much more convincing and enjoyable than the 3D compatibility mode. It is deeper and more life-like. Objects appear voluminous and solid, rather than like flat cardboard cutouts. The horizon stretches far out into the distance. The whole scene has a pleasing sense of cavernous space. Increasing the convergence enables objects to more effectively pop out of the screen.
It looks great because it uses the 3Dvision tech as it was always meant to be used. Please, do justice to your own tech. You'd be foolish not to: the world is madly salivating over 3-dimensional VR at the moment, yet commercial VR tech is still at least a year away. You have a head start - make the most of it!
.[/quote]
Thanks for writing what I was thinking in a much more profound and organized way.
Yes, I spend big bucks. I have 3 GTX 780s, 3 BenQ 24" monitors, and was planning on buying at least 2 next-gen cards (if/when they come out! HDMI 2.0!!!! NOW, sorry), BUT if this compatibility mode is all we're gonna get then I'm done with Nvidia.
Please listen to us. WE WANT REAL 3D. Thanks.
[quote="Volnaiskra"][quote="Arioch1"]My guess is there are some legal issues with modifying shaders in games that they can't do.[/quote]I could be wrong, but if I remember correctly, eqzitara debunked that theory in a similar conversation a while back. I think he said that 3Dvision already automatically alters shaders. [/quote]
Well I don't know then. Wish they would just hire Helix or something.
1080 GTX 8GB SLI | I7-4770K@4.5GHz | 16GB RAM | Win10x64
Asus ROG Swift PG278Q | 3D Vision 2
With the new drivers, AC4 doesn't engage the pseudo-3D properly. The image looks like it's constantly changing back and forth between two different separation/convergence levels. I even tried disabling SLI and changing in-game settings, but with no luck.
Still no support for 'Generic 3D DLP HDTV'. I'd like to know why the hell it was removed in the first place... 'Generic CRT' is still supported. Without 'Generic 3D DLP HDTV' support I can't pass the signal through my Onkyo receiver, and plugging directly into the Mitsubishi 3D DLP HDTV forces me to use depth hacks (2 HDMI-to-DVI cables, an extended desktop where the mouse gets lost on the 'imaginary' other monitor... blah, blah, blah, all of which I've 'reported' multiple times already)... right back to the 320.49 drivers. What a waste of time...
Hi SteveK
[i]I have C&Ped this from another thread.[/i]
I think it is fair to say this is what the community wants:
PLEASE
Take convergence settings out of 'Advanced'.
Convergence isn't advanced; it is necessary! 3D quality DEPENDS on convergence.
I understand newbies might break 3D by applying too much, but it is fixable, 100% of the time, by holding Ctrl + F5.
Why don't you add an additional slide to the setup wizard explaining convergence? E.g.:
Use the convergence keys to tweak popout.
WARNING: Too much convergence can break the image. Should this happen, simply hold the lower convergence keys until the screen appears normal again.
Also suggest new users experiment with convergence using the spinning nVidia logo in the built in demo.
I understand too much convergence breaks the new 3D tech. If this is a concern, you could just mention the issue in the game's OSD: "Too much convergence can cause anomalies."
PLEASE
With regard to the new 3D tech, don't hold back on depth.
Shogun 2's fix looked great. The only serious issue I had with it was that, for me, the max depth was way too low. On your official thread for the driver, I posted a link to a file containing comparisons between nVidia's new tech and a community fix. The community fix was the better fix because it allowed max depth! nVidia's Shogun 2 profile's max depth was too low and the 3D effect was frankly boring.
If depth should cause anomalies, update the OSD to reflect this.
PLEASE
[i]mike_ar69 said:[/i]
Don't set depth to 15% by default, that's just feeding the people who call 3D a "gimmick".
There is simply no reason for the default to be set this low. By all means warn people not to look at the screen whilst changing depth, because that can be jarring, but please mention the fact that many 3D gamers enjoy 100% depth. I honestly thought playing at 100% depth was risky when I first got the kit and was scared to increase depth! I still remember when I did take the plunge and experimented with 100% depth; I was frankly blown away and have never been satisfied with less since.
SteveK
I appreciate the concerns the good folk have on these boards, but to say that nVidia's recent activity isn't a HUGE leap in the right direction would be churlish beyond belief. So thanks!
Lord, grant me the serenity to accept the things I cannot change, the courage to change the things I can, and the wisdom to know the difference.
-------------------
Vitals: Windows 7 64bit, i5 2500 @ 4.4ghz, SLI GTX670, 8GB, Viewsonic VX2268WM
Thanks for listening to the community about an on/off toggle for the new 3D mode. Any chance you can throw in side-by-side 3D as a 3D Vision or 3DTV Play option? I know there is checkerboard, but side-by-side would be really great. We'd still have to pay to access it, and it makes more sense than having checkerboard as the only option...
Gigabyte Gaming 5 Z170X, i7-6700K @ 4.4ghz, Asus GTX 2080 ti Strix OC , 16gb DDR4 Corsair Vengence 2666, LG 60uh8500 and 49ub8500 passive 4K 3D EDID, Dell S2716DG.
Hello SteveK. I know this has been brought up before, but is it possible for Nvidia to remove the "Stereoscopic 3D laser sight is on/off" notification that pops up whenever the 3D crosshair is toggled? It really serves no purpose.
http://img152.imageshack.us/img152/54/wtfisthis.jpg
[quote="Volnaiskra"][quote="Arioch1"]My guess is there are some legal issues with modifying shaders in games that they can't do.[/quote]I could be wrong, but if I remember correctly, eqzitara debunked that theory in a similar conversation a while back. I think he said that 3Dvision already automatically alters shaders. [/quote]
Modifying shaders on the fly via drivers and sharing modified shader files on the internet are obviously different things. One does not go against copyright; the other one does. Helix mods are "tolerated" because they're free, but a commercial company like Nvidia cannot modify and distribute pieces of another company's software to its customers (barring, perhaps, an exceptional agreement).
I don't know how or why this needs to be debunked.
They're improving a product for free, and any distributed shader files wouldn't be useful for anyone other than their customers. No company is going to waste time and money trying to prevent them from improving a product. Not when it benefits developers, nVidia and customers all at once.
[MonitorSizeOverride][Global/Base Profile Tweaks][Depth=IPD]
Enthusiastically seconded.
Handy Driver Discussion
Helix Mod - community fixes
Bo3b's Shaderhacker School - How to fix 3D in games
3dsolutionsgaming.com - videos, reviews and 3D fixes
Dual boot Win 7 x64 & Win 10 (1809) | Geforce Drivers 417.35
My 3D Vision Gallery
Helix 3D Fixes
Win 7 x64
i7 4960X Extreme Edition
MSI Big Bang XPower II
2x EVGA Titan Z
Silverstone Evo 1200w