My definition of 3D is, for example, the results we are getting with Skyrim.
That kind of depth, convergence and immersion used to be available on pretty much EVERY game released.
This 'pseudo-3D' currently being peddled (usually as '3D Ready') is a very worrying development and seems indicative of a certain laziness on the part of the driver programmers.
Well guys, the Convergence issue really has nothing to do with the laziness of driver programmers, as it actually takes more work for them to lock Convergence for a specific game. There's definitely a disconnect between Nvidia and their partners about the impact and importance of Convergence control on the 3D experience, but never once have I seen Andrew say the decision was Nvidia's to lock convergence; it is always "the developer decided to lock Convergence controls." If this is the case, there's really nothing Nvidia can do, as it's a simple case of quid pro quo or conditional implementation. The developer says "we help you make the game 3D Vision Ready but only if you lock Convergence." Nvidia agrees, or the developer pulls 3D Vision support completely. Which is the lesser evil?
Now why would a developer insist on locking Convergence? Why would Nvidia disable Convergence controls by default in the NVCP? If you play with the controls enough you can clearly see why. With a modest Convergence setting, you can adjust Depth from 0-100 on the wheel and still maintain a comfortable (albeit sometimes flat) 3D experience. If you go too far with Convergence, the 3D image very quickly falls apart and can result in eye strain or even cross-eyed viewing. My guess is this is what Nvidia and their dev partners are trying to avoid; the problem is that it also takes away a lot of our control over our 3D experience.
While it's certainly frustrating to see a 3D Vision Ready rated game with locked Convergence, I'm not going to be too hard on them, because some of the very best examples of 3D Vision are still 3D Vision Ready games, like both Batman titles, Just Cause 2, etc. Also, in many games 2D assets or rendering issues result in 3D artifacts that are a MUCH bigger problem than the lack of Convergence controls.
In the meantime, all we can do is ask Nvidia to do a better job of explaining to the devs why Convergence control is important and lobby DICE to allow for Convergence controls in BF3. Let's not kid ourselves though: without Nvidia's involvement with 3D Vision titles there'd be 0% chance of mouse cursors, crosshairs, or HUD elements being rendered properly in 3D, so overall their contributions are certainly appreciated (by me at least).
chiz, I would agree in principle with your opinion... if NVIDIA 3D Vision were a free extra add-on at no additional cost that was not being used in marketing, then yes, I'd be thankful to NVIDIA for trying to provide us with support, and I also wouldn't be too hard on developers who decided to lock convergence, for instance.
As it stands, NVIDIA 3D Vision is being marketed heavily by NVIDIA and is a very expensive consumer technology - I decided to go with a GTX 580 for $600 instead of a GTX 560 for maybe $200 (which would actually still run Battlefield 3 better in 2D than my GTX 580 does in 3D), as well as an additional $40 for the NVIDIA 3DTV Play utility, and let's not forget the $800 for the Sony HMZ-T1 headset to go with it (or alternatively, a couple of hundred dollars for a decent 3D Vision compatible monitor).
All in all, we're easily talking about up to $1000 in up-front additional investment, based on marketing claims by NVIDIA and DICE that Battlefield 3 was "3D Vision Ready" - it was given a full five stars by NVIDIA and featured prominently on the 3D Vision homepage. I've 'appreciated' NVIDIA's contribution in the only way that they care about me appreciating, which is by purchasing their products because of their marketing. If there were "0% chance of mouse cursors, crosshairs or HUD elements being rendered properly in 3D", as you put it, then there would also be a 0% chance of me buying NVIDIA products for a total of $640 because of NVIDIA 3D Vision, so I certainly don't owe them any extra thanks for being involved in the development of 3D Vision games - it's their big selling point that carries a huge premium, so they should expect to be judged on their marketing claims.
When you make those kinds of claims and get me to part with more than $1000 because of them, don't count on me shrugging it off if the game is nowhere near "3D Vision Ready" as claimed - as people have noted before, the 3D in this game is (due to bad performance issues and completely locked convergence) nowhere near comparable to the experience delivered by Skyrim, for instance (ironically a game that isn't even "3D Vision Ready"). It's a pretty blatant marketing lie, and if you try to boost sales this way, be prepared for a consumer backlash - that's just the way it is.
Secondly, the arguments for convergence being locked sound idiotic to me - so the idea is to restrict the controls to make sure that in no case can the user accidentally trigger settings that might become a problem for them? First off, given that 3D is still very much a niche market and requires a substantial investment, I think it's safe to say that 3D Vision users are on average more technologically versed than most PC gamers (or at least not substantially less versed, if you don't trust that claim), so would we really not be able to just turn convergence back down if we had indeed accidentally turned it up too high?
Think of it this way - if that's what we're going for, let's also lock down the brightness setting of every monitor so people don't accidentally turn the brightness so low that they can't properly see the picture anymore, and let's make sure to lock down all volume controls on all devices so we don't accidentally turn down the volume so low that we won't be able to hear anything anymore.
If you want to be extremely careful, then how about an "advanced settings" menu where you have to unlock convergence manually and that comes with a warning that unlocking convergence and setting it incorrectly might be detrimental to the overall 3D viewing experience?
As it is, the concept is patronizing, plain stupid, and takes away so much from the experience that I have no idea who thought it up in the first place.
NVIDIA certainly holds some leverage over developers - how about they inform developers that locked convergence is highly discouraged, withhold "3D Vision Ready" seals and high ratings unless developers leave convergence unlocked, or simply override the convergence lock in the driver, which I am fairly certain could be done rather easily if NVIDIA wanted to.
It's the same kind of problem that we have with NVIDIA not letting us adjust the screen size (and associated maximum depth) in the control panel, forcing us to use registry hacks for this instead - in trying to make things more comfortable, all they end up doing is forcing us to use hugely inconvenient workarounds instead, and in this case there doesn't even seem to be a workaround for unlocking the convergence.
Yes, there is a cost of admission and barrier to entry with the additional cost of hardware (and software in some cases), but this is the same with anything else. You can watch a movie on your 15" laptop or you can shell out for a 100" HT set-up. As you increase the hardware and costs involved, the level of potential headache and frustration inevitably increases. The same can be said of virtually anything in the realm of "enthusiast-level" products. As you increase the barrier to entry, the installed user base naturally decreases, making our concerns a lower priority than those of the masses.
I think the problem here is you seem stuck on the "3D Vision Ready" rating even with locked convergence, and you keep referring to Battlefield 3 as your key reference. Unfortunately, you should have realized that historically Nvidia has not considered control over Convergence to be a deciding metric for their 3D Vision Ready rating system, as numerous 3D Vision Ready titles have launched in the past with locked Convergence - Mafia 2, Resident Evil 5, and L.A. Noire, to name a few. You can argue the rating system itself is fundamentally flawed, but that won't change Nvidia's criteria for what they consider "3D Vision Ready": you load the game up, you adjust depth from 0-100 to your liking, and 3D just works without making you googly-eyed.
As for the reasoning being idiotic, that's certainly your opinion, but Nvidia actually has numerous controls in place to try and make the end-user experience as seamless as possible without requiring any significant user intervention. Your screen-size example is just one of them. Exposing some controls like the advanced 3D keys or SLI/AA flags does seem contradictory at times and leaves us wanting more, but overall it's pretty clear their corporate guidance steers them away from giving us access to additional control/functionality. In the end, one control (depth) is relatively benign, while the other (convergence) can completely break the 3D experience, leave a very negative impression of 3D Vision, and generate the potential for more customer service calls. Depth, at worst, will just result in a complaint that the 3D looks too flat. See the difference there? All it would take is a set-up in a board room demo'ing the game in 3D, have one guy play with the Convergence controls and say "whoa, what was that? My 3D just broke! Yeah, turn off whatever control that was."
But yeah, I think you overestimate the amount of influence Nvidia has with developers, since we can't even get most devs to fix minor 3D issues or expose controls to turn off problematic 2D effects in games. Nvidia is the one walking hat in hand looking for support of a niche feature that a tiny percentage of users will benefit from, and for the most part, they do an excellent job of it in supported 3D Vision titles. I've already agreed they could do a better job of expressing the importance of Convergence controls to their dev partners, but they have no card to play in threatening to pull 3D Vision support, because if push came to shove, the devs would just tell them to go whistle.
Guys,
I do not understand the argument here. Perhaps someone can draw it out for me clearly.
It seems like I have found something of the following:
If Convergence is locked and performance is low, then the game is not qualified to be '3D Vision ready'.
Supporting premises:
Major Premise 1: Convergence is locked by the developer in BF3.
Major Premise 2: Battlefield 3 suffers a performance drop of 50% or more in 3D.
By modus ponens, Battlefield 3 should not be rated 3D Vision Ready.
This argument begs the question. Neither of these two premises entails the conclusion; i.e., neither identifies a necessary condition for what counts as '3D Vision.'
As a reductio ad absurdum, one could play Resident Evil 5 or Mafia 2 with an 8800 GT and come to such a conclusion, but such a conclusion is false. Convergence is simply a matter of preference and not a sine qua non of 3D Vision, and secondly, performance is always a one-way arrow of time and only denotes a relative position. Performance is indifferent to the quality of 3D, which is a function of the game engine and the driver.
[quote name='Adrian stealth' date='01 December 2011 - 02:23 PM' timestamp='1322767412' post='1335960']
Hi all,
Re. convergence - why is it so important? I don't really use it (always use 3D though)
[/quote]
It's really something you need to tinker with yourself to fully grasp, because, similar to our discussion about settings and potential problems, every single person's taste in 3D and what they find comfortable or ideal is going to be different. Also, your particular 3D set-up, how far away you're sitting from the TV, etc., are all going to impact your 3D experience at the same settings as another person. Not a single person on these forums will have the exact same settings for both Convergence and Depth, which is an equally compelling argument from both directions for controlling Convergence.
I would say first read the MTBS article as a primer, as it's probably the best concise write-up I've seen with illustrations; what it doesn't really describe as well is the impact that Convergence can have on the actual 3D effect:
http://www.mtbs3d.com/index.php?option=com_content&view=article&id=6197&Itemid=98&limitstart=2
Then, try it out in a few games that have convergence unlocked. Make sure to enable the controls in the Nvidia control panel under advanced settings first. Generally, Ctrl-F6 moves the zero-parallax point (the neutral point / screen depth / "2D" plane) towards you, and Ctrl-F5 moves it further into the scene.
I went ahead and took some screenshots for you to help illustrate the problem. Again, what you see and what I see may be very different in results or comfort level, but you should still be able to see the differences between each screenshot. All screenshots were taken with ~50% Depth and variable Convergence settings:
[list=1]
[*][b]Screenshot 1[/b]: This shows Depth but low Convergence, meaning the neutral point is set at the same depth as distant objects, resulting in a very flat 3D environment, slightly sunk into the screen. If you were to peek over your glasses, you would see there is good separation between objects, but the separation is too consistent between far and near objects to give a good sense of depth. Instead it seems like a 2D image in stereo. This is what Deus Ex: Human Revolution is like, and there's really no way to adjust the image to make it better.
[*][b]Screenshot 2[/b]: This image shows Depth with higher Convergence. If you peek over your glasses you can see I tried setting my neutral point on Batman himself. As a result, Batman appears in "2D" at screen depth, and any objects in front of him have a pop-out effect. You should immediately notice, however, that all objects appear far more "3D" than in the first screenshot.
[*][b]Screenshot 3:[/b] This image shows Depth with high Convergence. Adjusting convergence any further starts breaking focus on any objects in front of the neutral point and makes for very uncomfortable or cross-eyed viewing if you try to focus on the near-field objects. However, you can see again how every object has a more robust, almost synthetic/holographic quality to it. The neutral point is somewhere in between Batman and the bushes behind him.
[/list]
I think these should give a pretty good indication of what Convergence is capable of in terms of controlling your viewing experience, but again, it's really something you need to tinker with yourself. Even after tinkering with the settings for a bit, you may not fully grasp all of the concepts (I still don't), but you will at least get a better understanding of what 3D settings you prefer. After playing with it you may soon also find yourself wanting control over Convergence even more, so maybe it's better if you don't mess around with it too much. ;)
And to be fair, Battlefield 3 is not as bad as the 1st screenshot or Deus Ex (the worst possible example); its convergence is locked somewhere in between the 1st and 2nd screenshot settings.
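If you prefer numbers to screenshots, here's a rough sketch of how I understand the two controls to interact. The offset formula below is the commonly quoted approximation for how the driver shifts each eye's image, so treat it (and the made-up values) as an illustrative assumption on my part, not Nvidia's exact implementation:

[code]
# Rough sketch (assumed model): for an object at depth W, each eye's image is
# shifted by  offset = separation * (1 - convergence / W).
#   offset > 0  -> object appears behind the screen
#   offset = 0  -> object sits exactly at screen depth (the neutral point)
#   offset < 0  -> object pops out in front of the screen
def parallax(depth_w, separation, convergence):
    return separation * (1.0 - convergence / depth_w)

separation = 0.05                      # roughly what the "Depth" wheel controls
for convergence in (1.0, 5.0, 15.0):   # raising convergence...
    near = parallax(5.0, separation, convergence)    # object close to the camera
    far = parallax(100.0, separation, convergence)   # object far away
    print(f"conv={convergence:5.1f}  near: {near:+.3f}  far: {far:+.3f}")
# ...pulls the neutral point towards the camera: near objects start to pop out
# (negative offset) while the spread between near and far parallax grows,
# which is the stronger "3D" look in screenshots 2 and 3.
[/code]

The exact figures don't matter; the point is that Depth scales the overall effect while Convergence decides where the screen plane sits in the scene, which is why locking one of the two takes away half the control.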
[quote name='photios' date='01 December 2011 - 09:40 PM' timestamp='1322775657' post='1336025']
Guys,
I do not understand the argument here. Perhaps someone can draw it out for me clearly.
It seems like I have found something of the following:
If Convergence is locked and performance is low, then the game is not qualified to be '3D Vision ready'.
Supporting premises:
Major Premise 1: All convergence is locked by developer in BF3
Major Premise 2: Battlefield 3 suffers from 50% or more performance issues with 3D.
By modus ponens, Battlefield 3 should not be rated 3D Vision Ready.
This argument begs the question. Neither of these two premises entails the conclusion; i.e., neither identifies a necessary condition for what counts as '3D Vision.'
As a reductio ad absurdum, one could play Resident Evil 5 or Mafia 2 with an 8800 GT and come to such a conclusion, but such a conclusion is false. Convergence is simply a matter of preference and not a sine qua non of 3D Vision, and secondly, performance is always a one-way arrow of time and only denotes a relative position. Performance is indifferent to the quality of 3D, which is a function of the image and the driver.
[/quote]
You're technically correct, well observed - in order to spell things out more clearly, my argument is more empirical than strictly logical:
Empirical observation #1: Convergence seems to be an essential ingredient for the quality of the 3D experience. This is backed up both by my own observation (by comparing Skyrim with unlocked convergence to Battlefield 3 with locked convergence, for instance) and by a large number of people in this thread complaining about the shallow depth provided by Battlefield 3.
Empirical observation #2: While technically you could make a point that performance is indifferent to the quality of (static) 3D, that argument is a mere technicality as people do not seem interested in only the static quality of 3D but rather the overall 3D experience, a term which encompasses both the 3D quality of a static image as well as the fluidity of the moving picture, much like the term "video quality" refers to not only a static image but also to concepts such as the absence of motion artifacts and stuttering.
Logical Argument #1: For every individual frame of the game, CPU and GPU cycles are dedicated to the game logic (such as physics and AI), to 3D operations in world space (such as transforming vertices according to model-view matrices), and to 3D operations in screen space (such as converting world vertex coordinates to screen-space coordinates and running fragment shaders on every pixel on screen). If a game is to be rendered in 3D, then for every frame the 3D operations in screen space must be performed twice (once for each eye), while the game logic and 3D operations in world space must be performed only once. It follows that the time needed to render the two frames for both eyes cannot be longer than twice the time needed for a single frame.
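To put rough numbers on this argument (all of them invented purely for illustration), consider a toy frame-time budget where only the per-eye screen-space work doubles:

[code]
# Toy per-frame budget; the millisecond figures are invented for illustration.
logic_ms        = 4.0   # game logic + world-space work: done once per frame
screen_space_ms = 6.0   # rasterization + fragment shading: done once per eye

t_2d = logic_ms + screen_space_ms        # 10 ms -> 100 fps
t_3d = logic_ms + 2 * screen_space_ms    # 16 ms -> 62.5 fps

print(f"2D: {1000 / t_2d:.1f} fps   3D: {1000 / t_3d:.1f} fps   ratio: {t_3d / t_2d:.2f}x")
# Even in the worst case, where screen-space work dominates the frame entirely,
# t_3d only approaches 2 * t_2d, i.e. the frame rate should never drop below
# roughly 50% of the 2D figure.
[/code]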
Empirical observation #3: Logical Argument #1 is supported by the vast majority of all games, in which 3D performance is never worse than 50% of 2D performance for the same resolution.
Argument #1: If NVIDIA is to provide a viable and informative rating system, then the 3D Vision Rating of any game should correlate highly with people's perceived 3D experience in that game. It's true that there is no commonly accepted definition of what the 3D Vision rating of any game truly means in measurable terms, but the way in which I assume most people would interpret the rating system is the following: The higher a game's rating, the higher any given group of gamers would rate the 3D experience delivered by the game. It's certainly the impression that NVIDIA are trying to evoke with the ratings, and as this is the only thing gamers would be interested in, I would assume that this is what the ratings are intended to convey.
I've actually not found a single clear definition of the ratings, but I am aware that the possible levels are "Not Recommended", "Poor", "Fair", "Good", "Excellent" and "3D Vision Ready".
Empirical Observation #4: People are not happy with the 3D quality provided by Battlefield 3 (as evidenced by the reactions in this thread, for instance).
Now, Argument #1 (that the 3D Vision Rating should correlate highly with people's perceived quality of the 3D experience) together with Empirical Observation #4 suggest that assigning a 3D Vision Rating of "3D Vision Ready" (which is more than even a humble "Excellent") is a questionable decision.
Empirical Observation #1 (that convergence is inherently important to 3D quality) and Empirical Observation #2 (that performance is important for the quality of the overall 3D experience) back up this conclusion as two major points that are seemingly very important to perceived 3D quality have been neglected here.
You make a good point that one could infer the same conclusion from trying to play 3D Vision Rated games with vastly underpowered hardware, and indeed it would not be fair to chastise NVIDIA or DICE if the performance problems were unavoidable and hence would have to be tolerated.
However, Logical Argument #1 (that the fps hit should not be greater than 50%) backed up by Empirical Observation #3 (that this holds true for virtually all games) strongly suggests that the 3D implementation in Battlefield 3 is sloppy at best. Whether this is due to DICE or NVIDIA's drivers is irrelevant at this point - the overall experience falls well short of what could reasonably be expected.
Due to all these points, I'm merely pointing out that it is most questionable that Battlefield 3 should receive a rating of "3D Vision Ready". This raises the question of what type of implementation would be considered merely "Excellent" or even "Good", as well as the question of what the game's rating would have been had these problems been fixed.
TL;DR: If people care about having convergence unlocked, and the frame-rate problems suggest sloppy programming or a bug rather than a technical limitation, then under no conceivable circumstances should a relevant rating system assign the highest possible rating to this game.
You should be careful about calling anything "sloppy" unless you know the details. To claim something has been done sloppily just by looking at the end result is, well, sloppy.
The reason why many games are using effects which are not rendered at the correct depth in S3D mode is that they're using shortcuts to improve performance in normal "2D" rendering. And if you have an in-game S3D mode which happens to be a bit slower than 50% of the 2D performance, it's most likely caused by the fact that you cannot use those shortcuts anymore, since everything has to be rendered at the correct depth.
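To put made-up numbers on what I mean: if a post effect that uses a cheap screen-space shortcut in 2D has to be replaced by a depth-correct version rendered once per eye, the ratio can legitimately exceed 2x (a sketch with invented figures, not a claim about BF3's actual renderer):

[code]
# Invented figures, purely to illustrate the point about dropped 2D-only shortcuts.
logic_ms          = 4.0   # done once per frame in both modes
base_render_ms    = 5.0   # geometry + shading, per eye
cheap_effect_ms   = 1.0   # 2D-only shortcut (e.g. a flat screen-space effect)
correct_effect_ms = 4.0   # depth-correct replacement needed for S3D, per eye

t_2d = logic_ms + base_render_ms + cheap_effect_ms          # 10 ms
t_3d = logic_ms + 2 * (base_render_ms + correct_effect_ms)  # 22 ms

print(f"S3D frame time is {t_3d / t_2d:.1f}x the 2D frame time")  # 2.2x, below 50% of 2D fps
[/code]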
That's not "sloppy", it's just the result of getting the desired effect with the correct rendering. I don't like people calling programmers sloppy without even a hint of knowledge about what they're talking about (being a programmer myself).
And regarding Convergence:
It's not that easy for an inexperienced person to find the correct convergence setting. That's why nVidia tries to put a reasonable default convergence setting in the profiles for the games and why they disable the convergence controls by default.
Only someone who actually understands how convergence works should try to change it; otherwise he will get a very unsatisfactory result and will most likely - again - start to complain about the sloppiness of the developers (failing to understand that he himself screwed up). You see that here in the forum all the time.
The developer of the 3D Ready game put some time into finding the correct convergence for how he wanted the game to be presented, just like he found the correct FoV and all the other rendering parameters, which are usually not open to be changed. I agree that it would be nice if the developer allowed the user to change the preset (many games allow that, using .ini files or the command console), but that's hardly a prerequisite for calling a game 3D Ready.
I don't think adjusting the convergence is going to fix the in-game sight either; it's a value the developers need to adjust, as it is not a static but a dynamic object.
I REALLY hope they do something about it otherwise it's unplayable.
[quote name='Grestorn' date='02 December 2011 - 07:38 AM' timestamp='1322811539' post='1336278']
You should be careful about calling anything "sloppy" unless you know the details. To claim something has been done sloppily just by looking at the end result is, well, sloppy.
The reason why many games are using effects which are not rendered at the correct depth in S3D mode is that they're using shortcuts to improve performance in normal "2D" rendering. And if you have an in-game S3D mode which happens to be a bit slower than 50% of the 2D performance, it's most likely caused by the fact that you cannot use those shortcuts anymore, since everything has to be rendered at the correct depth.
That's not "sloppy", it's just the result of getting the desired effect with the correct rendering. I don't like people calling programmers sloppy without even a hint of knowledge about what they're talking about (being a programmer myself).
And regarding Convergence:
It's not that easy for an inexperienced person to find the correct convergence setting. That's why nVidia tries to put a reasonable default convergence setting in the profiles for the games and why they disable the convergence controls by default.
Only someone who actually understands how convergence works should try to change it; otherwise he will get a very unsatisfactory result and will most likely - again - start to complain about the sloppiness of the developers (failing to understand that he himself screwed up). You see that here in the forum all the time.
The developer of the 3D Ready game put some time into finding the correct convergence for how he wanted the game to be presented, just like he found the correct FoV and all the other rendering parameters, which are usually not open to be changed. I agree that it would be nice if the developer allowed the user to change the preset (many games allow that, using .ini files or the command console), but that's hardly a prerequisite for calling a game 3D Ready.
[/quote]
Alright, so perhaps "sloppy" is a strong word to be used here - in any case, the fact that 3D performance is practically unaffected by changing the detail settings (even going down to medium or low settings) very strongly suggests to me that the implementation is suboptimal. If you want to be technical, the performance loss on "low" settings when going from 2D to 3D is about 85%, going from 240fps to 35fps - being a programmer myself, I'd be very interested in your explanation of how that would happen with a proper and clean implementation?
Let me reiterate the point that the "correct" convergence by the developer does NOT seem to be anywhere near what people would like to use, and there is no point in denying people the possibility of adjusting it to their own preference. If the option to do so were put somewhere in an "advanced settings" menu and users were given a warning before changing it, then I very much doubt that people would be complaining about it.
I wouldn't feel this strongly about the issue if I didn't have the immediate comparison of Skyrim (which provides a 3D experience on a completely different level of quality due to me being able to choose the CORRECT convergence for myself, not the "correct" convergence that the developer chose for some reason) and Battlefield 3, which provides a much less satisfying and more limited 3D experience due to the aforementioned points and hence should not be rated "3D Vision Ready". Since the large majority of people seem to think (and I completely agree with them) that convergence is just as important as depth for the 3D experience, I do believe that being able to adjust convergence to one's own preference should be a prerequisite for the very highest level of certification as "3D Vision Ready" - I'd have no problem with NVIDIA rating the game "Fair" or even "Good", but "3D Vision Ready" is simply a marketing ploy and takes away strongly from the integrity of NVIDIA's rating system.
In any case, I've laid out my argument in great detail above - Grestorn, would you please be so kind as to tell me what part of that argument precisely you disagree with and why? Your argument hinges critically on the assumption that there is such a thing as a single "correct" convergence parameter for everyone which the developer should choose. I strongly disagree with this - convergence is inherently much more subjective than other parameters such as FOV or screen brightness, due to the high heterogeneity in the way people perceive 3D images and what different individuals find comfortable, so having this parameter locked with no way of changing it will create a huge and entirely unnecessary problem for many users.
[quote name='ds445' date='02 December 2011 - 10:38 AM' timestamp='1322818693' post='1336319']
Alright, so perhaps "sloppy" is a strong word to be used here - in any case, the fact that 3D performance is practically unaffected by changing the detail settings (even going down to medium or low settings) very strongly suggests to me that the implementation is suboptimal. If you want to be technical, the performance loss on "low" settings when going from 2D to 3D is about 85%, going from 240fps to 35fps - being a programmer myself, I'd be very interested in your explanation of how that would happen with a proper and clean implementation?[/QUOTE]
Well, not having BF3 myself, I can only work with what you're saying.
What happens if you start the game with S3D and then disable it in the game using Ctrl-T? What happens if you use 2D rendering, but force VSync using the driver and/or the in-game setting (if there is any)?
[quote name='ds445' date='02 December 2011 - 10:38 AM' timestamp='1322818693' post='1336319']Let me reiterate the point that the "correct" convergence by the developer does NOT seem to be anywhere near what people would like to use, and there is no point in denying people the possibility of adjusting it to their own preference. If the option to do so were put somewhere in an "advanced settings" menu and users were given a warning before changing it, then I very much doubt that people would be complaining about it. [/quote]Well, I don't know what you mean by "people". You certainly have a valid opinion, and I might even share it - but it's by no means a general consensus. As I told you, I agree that it would be better if we could change the convergence, but I don't see why this would be necessary to qualify a game for being 3D Ready.
[quote name='ds445' date='02 December 2011 - 10:38 AM' timestamp='1322818693' post='1336319']I wouldn't feel this strongly about the issue if I didn't have the immediate comparison of Skyrim (which provides a 3D experience on a completely different level of quality due to me being able to choose the CORRECT convergence for myself, not the "correct" convergence that the developer chose for some reason) and Battlefield 3, which provides a much less satisfying and more limited 3D experience due to the aforementioned points and hence should not be rated "3D Vision Ready".[/quote]I see your point, and I agree that there are many examples where 3DReady games, or games where the developer actively integrates 3D into their title, end up worse than games where the developer couldn't care less about S3D (Crysis 2, Deus Ex, and now obviously BF3).
But there are also examples of the opposite, like Witcher 2 (even though it took a while to get there).
Still, the label "3DVision" doesn't imply that the 3D experience will be perfect (even though that's certainly what nVidia likes to imply). It just means that the developer put some thought into S3D, and that there was some consulting done by nVidia. In the end, it's still entirely the developer's responsibility how the game looks. The nVidia developers might not like the result either, especially if the game's developers don't follow their recommendations (I can almost feel their frustration sometimes). But ultimately it's better for nVidia and for 3DVision as a brand name if there are more games out there with that label, even if they are not perfect.
Most users will play Crysis2 in S3D and never realize that it could be better. So do you think it would have been better if they didn't put that logo on the package, just because some people don't like the implementation?
[quote name='ds445' date='02 December 2011 - 10:38 AM' timestamp='1322818693' post='1336319']Since the large majority of people seem to think (and I completely agree with them) that convergence is just as important as depth for the 3D experience...[/quote]Be careful about assuming you know what the majority of people think. Even here in the forum I wouldn't agree that the majority of people even KNOW how to adjust convergence correctly (the most active writers certainly do, but they're by no means the majority of users in this forum!). And that's not even considering all those people who are using S3D but never bothered to look into this forum.
Or do you really think that the odd 100 users actively writing here are the only people using 3DVision all over the world?
[quote name='ds445' date='02 December 2011 - 10:38 AM' timestamp='1322818693' post='1336319']I do believe that being able to adjust convergence to one's own preference should be a prerequisite for the very highest level of certification as "3D Vision Ready" - I'd have no problem with NVIDIA rating the game "Fair" or even "Good", but "3D Vision Ready" is simply a marketing ploy and takes away strongly from the integrity of NVIDIA's rating system.[/quote]3DReady means that the game is supposed to work in S3D out of the box, because the developer had 3DVision in mind during its development. Not less but also not more.
[quote name='ds445' date='02 December 2011 - 10:38 AM' timestamp='1322818693' post='1336319']Grestorn, would you please be so kind as to tell me what part of that argument precisely you disagree with and why? Your argument hinges critically on the assumption that there is such a thing as a single "correct" convergence parameter for everyone which the developer should choose.[/quote]No, I don't say that there's one correct convergence, and I never said that. I said that the developer has a default convergence he thinks suits his game best. You might disagree, and I might disagree, too. So if the developer has the budget and foresight to let the user change that, it certainly is a big plus for the game. But it's just not a necessity for a game to receive the title "3DReady", because that's not what this label implies.
[quote name='ds445' date='02 December 2011 - 10:38 AM' timestamp='1322818693' post='1336319']I strongly disagree with this - convergence is inherently much more subjective than other parameters such as FOV or screen brightness, due to the high heterogeneity in the way people perceive 3D images and what different individuals find comfortable, so having this parameter locked with no way of changing it will create a huge and entirely unnecessary problem for many users.
[/quote]Well, I can point you to myriad threads where people complain about the "wrong" FoV in games. There are even many tools which force the game to use another FoV. Would you deny those games the label "Direct3D compatible" because of that?
Grestorn, thanks for your reply! I'm starting to think that the answer is probably that we have different ideas concerning what "3D Vision Ready" means:
I'm under the impression that NVIDIA rates all 3D Vision compatible games on a five-point scale of "Poor" - "Fair" - "Good" - "Excellent" - "3D Vision Ready". Have a look at the [url="http://www.nvidia.com/object/3d-vision-games.html"]3D Vision 'Featured Games' page[/url], where in the dropdown menu for "3D Vision Ratings" these are your five choices, and they seem to correspond directly with the one to five stars awarded to games. This to me implies that any game from a rating of "Fair" on is compatible with 3D Vision, and as "3D Vision Ready" is a notch above even "Excellent", any game rated "3D Vision Ready" would have a near-flawless stereo implementation that is among the best in class.
[b]Unless I'm mistaken (and please correct me if I am), "3D Vision Ready" is NOT the same as "3D Vision compatible" but instead the highest rating awarded to a subset of all games that are "3D Vision compatible".[/b]
If "3D Vision Ready" were to simply mean that the game is expected to work in S3D out of the box, as you suggested, then I would have no problem with Battlefield 3 being rated "3D Vision Ready", I agree with you there; however, I would for instance certainly expect any game that has a 3D Vision Rating of "Good" to work out of the box already, so a rating of "3D Vision Ready" must have an additional meaning beyond that!
My point is that if the rating system is to mean something, then the rating of "3D Vision Ready" must be reserved for the games that truly deliver an amazing stereo experience, and that is true for instance for Skyrim (or also FIFA 12, which I love to play in 3D), but most certainly not at present for Battlefield 3, which should not hold a rating higher than "Good". This is not about whether Battlefield 3 is "3D Vision compatible", but about whether it should deserve the very highest ranking concerning the 3D Vision implementation.
I don't think that the prerequisite for having a feature be adjustable is that the majority of users be able to perfectly and consistently choose the correct setting: for reference, [url="http://www.panzerskulls.de/sonstieges/bf3/settings/graphics-quality-options.jpg"]have a look[/url] at the adjustable graphics options for BF3 - do you really think the average user will know how to correctly choose the anisotropic filter or the amount of deferred antialiasing? Of course not, but the point of having an advanced menu is letting users choose the settings they want, and it is simply incomprehensible to me why one of the single most important (and highly subjective) settings for the quality of the 3D experience is locked at an arbitrary value!
You could easily make the argument that all of the graphics settings should be locked down then so that people don't accidentally choose values that lower their graphics quality or framerate drastically and then go on complaining about that in online forums, or as I suggested before that the developers simply fix a brightness and volume setting at which they think the game is to be played - again, what is the argument for why precisely only convergence should be locked?
Concerning your other question - I've tried BF3 with and without VSync in 2D and 3D and disabling S3D in-game: 2D consistently runs at a framerate of above 200fps on lowest settings (at 1280x720) (measured without VSync, of course) and a solid 55fps on Ultra, but never gets above 35fps in 3D on any detail settings (even without any form of Anti-Aliasing and everything turned down to bare minimum), and the 3D performance seems almost independent of the detail setting, so something isn't quite right here.
[quote name='photios' date='01 December 2011 - 09:40 PM' timestamp='1322775657' post='1336025']
Guys,
I do not understand the argument here. Perhaps someone can draw it out for me clearly.
It seems like I have found something of the following:
If Convergence is locked and performance is low, then the game is not qualified to be '3D Vision ready'.
Supporting premises:
Major Premise 1: All convergence is locked by developer in BF3
Major Premise 2: Battlefield 3 suffers from 50% or more performance issues with 3D.
By modus ponens, Battlefield 3 should not be rated 3D Vision Ready.
This argument begs the question. Neither of these two premises infer the conclusion, I.E., that they fulfill a necessary condition for what counts as '3D Vision.'
As a reductio ad absurdum, one could play Resident Evil 5 or Mafia 2 with an 8800 GT and come to such a conclusion, but such is false. Convergence is simply a matter of preference and not a sine qua non of 3D vision, and secondly performance is always a one way arrow of time and only denotes a relative position. Performance is indifferent to the quality of 3D, which is a function of the game engine and the driver.
[/quote]
Photios, I just reread this and I think the problem is that you are also misunderstanding what "3D Vision Ready" and "3D Vision compatible" mean - perhaps NVIDIA should be more clear on this, and I certainly agree that a game with locked convergence and bad performance can still be considered "3D Vision compatible" (which it certainly is by definition of compatibility) but should not hold the highest possible rating of "3D Vision Ready" - does that answer your question?
As it stands, NVIDIA 3D Vision is being marketed heavily by NVIDIA and is a very expensive consumer technology - I decided to go with a GTX 580 for $600 instead of a GTX 560 for maybe $200 (which would actually still run Battlefield 3 better in 2D than my GTX 580 does in 3D), as well as an additional $40 for the NVIDIA 3DTV Play utility, and let's not forget the $800 for the Sony HMZ-T1 headset to go with it (or alternatively, a couple of hundred dollars for a decent 3D Vision compatible monitor).
All in all, we're easily talking about up to $1000 in up front additional investment based on marketing claims by NVIDIA and DICE that Battlefield 3 was "3D Vision Ready" and was given a full five stars by NVIDIA and featured prominently on the 3D Vision homepage. I've 'appreciated' NVIDIA's contribution in the only way that they care about me appreciating, which is by purchasing their products because of their marketing. If there were "0% chance of mouse cursors, crosshairs or HUD elements being rendered properly in 3D", as you put it, then there would also be a 0% chance of me buying NVIDIA products for a total of $640 because of NVIDIA 3D Vision, so I certainly don't owe them any extra thanks for being involved in the development of 3D Vision games - it's their big selling point that carries a huge premium, so expect to be judged on your marketing claims.
When you make those kinds of claims and get me to part with more than $1000 because of them, don't count on me shrugging it off if the game is nowhere near "3D Vision Ready" as claimed - as people have noted before, the 3D in this game is (due to bad performance issues and completely locked convergence) nowhere near comparable to the experience delivered by Skyrim, for instance (ironically a game that isn't even "3D Vision Ready"). It's a pretty blatant marketing lie, and if you try to boost sales this way, be prepared for a consumer backlash - that's just the way it is.
Secondly, the arguments for convergence being locked sound idiotic to me - so the idea is to restrict the controls to make sure that in no case the user can accidentally trigger settings that might become a problem for him? First off, given that 3D is still very much a niche market and requires a substantial investment, I think it's safe to say that 3D Vision users are on average more technologically versed than most PC gamers (or at least not substantially less versed, if you don't trust in that claim), so would we really not be able to just turn back convergence if we had indeed accidentally turned it up too high?
Think of it this way - if that's what we're going for, let's also lock down the brightness setting of every monitor so people don't accidentally turn the brightness so low that they can't properly see the picture anymore, and let's make sure to lock down all volume controls on all devices so we don't accidentally turn down the volume so low that we won't be able to hear anything anymore.
If you want to be extremely careful, then how about an "advanced settings" menu where you have to unlock convergence manually and that comes with a warning that unlocking convergence and setting it incorrectly might be detrimental to the overall 3D viewing experience?
As is, the concept is patronizing, plain stupid and takes away so much from the experience that I have no idea who thought up this idea in the first place.
NVIDIA certainly holds a certain amount of leverage over developers - how about informing developers that locked convergence is highly discouraged, not giving out any "3D Vision Ready" seals and high ratings unless developers leave convergence unlocked, or simply overriding the convergence lock in the driver, which I am fairly certain could be done rather easily if NVIDIA wanted to.
It's the same kind of problem that we have with NVIDIA not letting us adjust the screen size (and associated maximum depth) in the control panel, forcing us to use registry hacks for this instead - in trying to make things more comfortable, all they end up doing is forcing us to use hugely inconvenient workarounds instead, and in this case there doesn't even seem to be a workaround for unlocking the convergence.
I think the problem here is you seem stuck on the "3D Vision Ready" rating even with locked convergence, and you keep referring to Battlefield 3 as your key reference. Unfortunately, you should have realized that historically Nvidia has not considered control over Convergence to be a deciding metric for their 3D Vision Ready rating system, as numerous 3D Vision Ready titles have launched in the past with locked Convergence - Mafia 2, Resident Evil 5, and LA Noire, to name a few. You can argue the rating system itself is fundamentally flawed, but that won't change Nvidia's criteria for what they consider "3D Vision Ready", meaning you load the game up, you adjust depth 0-100 to your liking, and 3D just works without making you googly-eyed.
As for the reasoning being idiotic, that's certainly your opinion, but Nvidia actually has numerous controls in place to try and make the end-user experience as seamless as possible without requiring any significant user intervention. Your screen-size example is just one of them. Exposing some controls like the advanced 3D keys or SLI/AA flags does seem contradictory at times and leaves us wanting more, but overall it's pretty clear their corporate guidance steers them away from giving us access to additional control/functionality. In the end, one control (depth) is relatively benign, while the other (convergence) can completely break the 3D experience, leave a very negative impression of 3D Vision, and generate more customer service calls; depth will at worst produce a complaint that the 3D looks too flat. See the difference there? All it would take is a set-up in a board room demoing the game in 3D: have one guy play with the Convergence controls and say "whoa, what was that? My 3D just broke! Yeah, turn off whatever control that was."
But yeah, I think you overestimate the amount of influence Nvidia has with developers, since we can't even get most devs to fix minor 3D issues or expose controls to turn off problematic 2D effects in games. Nvidia is the one walking hat in hand looking for support of a niche feature that a tiny % of users will benefit from, and for the most part, they do an excellent job of it in supported 3D Vision titles. I've already agreed they could do a better job of expressing the importance of Convergence controls to their dev partners, but they have no card to play in threatening to pull 3D Vision support, because if push came to shove, the devs would just tell them to go whistle.
-=HeliX=- Mod 3DV Game Fixes
My 3D Vision Games List Ratings
Intel Core i7 5930K @4.5GHz | Gigabyte X99 Gaming 5 | Win10 x64 Pro | Corsair H105
Nvidia GeForce Titan X SLI Hybrid | ROG Swift PG278Q 144Hz + 3D Vision/G-Sync | 32GB Adata DDR4 2666
Intel Samsung 950Pro SSD | Samsung EVO 4x1 RAID 0 |
Yamaha VX-677 A/V Receiver | Polk Audio RM6880 7.1 | LG Blu-Ray
Auzen X-Fi HT HD | Logitech G710/G502/G27 | Corsair Air 540 | EVGA P2-1200W
[quote]Hi all
Re. convergence - why is it so important? I don't really use it (I always use 3D though).[/quote]
Convergence can have as big an effect on the quality/impact of the 3D as adjusting the depth does.
i7-6700k @ 4.5GHz, 2x 970 GTX SLI, 16GB DDR4 @ 3000mhz, MSI Gaming M7, Samsung 950 Pro m.2 SSD 512GB, 2x 1TB RAID 1, 850w EVGA, Corsair RGB 90 keyboard
[quote]Hi all
Re. convergence - why is it so important? I don't really use it (I always use 3D though).[/quote]
It's really something you need to tinker with yourself to fully grasp, because, similar to our discussion about settings and potential problems, every single person's taste in 3D and what they find comfortable or ideal is going to be different. Also, your particular 3D set-up, how far away you're sitting from the TV, etc. are all going to impact your 3D experience at the same settings as another person. Not a single person on these forums will have the exact same settings for both Convergence and Depth, which cuts both ways in the argument over who should control Convergence.
I would say first read the MTBS article linked below as a primer, as it's probably the best concise write-up I've seen with illustrations; what it doesn't describe quite as well is the impact that Convergence can have on the actual 3D effect:
http://www.mtbs3d.com/index.php?option=com_content&view=article&id=6197&Itemid=98&limitstart=2
Then, try it out in a few games that have convergence unlocked. Make sure to enable it in the Nvidia control panel under advanced settings first. Generally, Ctrl-F6 moves the zero-parallax/neutral point/screen-depth/2D plane towards you, and Ctrl-F5 moves it further into the scene.
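To make the neutral-point idea a bit more concrete, here's a toy sketch (my own illustration with made-up numbers, not actual driver code) of how automatic stereo separation is usually described: each eye's image is shifted horizontally in proportion to how far an object lies from the convergence distance, so whatever sits exactly at that distance lands at screen depth.
[code]
# Toy model only - "separation" and "convergence" values are arbitrary examples.
separation = 0.05    # stands in for the depth slider
convergence = 10.0   # stands in for the convergence setting

def parallax(depth_w):
    """Horizontal offset for the right eye (the left eye gets the opposite sign)."""
    return separation * (depth_w - convergence)

for depth_w in (2.0, 10.0, 50.0):
    p = parallax(depth_w)
    if p < 0:
        placement = "pops out of the screen"
    elif p == 0:
        placement = "screen depth (neutral point)"
    else:
        placement = "behind the screen"
    print(f"object at distance {depth_w:>5}: offset {p:+.2f} -> {placement}")
[/code]
That moving neutral point is what the Ctrl-F5/F6 bindings above are shifting.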
I went ahead and took some screenshots for you though to help illustrate the problem, again, what you see and what I see may be very different in results or comfort level, but you should still be able to see the differences between each screenshot. All screenshots were taken with ~50% Depth and variable Convergence settings:
[list=1]
[*][b]Screenshot 1[/b]: This shows Depth but low Convergence, meaning the neutral point is set at the same depth as distant objects, resulting in a very flat 3D environment, slightly sunk into the screen. If you peek over your glasses, you can see there is good separation between objects, but the separation is too consistent between far and near objects to give a good sense of depth. Instead it seems like a 2D image in stereo. This is what Deus Ex Human Revolution is like, and there's really no way to adjust the image to make it better.
[*][b]Screenshot 2[/b]: This image shows Depth with higher Convergence. If you peek over your glasses you can see I tried setting my neutral point to Batman himself. As a result, Batman appears in "2D" or at screen depth, and any objects in front of him have a pop-out effect. You should immediately notice, however, that all objects appear far more "3D" than in the first screenshot.
[*][b]Screenshot 3[/b]: This image shows Depth with high Convergence. Adjusting convergence any further starts breaking focus on objects in front of the neutral point and makes viewing very uncomfortable or cross-eyed if you try to focus on the near-field objects. However, you can see again how every object has a more robust, almost synthetic/holographic quality to it. The neutral point is somewhere in between Batman and the bushes behind him.
[/list]
I think these should give a pretty good indication of what Convergence is capable of in terms of controlling your viewing experience, but again, it's really something you need to tinker with yourself. Even after tinkering with the settings for a bit, you may not fully grasp all of the concepts (I still don't), but you will at least get a better understanding of what 3D settings you prefer. After playing with it you may soon also find yourself wanting control over Convergence even more, so maybe it's better if you don't mess around with it too much.
And to be fair, Battlefield 3 is not as bad as the 1st screenshot or Deus Ex (worst possible example); its convergence is locked somewhere in between the 1st and 2nd screenshot settings.
[quote name='photios' date='01 December 2011 - 09:40 PM' timestamp='1322775657' post='1336025']
Guys,
I do not understand the argument here. Perhaps someone can draw it out for me clearly.
It seems like I have found something of the following:
If Convergence is locked and performance is low, then the game is not qualified to be '3D Vision ready'.
Supporting premises:
Major Premise 1: All convergence is locked by developer in BF3
Major Premise 2: Battlefield 3 suffers from 50% or more performance issues with 3D.
By modus ponens, Battlefield 3 should not be rated 3D Vision Ready.
This argument begs the question. Neither of these two premises infer the conclusion, I.E., that they fulfill a necessary condition for what counts as '3D Vision.'
As a reductio ad absurdum, one could play Resident Evil 5 or Mafia 2 with an 8800 GT and come to such a conclusion, but such is false. Convergence is simply a matter of preference and not a sine qua non of 3D vision, and secondly performance is always a one way arrow of time and only denotes a relative position. Performance is indifferent to the quality of 3D, which is a function of the image and the driver.
[/quote]
You're technically correct, well observed - in order to spell things out more clearly, my argument is more empirical than strictly logical:
Empirical observation #1: Convergence seems to be a very essential ingredient for the quality of the 3D experience. This is backed up both by my own observation (by comparing Skyrim with unlocked convergence to Battlefield 3 with locked convergence, for instance) and by a large number of people in this thread complaining about the shallow depth provided by Battlefield 3.
Empirical observation #2: While technically you could make a point that performance is indifferent to the quality of (static) 3D, that argument is a mere technicality as people do not seem interested in only the static quality of 3D but rather the overall 3D experience, a term which encompasses both the 3D quality of a static image as well as the fluidity of the moving picture, much like the term "video quality" refers to not only a static image but also to concepts such as the absence of motion artifacts and stuttering.
Logical Argument #1: For every individual frame of the game, CPU and GPU cycles are dedicated to the game logic (such as physics and AI) as well as to 3D operations in world space (such as transforming vertices according to model view matrices) and 3D operations in screen space (such as converting world vertex coordinates to screenspace coordinates and running fragment shaders on every pixel on screen). If a game is to be rendered in 3D then for every frame, the 3D operations in screen space must be performed twice (once for each eye) while the game logic and 3D operations in world space must be performed only once. It follows logically that the time needed to render the two frames for both eyes can hence not be longer than twice the time needed for a single frame.
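A minimal cost model of that argument, with purely hypothetical per-frame timings, just to make the bound explicit:
[code]
# Illustrative numbers only - the point is the structure, not the magnitudes.
shared_ms = 2.0    # game logic, physics, AI, world-space work: done once per frame
per_eye_ms = 4.0   # screen-space rendering (rasterization, fragment shading): done per eye

mono_ms = shared_ms + per_eye_ms          # one view rendered
stereo_ms = shared_ms + 2 * per_eye_ms    # the view pass runs twice, the shared work does not

print(f"mono:   {1000 / mono_ms:.0f} fps")
print(f"stereo: {1000 / stereo_ms:.0f} fps")
print(f"frame-time ratio: {stereo_ms / mono_ms:.2f} (can never exceed 2.0)")
[/code]
Even with the shared portion set to zero, the ratio only reaches 2.0, which is why a drop well beyond 50% points to something other than the extra eye.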
Empirical observation #3: Logical Argument #1 is supported by the vast majority of all games, in which 3D performance is never worse than 50% of 2D performance for the same resolution.
Argument #1: If NVIDIA is to provide a viable and informative rating system, then the 3D Vision Rating of any game should correlate highly with people's perceived 3D experience in that game. It's true that there is no commonly accepted definition of what the 3D Vision rating of any game truly means in measurable terms, but the way in which I assume most people would interpret the rating system is the following: The higher a game's rating, the higher any given group of gamers would rate the 3D experience delivered by the game. It's certainly the impression that NVIDIA are trying to evoke with the ratings, and as this is the only thing gamers would be interested in, I would assume that this is what the ratings are intended to convey.
I've actually not found a single clear definition of the ratings, but I am aware that the possible levels are "Not Recommended", "Poor", "Fair", "Good", "Excellent" and "3D Vision Ready".
Empirical Observation #4: People are not happy with the 3D quality provided by Battlefield 3 (as evidenced by the reactions in this thread, for instance).
Now, Argument #1 (that the 3D Vision Rating should correlate highly with people's perceived quality of the 3D experience) together with Empirical Observation #4 suggests that assigning a 3D Vision Rating of "3D Vision Ready" (which is more than even a humble "Excellent") is a questionable decision.
Empirical Observation #1 (that convergence is inherently important to 3D quality) and Empirical Observation #2 (that performance is important for the quality of the overall 3D experience) back up this conclusion as two major points that are seemingly very important to perceived 3D quality have been neglected here.
You make a good point that one could infer the same conclusion from trying to play 3D Vision Rated games with vastly underpowered hardware, and indeed it would not be fair to chastise NVIDIA or DICE if the performance problems were unavoidable and hence would have to be tolerated.
However, Logical Argument #1 (that the fps hit should not be greater than 50%) backed up by Empirical Observation #3 (that this holds true for virtually all games) strongly suggests that the 3D implementation in Battlefield 3 is sloppy at best. Whether this is due to DICE or NVIDIA's drivers is irrelevant at this point - the overall experience falls well short of what could reasonably be expected.
Due to all these points, I'm merely pointing out that it is most questionable that Battlefield 3 should receive a rating of "3D Vision Ready". This raises the question of what type of implementation would be considered merely "Excellent" or even "Good", as well as the question of what the rating of the game would have been if these problems had been fixed.
TL;DR: Given that people care about having convergence unlocked, and that the frame rate problems suggest sloppy programming or a bug rather than a technical limitation, under no conceivable circumstances would a meaningful rating system assign the highest possible rating to this game.
Guys,
I do not understand the argument here. Perhaps someone can draw it out for me clearly.
It seems like I have found something of the following:
If Convergence is locked and performance is low, then the game is not qualified to be '3D Vision ready'.
Supporting premises:
Major Premise 1: All convergence is locked by developer in BF3
Major Premise 2: Battlefield 3 suffers from 50% or more performance issues with 3D.
By modus poenens, Battlefield 3 should not be rated 3D Vision Ready.
This argument begs the question. Neither of these two premises infer the conclusion, I.E., that they fulfill a necessary condition for what counts as '3D Vision.'
As a reductio ad absurdum, one could play Resident Evil 5 or Mafia 2 with an 8800 GT and come to such a conclusion, but such is false. Convergence is simply a matter of preference and not a sine qua non of 3D vision, and secondly performance is always a one way arrow of time and only denotes a relative position. Performance is indifferent to the quality of 3D, which is a function of the image and the driver.
You're technically correct, well observed - in order to spell things out more clearly, my argument is more empirical than strictly logical:
Empirical observation #1: Convergence seems to be an essential ingredient for the quality of the 3D experience. This is backed up both by my own observation (by comparing Skyrim with unlocked convergence to Battlefield 3 with locked convergence, for instance) and by a large number of people in this thread complaining about the shallow depth provided by Battlefield 3.
Empirical observation #2: While technically you could make the point that performance is indifferent to the quality of (static) 3D, that argument is a mere technicality: people are not interested only in the static quality of 3D but in the overall 3D experience, a term which encompasses both the 3D quality of a static image and the fluidity of the moving picture, much like "video quality" refers not only to a static image but also to the absence of motion artifacts and stuttering.
Logical Argument #1: For every individual frame of the game, CPU and GPU cycles are dedicated to the game logic (such as physics and AI), to 3D operations in world space (such as transforming vertices according to model-view matrices), and to 3D operations in screen space (such as converting world vertex coordinates to screen-space coordinates and running fragment shaders on every pixel on screen). If a game is to be rendered in 3D, then for every frame the 3D operations in screen space must be performed twice (once for each eye), while the game logic and the 3D operations in world space need only be performed once. It follows that the time needed to render the two frames for both eyes cannot be longer than twice the time needed for a single frame.
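To put rough numbers on Logical Argument #1, here is a minimal sketch (Python, with purely hypothetical per-frame costs) of why the stereo hit should not exceed 50% under this model: if only the screen-space work is duplicated per eye, the stereo frame can never take more than twice as long as the mono frame.
[code]
# Minimal sketch of Logical Argument #1 with purely hypothetical per-frame costs (ms).
# Assumption: game logic and world-space work are shared between both eyes,
# and only the screen-space rendering is done twice.
game_logic_ms   = 3.0   # physics, AI, scripting (hypothetical)
world_space_ms  = 2.0   # vertex/world-space work (hypothetical)
screen_space_ms = 5.0   # rasterisation, fragment shading per eye (hypothetical)

mono_ms   = game_logic_ms + world_space_ms + screen_space_ms
stereo_ms = game_logic_ms + world_space_ms + 2 * screen_space_ms

print("mono:   %.1f fps" % (1000.0 / mono_ms))      # 100.0 fps
print("stereo: %.1f fps" % (1000.0 / stereo_ms))    #  66.7 fps
# stereo_ms <= 2 * mono_ms as long as the shared work is non-negative,
# so stereo fps can never fall below 50% of mono fps under this model.
[/code]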
Empirical observation #3: Logical Argument #1 is supported by the vast majority of all games, in which 3D performance is never worse than 50% of 2D performance for the same resolution.
Argument #1: If NVIDIA is to provide a viable and informative rating system, then the 3D Vision Rating of any game should correlate highly with people's perceived 3D experience in that game. It's true that there is no commonly accepted definition of what the 3D Vision rating of any game truly means in measurable terms, but the way in which I assume most people would interpret the rating system is the following: The higher a game's rating, the higher any given group of gamers would rate the 3D experience delivered by the game. It's certainly the impression that NVIDIA are trying to evoke with the ratings, and as this is the only thing gamers would be interested in, I would assume that this is what the ratings are intended to convey.
I've actually not found a single clear definition of the ratings, but I am aware that the possible levels are "Not Recommended", "Poor", "Fair", "Good", "Excellent" and "3D Vision Ready".
Empirical Observation #4: People are not happy with the 3D quality provided by Battlefield 3 (as evidenced by the reactions in this thread, for instance).
Now, Argument #1 (that the 3D Vision Rating should correlate highly with people's perceived quality of the 3D experience) together with Empirical Observation #4 suggests that assigning a 3D Vision Rating of "3D Vision Ready" (which is more than even a humble "Excellent") is a questionable decision.
Empirical Observation #1 (that convergence is inherently important to 3D quality) and Empirical Observation #2 (that performance is important for the quality of the overall 3D experience) back up this conclusion as two major points that are seemingly very important to perceived 3D quality have been neglected here.
You make a good point that one could infer the same conclusion from trying to play 3D Vision Rated games with vastly underpowered hardware, and indeed it would not be fair to chastise NVIDIA or DICE if the performance problems were unavoidable and hence would have to be tolerated.
However, Logical Argument #1 (that the fps hit should not be greater than 50%) backed up by Empirical Observation #3 (that this holds true for virtually all games) strongly suggests that the 3D implementation in Battlefield 3 is sloppy at best. Whether this is due to DICE or NVIDIA's drivers is irrelevant at this point - the overall experience falls well short of what could reasonably be expected.
Due to all these points, I'm merely pointing out that it is most questionable that Battlefield 3 should receive a rating of "3D Vision Ready". This raises the question of what type of implementation would be considered merely "Excellent" or even "Good", as well as the question of what rating the game would have received had these problems been fixed.
TL;DR If people care about having convergence unlocked, and the frame rate problems suggest sloppy programming or a bug, but not a technical limitation, then under no conceivable circumstances would a relevant rating system assign the highest possible rating to this game.
You should be careful to call anything "sloppy" unless you know the details. To claim something has been done sloppily just by looking at the end result is, well, sloppy.
The reason why many games use effects which are not rendered at the correct depth in S3D mode is that they're using shortcuts to improve performance in normal "2D" rendering. And if an in-game S3D mode happens to be a bit slower than 50% of the 2D performance, it's most likely because those shortcuts can no longer be used, since everything has to be rendered at the correct depth.
That's not "sloppy"; it's just the result of getting the desired effect with the correct rendering. I don't like people calling programmers sloppy without even a hint of knowledge about what they're talking about (I'm a programmer myself).
And regarding Convergence:
It's not that easy for an inexperienced person to find the correct convergence setting. That's why nVidia tries to put a reasonable default convergence setting in the profiles for the games, and why they disable the convergence controls by default.
Only someone who actually understands how convergence works should try to change it; otherwise he will get a very unsatisfactory result and will most likely - again - start to complain about the sloppiness of the developers (failing to understand that he himself screwed up). You see that here in the forum all the time.
The developer of the 3D ready game put some time into finding the correct convergence for how he wanted the game to be presented, just like he found the correct FoV and all the other rendering parameters, which are usually not open to change. I agree that it would be nice if the developer allowed the user to change the preset (many games allow that, via .ini files or the command console), but that's hardly a prerequisite for calling a game 3D ready.
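As an aside on what the convergence number actually controls, here is a small sketch. It assumes the commonly quoted driver-side stereo offset formula, separation * (W - convergence) per eye, where W is the vertex depth; that formula is my assumption for illustration, not something stated in this thread. Under it, convergence simply picks the depth that lands exactly at screen level, which is why the chosen value changes the whole feel of the image.
[code]
# Sketch under an assumed stereo offset formula (not taken from this thread):
# per-eye horizontal offset = separation * (W - convergence), W = vertex depth.
# Vertices at W == convergence get zero offset and appear at screen depth;
# nearer ones pop out of the screen, farther ones recede behind it.
def eye_offset(depth_w, separation, convergence):
    return separation * (depth_w - convergence)

separation = 0.05                  # hypothetical depth-wheel value
for convergence in (1.0, 5.0):     # two hypothetical convergence presets
    print("convergence = %.1f" % convergence)
    for w in (0.5, 1.0, 5.0, 50.0):
        print("  depth %5.1f -> offset %+7.3f" % (w, eye_offset(w, separation, convergence)))
# With a low convergence almost everything sits behind the screen and looks flat;
# a higher convergence moves the zero-parallax plane deeper, so near objects pop out.
# That is exactly the per-user preference being argued about in this thread.
[/code]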
I REALLY hope they do something about it, otherwise it's unplayable.
Alright, so perhaps "sloppy" is a strong word to use here - in any case, the fact that 3D performance is practically unaffected by changing the detail settings (even going down to medium or low) very strongly suggests to me that the implementation is suboptimal. If you want to be technical, the performance loss on "low" settings when going from 2D to 3D is about 85%, going from 240fps to 35fps - being a programmer myself, I'd be very interested in your explanation of how that could happen with a proper and clean implementation.
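For reference, the arithmetic behind that 85% figure, checked against the 50% bound from the earlier argument (a trivial sketch using the fps numbers quoted above):
[code]
# Checking the quoted numbers: 240 fps in 2D vs 35 fps in S3D on low settings.
fps_2d = 240.0
fps_3d = 35.0

loss = 1.0 - fps_3d / fps_2d
print("observed loss going 2D -> 3D: %.0f%%" % (100.0 * loss))       # ~85%

# Under the "only screen-space work is duplicated" model argued earlier,
# the worst case would still leave half the 2D frame rate:
print("worst-case S3D fps under that model: %.0f" % (fps_2d / 2.0))  # 120
[/code]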
Let me reiterate the point that the "correct" convergence by the developer does NOT seem to be anywhere near what people would like to use, and there is no point in denying people the possibility of adjusting it to their own preference. If the option to do so were put somewhere in an "advanced settings" menu and users were given a warning before changing it, then I very much doubt that people would be complaining about it.
I wouldn't feel this strongly about the issue if I didn't have the immediate comparison of Skyrim (which provides a 3D experience on a completely different level of quality due to me being able to choose the CORRECT convergence for myself, not the "correct" convergence that the developer chose for some reason) and Battlefield 3, which provides a much less satisfying and more limited 3D experience due to the aforementioned points and hence should not be rated "3D Vision Ready". Since the large majority of people seem to think (and I completely agree with them) that convergence is just as important as depth for the 3D experience, I do believe that being able to adjust convergence to one's own preference should be a prerequisite for the very highest level of certification as "3D Vision Ready" - I'd have no problem with NVIDIA rating the game "Fair" or even "Good", but "3D Vision Ready" is simply a marketing ploy and takes away strongly from the integrity of NVIDIA's rating system.
In any case, I've laid out my argument in great detail above - Grestorn, would you please be so kind as to tell me what part of that argument precisely you disagree with and why? Your argument hinges critically on the assumption that there is such a thing as a single "correct" convergence parameter for everyone which the developer should choose. I strongly disagree with this - convergence is inherently much more subjective than other parameters such as FOV or screen brightness, due to the high heterogeneity in the way people perceive 3D images and what different individuals find comfortable, so having this parameter locked with no way of changing it will create a huge and entirely unnecessary problem for many users.
Well, not having BF3 myself, I can only work with what you're saying.
What happens if you start the game with S3D and then disable it in the game using Ctrl-T? What happens if you use 2D rendering, but force VSync using the driver and/or the in-game setting (if there is any)?
[quote name='ds445' date='02 December 2011 - 10:38 AM' timestamp='1322818693' post='1336319']Let me reiterate the point that the "correct" convergence by the developer does NOT seem to be anywhere near what people would like to use, and there is no point in denying people the possibility of adjusting it to their own preference. If the option to do so were put somewhere in an "advanced settings" menu and users were given a warning before changing it, then I very much doubt that people would be complaining about it. [/quote]Well, I don't know what you mean by "people". You certainly have a valid opinion, and I might even share it - but it's by no means a general consensus. As I told you, I agree that it would be better if we could change the convergence, but I don't see why this would be necessary to qualify a game for being 3D Ready.
[quote name='ds445' date='02 December 2011 - 10:38 AM' timestamp='1322818693' post='1336319']I wouldn't feel this strongly about the issue if I didn't have the immediate comparison of Skyrim (which provides a 3D experience on a completely different level of quality due to me being able to choose the CORRECT convergence for myself, not the "correct" convergence that the developer chose for some reason) and Battlefield 3, which provides a much less satisfying and more limited 3D experience due to the aforementioned points and hence should not be rated "3D Vision Ready".[/quote]I see your point, and I agree that there are many examples where, with 3DReady games or games where the developer actively integrated 3D, the end result is worse than with games where the developer couldn't care less about S3D (Crysis2, Deus Ex, and now obviously BF3).
But there are also examples of the opposite, like Witcher 2 (even though it took a while to get there).
Still, the label "3DVision" doesn't imply that the 3D experience will be perfect (even though that's certainly what nVidia likes to imply). It just means that the developer put some thought into S3D, and that there was some consulting done by nVidia. In the end, it's still absolutely the developer's responsibility how the game will look. The nVidia developers might not like it either, especially if the developers of the game don't follow their recommendations (I can almost feel their frustration sometimes). But in the end, it's better for nVidia and for 3DVision as a brand name if there are more games out there with that label, even if they are not perfect.
Most users will play Crysis2 in S3D and never realize that it could be better. So do you think it would have been better if they didn't put that logo on the package, just because some people don't like the implementation?
[quote name='ds445' date='02 December 2011 - 10:38 AM' timestamp='1322818693' post='1336319']Since the large majority of people seem to think (and I completely agree with them) that convergence is just as important as depth for the 3D experience...[/quote]Be careful about assuming you know what the majority of people think. Even here in the forum I wouldn't agree that the majority of people even KNOW how to adjust convergence correctly (the most active writers certainly do, but they're by no means the majority of users in this forum!). And that's not even considering all those people who are using S3D but have never bothered to look into this forum.
Or do you really think that the hundred-odd users actively writing here are the only people using 3DVision all over the world?
[quote name='ds445' date='02 December 2011 - 10:38 AM' timestamp='1322818693' post='1336319']I do believe that being able to adjust convergence to one's own preference should be a prerequisite for the very highest level of certification as "3D Vision Ready" - I'd have no problem with NVIDIA rating the game "Fair" or even "Good", but "3D Vision Ready" is simply a marketing ploy and takes away strongly from the integrity of NVIDIA's rating system.[/quote]3DReady means that the game is supposed to work in S3D out of the box, because the developer had 3DVision in mind during its development. Not less but also not more.
[quote name='ds445' date='02 December 2011 - 10:38 AM' timestamp='1322818693' post='1336319']Grestorn, would you please be so kind as to tell me what part of that argument precisely you disagree with and why? Your argument hinges critically on the assumption that there is such a thing as a single "correct" convergence parameter for everyone which the developer should choose.[/quote]No, I don't say that there's one correct convergence, and I never said that. I said that the developer has a default convergence he thinks suits his game best. You might disagree, and I might disagree, too. So if the developer has the budget and foresight to allow the user to change that, this certainly is a big plus for the game. But it's just not a necessity for a game to receive the title "3DReady", because that's not what this label implies.
[quote name='ds445' date='02 December 2011 - 10:38 AM' timestamp='1322818693' post='1336319']I strongly disagree with this - convergence is inherently much more subjective than other parameters such as FOV or screen brightness, due to the high heterogeneity in the way people perceive 3D images and what different individuals find comfortable, so having this parameter locked with no way of changing it will create a huge and entirely unnecessary problem for many users.
[/quote]Well, I can point you to myriad threads where people complain about the "wrong" FoV in games. There are even many tools which force the game to use another FoV. Would you deny those games the label "Direct3D compatible" because of that?
I'm under the impression that NVIDIA rates all 3D Vision compatible games on a five-point scale of "Poor" - "Fair" - "Good" - "Excellent" - "3D Vision Ready". Have a look at the [url="http://www.nvidia.com/object/3d-vision-games.html"]3D Vision 'Featured Games' page[/url], where in the dropdown menu for "3D Vision Ratings" these are your five choices, and they seem to correspond directly with the one to five stars awarded to games. This to me implies that any game from a rating of "Fair" on is compatible with 3D Vision, and as "3D Vision Ready" is a notch above even "Excellent", any game rated "3D Vision Ready" would have a near-flawless stereo implementation that is among the best in class.
[b]Unless I'm mistaken (and please correct me if I am), "3D Vision Ready" is NOT the same as "3D Vision compatible" but instead the highest rating awarded to a subset of all games that are "3D Vision compatible".
[/b]
If "3D Vision Ready" were to simply mean that the game is expected to work in S3D out of the box, as you suggested, then I would have no problem with Battlefield 3 being rated "3D Vision Ready", I agree with you there; however, I would for instance certainly expect any game that has a 3D Vision Rating of "Good" to work out of the box already, so a rating of "3D Vision Ready" must have an additional meaning beyond that!
My point is that if the rating system is to mean something, then the rating of "3D Vision Ready" must be reserved for the games that truly deliver an amazing stereo experience, and that is true for instance for Skyrim (or also FIFA 12, which I love to play in 3D), but most certainly not at present for Battlefield 3, which should not hold a rating higher than "Good". This is not about whether Battlefield 3 is "3D Vision compatible", but about whether it should deserve the very highest ranking concerning the 3D Vision implementation.
I don't think that the prerequisite for having a feature be adjustable is that the majority of users be able to perfectly and consistently choose the correct setting: for reference, [url="http://www.panzerskulls.de/sonstieges/bf3/settings/graphics-quality-options.jpg"]have a look[/url] at the adjustable graphics options for BF3 - do you really think the average user will know how to correctly choose the anisotropic filter or the amount of deferred antialiasing? Of course not, but the point of having an advanced menu is letting users choose the settings they want, and it is simply incomprehensible to me why one of the single most important (and highly subjective) settings for the quality of the 3D experience is locked at an arbitrary value!
You could easily make the argument that all of the graphics settings should be locked down then so that people don't accidentally choose values that lower their graphics quality or framerate drastically and then go on complaining about that in online forums, or as I suggested before that the developers simply fix a brightness and volume setting at which they think the game is to be played - again, what is the argument for why precisely only convergence should be locked?
Concerning your other question - I've tried BF3 with and without VSync in 2D and 3D, and with S3D disabled in-game: 2D consistently runs at above 200fps on lowest settings at 1280x720 (measured without VSync, of course) and at a solid 55fps on Ultra, but the game never gets above 35fps in 3D on any detail setting (even without any form of anti-aliasing and with everything turned down to the bare minimum). The 3D performance seems almost independent of the detail settings, so something isn't quite right here.
Photios, I just reread your post, and I think the problem is that you are also misunderstanding what "3D Vision Ready" and "3D Vision compatible" mean - perhaps NVIDIA should be clearer on this. I certainly agree that a game with locked convergence and bad performance can still be considered "3D Vision compatible" (which it certainly is, by definition of compatibility), but it should not hold the highest possible rating of "3D Vision Ready" - does that answer your question?