[quote name='whodamanxbox' post='1007999' date='Feb 25 2010, 11:42 AM']That is exactly my point. Why waste the power on frames when the human eye can't see it? Are you REALLY getting that many more kills because of it? If that were true, all online games should have a locked framerate.[/quote]
90 fps is not the limit.
[quote name='Adz 3000' post='1007989' date='Feb 25 2010, 01:28 PM']I agree with you on this, but I thought the limitation for a healthy human eye is 90fps.
Lol, I've got mates who think L4D1+2 run smooth on the Xbox (30fps). I can't stand it!!!![/quote]
I have read that the human eye can't perceive above 60 fps, and I have read the same for 90. Yet I know I can tell 120 is better than 85 (never tried 90 specifically).
I have two thoughts on these kinds of numbers. Either, one, they are just figures someone used somewhere to justify something, with no real research backing them, or, two, they are averages.
Think about it like this. 20/20 is considered perfect vision, but there are people who have better vision than 20/20. The same would apply to one's ability to perceive motion. Some people can discern details on fast-moving objects better than others.
[quote name='msm903' post='1007966' date='Feb 25 2010, 01:43 PM']I guess my question is this.
Does the driver send a pair of 3D images (in serial, not parallel) to the GPU, which renders them one after the other, or does the software send one 3D image to the GPU, which then in turn does the recalculation to create the two distinct images (left and right eyes) and renders them in serial?
The first option is less hardware- but more software-intensive. I have no idea how this is done.
The program (the game, in this case) is only outputting one 3D image at any given time; it's up to either the driver or the GPU to create and provide the two differing images.
James Cameron's Avatar may be different, come to think of it. It may actually be producing the left and right images itself.[/quote]
It must be the latter; otherwise the entire calculation process would have to rely on the CPU, yet the CPU doesn't seem to work any harder in stereo mode than it does in regular 2D mode. The calculations have to be done somewhere, which I assume is the GPU (something akin to CUDA, probably).
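To make the idea concrete, here is a minimal sketch of that second option: the game hands over one scene and one camera, and a stereo layer derives two eye views by offsetting that camera before rendering the same geometry twice. This is only an illustration of the concept discussed above, not how NVIDIA's or DDD's drivers actually do it; every function name here is made up, and real drivers also adjust the projection (off-axis frustum), which is left out to keep it short.

[code]
import numpy as np

def translation(x, y, z):
    """Build a 4x4 translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

def eye_views(view_matrix, eye_separation=0.065):
    """Derive left/right view matrices from the game's single camera.

    Moving the camera left by half the eye separation is the same as
    shifting the world right in camera space, and vice versa.
    """
    half = eye_separation / 2.0
    left_view  = translation(+half, 0.0, 0.0) @ view_matrix
    right_view = translation(-half, 0.0, 0.0) @ view_matrix
    return left_view, right_view

def render_stereo_frame(scene, view_matrix, render):
    """Render the same scene twice -- once per eye -- in serial on the GPU."""
    left_view, right_view = eye_views(view_matrix)
    left_image  = render(scene, left_view)    # pass 1: left eye
    right_image = render(scene, right_view)   # pass 2: right eye
    return left_image, right_image
[/code]

The point of the sketch is just that the game only ever supplies one view; the doubling happens downstream, which is why the CPU load barely changes while the GPU does roughly twice the work.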
[quote name='MistaP' post='1008009' date='Feb 25 2010, 08:58 PM']I have read that the human eye can't perceive above 60 fps, and I have read the same for 90. Yet I know I can tell 120 is better than 85 (never tried 90 specifically).
I have two thoughts on these kinds of numbers. Either, one, they are just figures someone used somewhere to justify something, with no real research backing them, or, two, they are averages.
Think about it like this. 20/20 is considered perfect vision, but there are people who have better vision than 20/20. The same would apply to one's ability to perceive motion. Some people can discern details on fast-moving objects better than others.[/quote]
You can't put a number on the frame rate the eye sees.
The problem is that the human eye works continuously. If you wanted to give an absolute framerate for the eye, you would have to measure the rate at which the optic nerves send electric pulses to the brain. This value varies with the brightness the eye detects. I do not know the highest value it can reach, but it is extremely high [not science](it's over nine thousaaaand)[/not science].
However, this value has nothing to do with how much information (framerate plus the complexity of the frames you watch) your brain can actually process, which also varies with your attention. Motion perception is even more complex: some anime broadcasts are acceptable with animation as low as 15fps, and cinema can appear smooth even at 24fps, yet if you focus on your cursor on a 120Hz monitor you can still notice the jittery motion.
When people say 24fps is enough, 30fps is enough, 60fps is enough, 100fps is enough, etc., it has nothing to do with how the human eye or brain works. It's always a cost/benefit ratio: how much does it cost me to go higher, how much will I have to charge my clients for 60fps/120fps, is my client ready to spend that much money on his console/PC/DVD/Blu-ray/satellite-cable decoder/HDTV, and if I'm happy with 60, why would I bother going higher?
Passive 3D forever
110" DIY dual-projection system
2x Epson EH-TW3500 (1080p) + Linear Polarizers (SPAR)
XtremScreen Daylight 2.0
VNS Geobox501 signal converter
[quote name='BlackSharkfr' post='1008082' date='Feb 25 2010, 01:48 PM']You can't put a number on the frame rate the eye sees.
The problem is that the human eye works continuously. If you wanted to give an absolute framerate for the eye, you would have to measure the rate at which the optic nerves send electric pulses to the brain. This value varies with the brightness the eye detects. I do not know the highest value it can reach, but it is extremely high [not science](it's over nine thousaaaand)[/not science].
However, this value has nothing to do with how much information (framerate plus the complexity of the frames you watch) your brain can actually process, which also varies with your attention. Motion perception is even more complex: some anime broadcasts are acceptable with animation as low as 15fps, and cinema can appear smooth even at 24fps, yet if you focus on your cursor on a 120Hz monitor you can still notice the jittery motion.
When people say 24fps is enough, 30fps is enough, 60fps is enough, 100fps is enough, etc., it has nothing to do with how the human eye or brain works. It's always a cost/benefit ratio: how much does it cost me to go higher, how much will I have to charge my clients for 60fps/120fps, is my client ready to spend that much money on his console/PC/DVD/Blu-ray/satellite-cable decoder/HDTV, and if I'm happy with 60, why would I bother going higher?[/quote]
Now here's a guy who knows what he's talking about.
At one end of the spectrum is detecting the shortest-duration pulse of a bright light that can be seen in a dark room; that threshold is obviously far beyond 120 fps. We don't need to get into it here, but while there are diminishing returns above 60 fps for normal motion, higher framerates definitely help.
I think more is better, even if you don't consciously notice it helping, PLUS it gives a more reliable shot. You with your 120Hz screen at 120 fps vs the enemy with his/her 60Hz screen at 60 fps could mean the difference between a solid headshot or a miss and you die. You get updates on where your enemy is twice as often as they do on you. My PC can't do anything in 2D at 120 fps, let alone 3D, but with a beefy rig I'd say it's a big enough difference to be worth the extra fps.
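For anyone who wants the rough numbers behind that "twice as often" claim, here is a quick back-of-the-envelope calculation. It only looks at the display's frame interval; input lag, netcode and everything else are ignored, so treat it as an illustration rather than a measurement.

[code]
# Frame interval for a few common refresh rates.
# Worst case, what you see on screen is about one interval old.
for hz in (60, 85, 120):
    frame_ms = 1000.0 / hz
    print(f"{hz:>3} Hz -> a new image every {frame_ms:.2f} ms")

# Output:
#  60 Hz -> a new image every 16.67 ms
#  85 Hz -> a new image every 11.76 ms
# 120 Hz -> a new image every 8.33 ms
[/code]

So a 120Hz/120fps player sees a fresh enemy position roughly every 8 ms versus every 17 ms at 60Hz, which is the 2x advantage mentioned above: small in absolute terms, but not nothing in a twitch shooter.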
[quote name='BlackSharkfr' post='1008082' date='Feb 25 2010, 03:48 PM']You can't put a number on the frame rate the eye sees.
The problem is that the human eye works continuously. If you wanted to give an absolute framerate for the eye, you would have to measure the rate at which the optic nerves send electric pulses to the brain. This value varies with the brightness the eye detects. I do not know the highest value it can reach, but it is extremely high [not science](it's over nine thousaaaand)[/not science].
However, this value has nothing to do with how much information (framerate plus the complexity of the frames you watch) your brain can actually process, which also varies with your attention. Motion perception is even more complex: some anime broadcasts are acceptable with animation as low as 15fps, and cinema can appear smooth even at 24fps, yet if you focus on your cursor on a 120Hz monitor you can still notice the jittery motion.
When people say 24fps is enough, 30fps is enough, 60fps is enough, 100fps is enough, etc., it has nothing to do with how the human eye or brain works. It's always a cost/benefit ratio: how much does it cost me to go higher, how much will I have to charge my clients for 60fps/120fps, is my client ready to spend that much money on his console/PC/DVD/Blu-ray/satellite-cable decoder/HDTV, and if I'm happy with 60, why would I bother going higher?[/quote]
I appreciate all the science, but you missed the point I was getting at. I was simply explaining that people claim this and that, but I can use myself as evidence that not everyone perceives it the same way; some will perceive more or less than others. Your explanation still agrees with my conclusion, though: not everyone is created equal.
They already have cards that render 3D just as fast as 2D. It's called a GTX 295! :) That is the only way to render two images as fast as rendering one, unless the drivers are performing some cool tricks like DDD's Virtual 3D and accessing the back buffer.
[quote name='PiXeL' post='1008186' date='Feb 25 2010, 07:14 PM']They already have cards that render 3D just as fast as 2D. It's called a GTX 295! :) That is the only way to render two images as fast as rendering one, unless the drivers are performing some cool tricks like DDD's Virtual 3D and accessing the back buffer.[/quote]
I guess it sort of is, hehe.
Well, I could see some future 3D solutions using one video card for the left eye and one for the right. I see this as being hard to achieve for shutter glasses, since SLI and CrossFire both have "microstutter" (although it is almost unnoticeable on newer cards); running a card for left and a card for right and then expecting them to stay in perfect harmony seems like it could be a challenge.
It might be more feasible for a polarized two-panel monitor setup or something weird like that.
While more frames per second are always appreciated, I'm perfectly fine with a relatively solid 30 fps myself. Right now my single GTX 275 serves me just fine despite me cranking most quality settings in my games as high as possible (except in Crysis, of course, but I don't give a rat's ass about FPS games anyway).
Well, I have the feeling a lot of misunderstanding has accumulated in this thread. Let me try to clarify a couple of things without being too teachy:
Let's start with the minimum/maximum perceivable framerate. What this usually refers to is the frame rate at which the brain interprets a sequence of individual frames as movement or flow rather than as separate pictures. This is generally agreed to start at roughly 18 fps, though it is still very stuttery. To give a better illusion, film settled on 24fps. That was in the early days, and everybody old enough to recall theater visits from the 60s will agree it flickered like hell. The cure was to double/triple each picture with a rotating shutter. So in cinemas today you still have film shot at 24fps, which (camera pans aside) gives a quite natural flow, but it is projected at 72Hz to reduce the judder and flicker. In TV (I will stick to NTSC in this example) the picture rate is 60Hz interlaced. Although the motion is very fluid, you can experience judder on content like sports on TFTs, and flickering on CRT-type TV sets.
So why this long prologue? It was meant to give practical examples everybody can verify on their own: the human visual system starts to interpret framerates from somewhere around 20fps as continuous motion, yet is perfectly capable of perceiving framerates quicker than 60 or 72Hz.
The bottom line is the already well-accepted point that for gaming a minimum of 30fps can be considered [i]sufficient[/i], while higher is never worse, especially in games that reward quick reflexes such as online shooters. (Quite a few reviews of the first-generation 3D-ready displays noted an improved Windows/desktop experience due to the more natural motion of the mouse cursor!)
The second misconception in this thread is that a minimum framerate of 120Hz would be mandatory to maintain the stereoscopic illusion. That is not true: here too, 30fps (per eye) can be considered absolutely sufficient, and it can even be less, since the brain cannot separate the left eye from the right eye (the very reason shutter glasses work at all). Only the output needs to run at 60Hz (per eye), following the same principle as the triple-shutter speed in theaters. Generally speaking, you can expect a good stereo experience if your graphics card can manage to run a game at a 50fps average. If you cannot believe this, go and watch Avatar 3D in theaters: you will see it at 24fps, tripled to 72Hz per eye. 2x 24fps is also the framerate for 3D Blu-ray; it's just the display that makes it 120/240Hz.
There is one thing to keep in mind with shutter glasses, though: lower framerates per eye can be more fatiguing and may even cause nausea in people susceptible to it - so, same here, more frames than necessary = not bad!
Now back on topic: there is nothing a game engine can do on its end to improve framerate on a large scale. The only thing it can do with a built-in S3D implementation, like the one in the Unigine engine, is be cross-compatible instead of depending on particular drivers such as NVIDIA's 3D Vision. Either way, the GPU [i]must[/i] render the geometry twice with a different parallax, which is why the frame rate drops by almost 50% compared to 2D. A little is saved by reusing the same textures from the GPU's memory, but that's basically it. Geometry-wise it is the very same workload, so there is no way an additional, dedicated S3D chip could get away with being less powerful than the main GPU. Single card, SLI, CrossFire - it doesn't matter: if you want a high framerate with modern game engines in S3D, prepare for a beefy (or "NASA-like", if you prefer the term) GPU. There is no way around that, sorry.
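If it helps, the refresh and framerate numbers from the post above can be lined up in a few lines of arithmetic. These are just the figures already quoted (120Hz shutter glasses, 24fps film triple-flashed, roughly 2x geometry work in S3D), not new measurements.

[code]
# Shutter glasses: the display alternates eyes, so each eye gets half the refreshes.
display_refresh_hz = 120
per_eye_refresh_hz = display_refresh_hz // 2      # 60 Hz per eye

# Cinema: 24 fps film, each frame flashed three times by the shutter.
film_fps = 24
flashes_per_eye_hz = film_fps * 3                 # 72 Hz per eye, still only 24 unique fps

# Game rendering: geometry is drawn twice (once per eye), so as a first
# approximation the 2D frame rate is roughly halved.
fps_2d = 100
fps_s3d_estimate = fps_2d / 2                     # ~50 fps average, per the post above

print(per_eye_refresh_hz, flashes_per_eye_hz, fps_s3d_estimate)   # 60 72 50.0
[/code]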
Some people have kind of overthought the whole scenario.
It seems they want something for the 3D enthusiast dedicated to stereo 3D graphics, to sit alongside or be built into their GPU. (SLI or a dual-GPU card covers this at an added cost in the real world. It isn't dedicated S3D, but it provides the power to do it.)
It seems they want this so they don't have to build such a beefy PC. The thing is, it seems to me the very thing they are imagining would in fact make their PC beefy and expensive anyway.
All technical stuff aside, this is just a random thought. You have a computer with a GTX 285 in it. You want to add something to the computer that is dedicated to S3D (and I know it doesn't work this way, but let's pretend it did)... well, whatever this magical "S3DPU" is, you would probably have to drop $300.00-$400.00 on it. Basically the cost of a second video card.
In the end it comes out the same. Lots of $$ dropped into the PC.
I have no idea, but it is difficult to believe that the only saving in the second rendering pass is reusing the same textures. In some games the framerate loss in 3D is well below 50%, which seems to indicate there are other calculations that don't need to be redone... or maybe it only means the second pass mostly hits the GPU, not the CPU-side calculations.
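A simple frame-time model shows why the loss is often smaller than 50% even if the GPU really does draw everything twice: whatever part of the frame is CPU work (game logic, physics, draw-call submission) is only paid once. The split below is an invented example (and it treats CPU and GPU work as serial to keep it simple), not profiling data from any real game.

[code]
def stereo_fps(fps_2d, gpu_share):
    """Estimate the S3D frame rate if only the GPU share of the frame doubles.

    gpu_share: fraction of the 2D frame time spent on the GPU (0..1).
    """
    frame_2d = 1.0 / fps_2d
    cpu_time = frame_2d * (1.0 - gpu_share)      # paid once per stereo frame
    gpu_time = frame_2d * gpu_share * 2.0        # geometry rendered twice
    return 1.0 / (cpu_time + gpu_time)

for share in (1.0, 0.8, 0.5):
    print(f"GPU share {share:.0%}: 60 fps in 2D -> {stereo_fps(60, share):.1f} fps in S3D")

# GPU share 100%: 60 fps in 2D -> 30.0 fps in S3D   (the full 50% hit)
# GPU share 80%:  60 fps in 2D -> 33.3 fps in S3D
# GPU share 50%:  60 fps in 2D -> 40.0 fps in S3D   (CPU-limited games lose less)
[/code]

So the more a game is CPU-limited, the smaller the apparent 3D penalty, even though the GPU-side work still roughly doubles.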
what if dog was really spelled "c-a-t"
jk
anyway, some very good info here that I did not know about
I suppose it would be a bit like having a PhysX card.