3D-Vision future videocards
I only want to say that Nvidia should think about making future video cards with a special 3D chip or something, to make it possible to get the same benchmark results with or without 3D Vision activated. Right now we play 3D games only with the brute force of a high-end video card, losing a lot of frames per second, and developers design games for these powerful video cards with 2D in mind, so we never have enough power to play many recent games, let alone upcoming ones.

It may sound like a stupid thing, but to play 3D games at good framerates we need to play old games or buy a NASA computer, because games are always designed to be played on top-end recent computers.

Nvidia must do something, because the solution is not to buy the best video card and then find that it is not powerful enough for some games, and hopeless for upcoming ones. The solution is not to reduce detail settings either, because games are made to be played with those visuals, and most look terrible at low detail. Without a decent framerate the game is not playable, and then there is no reason to play in 3D.

- Windows 7 64bits (SSD OCZ-Vertez2 128Gb)
- "ASUS P6X58D-E" motherboard
- "MSI GTX 660 TI"
- "Intel Xeon X5670" @4000MHz CPU (20.0[12-25]x200MHz)
- RAM 16 Gb DDR3 1600
- "Dell S2716DG" monitor (2560x1440 @144Hz)
- "Corsair Carbide 600C" case
- Labrador dog (cinnamon edition)

#1
Posted 02/25/2010 12:23 PM   
What you ask is out of the question.
Stereoscopic 3D works exactly the same way as normal monoscopic "2D" gaming, except that instead of rendering one picture it renders two, and framerate performance follows this principle perfectly (unless the game is CPU-limited in 2D mode, in which case relative stereo performance is higher).
Having a GPU made specifically to deliver the same framerate in 2D and stereoscopic 3D would mean that half the GPU does absolutely nothing when rendering in monoscopic mode. 2D performance would be catastrophic and put Nvidia to shame for people who do not play in stereo 3D, and actual stereo 3D performance would not improve much. It's a lose-lose scenario.
It won't happen.

If you want your games to run perfectly in stereo 3D even on an old single GPU, the only people you can complain to are the game developers.
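The framerate arithmetic above can be sketched with a toy cost model. The split into a per-view GPU time and a per-frame CPU time is an illustrative assumption, not measured driver behaviour:

```python
# Toy cost model for stereo 3D framerates (assumption: GPU work doubles
# per frame because two views are rendered, CPU work per frame stays the same).

def mono_fps(gpu_ms_per_view: float, cpu_ms_per_frame: float) -> float:
    """2D framerate: the slower of GPU and CPU sets the pace."""
    return 1000.0 / max(gpu_ms_per_view, cpu_ms_per_frame)

def stereo_fps(gpu_ms_per_view: float, cpu_ms_per_frame: float) -> float:
    """Stereo framerate: the GPU now renders two views per frame."""
    return 1000.0 / max(2 * gpu_ms_per_view, cpu_ms_per_frame)

# GPU-bound game: the framerate halves exactly.
print(mono_fps(8, 4), stereo_fps(8, 4))    # 125.0 62.5
# CPU-bound game: no drop at all, as noted above.
print(mono_fps(4, 12), stereo_fps(4, 12))
```

In the CPU-bound case both calls return the same value, which is exactly the "performance is higher" caveat: the GPU had idle headroom that the second view can use for free.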

Passive 3D forever
110" DIY dual-projection system
2x Epson EH-TW3500 (1080p) + Linear Polarizers (SPAR)
XtremScreen Daylight 2.0
VNS Geobox501 signal converter

#2
Posted 02/25/2010 01:01 PM   
I want active shutter contact lenses. ahhhhh LoL :sorcerer:
In 2D gaming, to get better frame rates with lower-end cards you have to decrease some of the graphics options. The same thing goes for S3D gaming; you may just have to decrease more.

With 120Hz monitors, S3D will not go above 60 frames per second per eye, no matter what graphics card is being used. Not until we have 240Hz LCD monitors can and should we expect to see frame rates closer to 120 frames per second per eye.

The current hardware works fine in terms of performance for most of us, I would think (visually may be another issue).
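For active-shutter glasses the per-eye cap follows directly from the display alternating eyes; a one-liner makes the arithmetic explicit:

```python
# Active-shutter 3D alternates left/right frames, so each eye can see
# at most half the panel's refresh rate, whatever the GPU delivers.

def per_eye_hz(refresh_hz: float) -> float:
    return refresh_hz / 2.0

print(per_eye_hz(120))  # 60.0  -- today's 120 Hz panels
print(per_eye_hz(240))  # 120.0 -- what a 240 Hz panel would allow
```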

Intel Core i9-9820x @ 3.30GHZ
32 gig Ram
2 EVGA RTX 2080 ti Gaming
3 X ASUS ROG SWIFT 27 144Hz G-SYNC Gaming 3D Monitor [PG278Q]
1 X ASUS VG278HE
Nvidia 3Dvision
Oculus Rift
HTC VIVE
Windows 10

#4
Posted 02/25/2010 03:01 PM   
I actually agree with having a 3D card. I understand why you might feel it isn't practical, but with the trend definitely going towards 3D, we are going to need it. It could always be offered as an option. I am not asking for a full line of cards, but at least one to appease us 3D-hungry players. As for FPS, why do we need 120 fps? All we should ever need is 60 fps at the most; things are smooth enough at that rate. I would rather they focus on better-detailed graphics and processing than on giving me 200 fps.

just my 2 cents

(flame shield is up)

#5
Posted 02/25/2010 04:09 PM   
I guess it's because I'm running 2 GTX 285s, but I get between 40 and 60 fps per eye in most of the games I play with most of the eye candy on (anything I have off doesn't make a visual difference with or without it). I only get between 20 and 25 when playing Avatar, but that doesn't seem too choppy to me compared to other games I've seen at 20 fps.

And even though I have 2 GTX 285s in SLI, it appears a lot of the games I play, especially on DX10, don't use SLI when playing in 3D Vision. I actually thought the SLI indicator just didn't show in 3D Vision, but I noticed it in Far Cry 2 on DX9 and not on DX10.

Any graphics card that can keep 120 fps in 2D should be good enough to give me 60 fps per eye in 3D.

Intel Core i9-9820x @ 3.30GHZ
32 gig Ram
2 EVGA RTX 2080 ti Gaming
3 X ASUS ROG SWIFT 27 144Hz G-SYNC Gaming 3D Monitor [PG278Q]
1 X ASUS VG278HE
Nvidia 3Dvision
Oculus Rift
HTC VIVE
Windows 10

#6
Posted 02/25/2010 04:36 PM   
Some people think that 2 GTX 285s are something like a NASA computer :)

Of course it is impossible to force Nvidia to cut 2D power just to make 3D run at the same speed, but maybe there is some solution, a "chip" or something to render the second picture specifically at low cost, I don't know. It is as if the CPU processed all the sounds, and then one day somebody had the idea of making a sound card to handle that work specifically.

Of course, another solution is to get developers to adjust games' hardware requirements for 3D users, but that is still a utopia.

The real point is that it is impossible to play a lot of games decently in 3D with a "high-end" PC (not a NASA PC) :)

- Windows 7 64bits (SSD OCZ-Vertez2 128Gb)
- "ASUS P6X58D-E" motherboard
- "MSI GTX 660 TI"
- "Intel Xeon X5670" @4000MHz CPU (20.0[12-25]x200MHz)
- RAM 16 Gb DDR3 1600
- "Dell S2716DG" monitor (2560x1440 @144Hz)
- "Corsair Carbide 600C" case
- Labrador dog (cinnamon edition)

#7
Posted 02/25/2010 05:12 PM   
[quote name='b4thman' post='1007915' date='Feb 25 2010, 12:12 PM']Some people think that 2 GTX 285s are something like a NASA computer :)

Of course it is impossible to force Nvidia to cut 2D power just to make 3D run at the same speed, but maybe there is some solution, a "chip" or something to render the second picture specifically at low cost, I don't know. It is as if the CPU processed all the sounds, and then one day somebody had the idea of making a sound card to handle that work specifically.

Of course, another solution is to get developers to adjust games' hardware requirements for 3D users, but that is still a utopia.

The real point is that it is impossible to play a lot of games decently in 3D with a "high-end" PC (not a NASA PC) :)[/quote]

There is a reason why you have different levels of graphics cards. What would be the point of me buying a premium 3D graphics card (which these are) if a mainstream 3D graphics card could do the same thing? How would the manufacturer make money?

Now, you are referring to an S3D chip or a specific S3D graphics card that can give you 60 frames per second per eye with all the eye candy on, but you want this on a mainstream card? From what I understand about S3D, all it's doing is creating two frames of the same point in time from two slightly different positions, so any card that can do 120 frames per second with all the eye candy on in basic 2D should be able to give you your 60 fps per eye in S3D, give or take a few frames for incidentals.

The question I have is where the two images are being created: in software, or actually on the graphics card? Is the program passing one image to the graphics card, which then calculates the two differing images and sends them to the screen, or is the driver passing the two distinct images, left eye and right eye, to the graphics card, which just renders them? I wish I knew the answer to this.

Intel Core i9-9820x @ 3.30GHZ
32 gig Ram
2 EVGA RTX 2080 ti Gaming
3 X ASUS ROG SWIFT 27 144Hz G-SYNC Gaming 3D Monitor [PG278Q]
1 X ASUS VG278HE
Nvidia 3Dvision
Oculus Rift
HTC VIVE
Windows 10

#8
Posted 02/25/2010 05:28 PM   
Well, there is no such thing as a chip that makes a quick second picture in no time. You may save a bit of CPU power and memory bandwidth if the game developer and the driver are well optimized, but whatever happens, the picture is different: it must be re-calculated almost completely.

You either do it properly, in which case the second picture costs almost exactly as much as rendering the first picture.
Or you do it like a dirty pig (the "virtual 3D" mode in the DDD driver), which calculates only one picture and then uses the Z-buffer to derive the left and right eye views without recomputing the entire scene. It runs almost as fast as 2D, but you get massive artefacts if you try to push the separation even a little, which is clearly not the way to go if you want great depth and pop-out.
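The "dirty" Z-buffer approach can be sketched roughly as follows. The function name and the disparity formula are illustrative assumptions, not the DDD driver's actual code:

```python
import numpy as np

# Rough sketch of Z-buffer reprojection ("virtual 3D"): instead of
# re-rendering the scene for the second eye, shift each rendered pixel
# horizontally by a disparity derived from its depth. Pixels left
# uncovered (disocclusions) have no data -- the artefacts described above.

def reproject_eye(color, depth, separation, convergence):
    """color, depth: (h, w) arrays; returns the shifted view and a hole mask."""
    h, w = depth.shape
    out = np.zeros_like(color)
    filled = np.zeros((h, w), dtype=bool)
    # Illustrative disparity: zero at the convergence plane, growing with depth.
    disparity = (separation * (1.0 - convergence / np.maximum(depth, 1e-6))).astype(int)
    for y in range(h):
        for x in range(w):
            nx = x + disparity[y, x]
            if 0 <= nx < w:
                out[y, nx] = color[y, x]
                filled[y, nx] = True
    return out, ~filled  # holes: True where no source pixel landed
```

One cheap pass per eye instead of a full re-render is why this runs nearly as fast as 2D; the hole mask shows exactly where the artefacts appear once separation grows.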

Passive 3D forever
110" DIY dual-projection system
2x Epson EH-TW3500 (1080p) + Linear Polarizers (SPAR)
XtremScreen Daylight 2.0
VNS Geobox501 signal converter

#9
Posted 02/25/2010 05:43 PM   
Maybe it's because I have a GTX 295, but I have yet to notice any drop in performance between 3D Vision and non-3D Vision. GTA4, Rise of Flight, Black Shark, rFactor: these all run as well for me with 3D as without.

#10
Posted 02/25/2010 05:46 PM   
I guess my question is this.

Does the driver send a pair of 3D images (in serial, not parallel) to the GPU, which renders them one after the other? Or does the software send one 3D image to the GPU, which then in turn does the recalculation to create the two distinct images (left and right eyes), and then renders them in serial?

The first option is less hardware-intensive but more software-intensive. I have no idea how this is done.

The program (game) in this case is only outputting one 3D image for any given time; it's up to either the driver or the GPU to create and provide the two differing images.

Come to think of it, James Cameron's Avatar may be different; it may actually be producing the left and right images itself.

Intel Core i9-9820x @ 3.30GHZ
32 gig Ram
2 EVGA RTX 2080 ti Gaming
3 X ASUS ROG SWIFT 27 144Hz G-SYNC Gaming 3D Monitor [PG278Q]
1 X ASUS VG278HE
Nvidia 3Dvision
Oculus Rift
HTC VIVE
Windows 10

#11
Posted 02/25/2010 06:43 PM   
[quote name='whodamanxbox' post='1007870' date='Feb 25 2010, 10:09 AM']As far as FPS, why do we need 120 fps? All we should ever need is 60 fps at the most. Things are smooth enough at that rate. I would rather them focus on better detailed graphics and processing than giving me 200fps[/quote]

I will have to strongly disagree with you. As a competitive Counter-Strike player, I can tell you without a doubt that 60 fps is flat out not enough for competitive gaming. While you may not be able to tell that anything is moving more smoothly above 60 fps, there are many of us who can. Granted, if your monitor is 60Hz you won't notice anything over 60, because you are limited by the display's refresh rate. On a 120Hz monitor you can't benefit visually from a framerate over 120 fps.

The entire reason I bought my 120Hz AW2310 is the increased refresh rate, NOT 3D. I got the 3D kit after the fact just to check it out. While I can't look at a screen and tell you what the refresh rate is, I can see a very noticeable difference between 60 and 85, as well as between 85 and 120. I can even look at my 120Hz monitor and see that the animations, while as smooth as I have ever seen on any LCD, are not perfectly smooth.

Here is a test. If you move your mouse pointer across the screen really quickly and try to focus on it, do you or do you not see space between each "draw" of the pointer (I am not talking about ghosting)? If you don't see this, then you won't likely see a benefit from a higher refresh rate or frame rate. I see this even on my 120Hz monitor at 120+ fps.
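The gap in that pointer test is easy to quantify: the display draws one cursor position per refresh, so the spacing is just speed over refresh rate. The sweep speed below is only an example figure:

```python
# Distance between successive "draws" of a cursor moving at constant speed:
# one position is shown per refresh, so the gap is speed / refresh rate.

def draw_gap_px(speed_px_per_s: float, refresh_hz: float) -> float:
    return speed_px_per_s / refresh_hz

# A quick flick across a screen, roughly 2400 px/s:
for hz in (60, 85, 120):
    print(f"{hz} Hz -> {draw_gap_px(2400, hz):.1f} px between draws")
```

Even at 120 Hz the draws are 20 px apart at that speed, which is why the discrete steps stay visible on a 120Hz panel.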

#12
Posted 02/25/2010 07:24 PM   
[quote name='MistaP' post='1007987' date='Feb 25 2010, 07:24 PM']I will have to strongly disagree with you. As a competitive Counter-Strike player, I can tell you without a doubt that 60 fps is flat out not enough for competitive gaming. While you may not be able to tell that anything is moving more smoothly above 60 fps, there are many of us who can. Granted, if your monitor is 60Hz you won't notice anything over 60, because you are limited by the display's refresh rate. On a 120Hz monitor you can't benefit visually from a framerate over 120 fps.

The entire reason I bought my 120Hz AW2310 is the increased refresh rate, NOT 3D. I got the 3D kit after the fact just to check it out. While I can't look at a screen and tell you what the refresh rate is, I can see a very noticeable difference between 60 and 85, as well as between 85 and 120. I can even look at my 120Hz monitor and see that the animations, while as smooth as I have ever seen on any LCD, are not perfectly smooth.

Here is a test. If you move your mouse pointer across the screen really quickly and try to focus on it, do you or do you not see space between each "draw" of the pointer (I am not talking about ghosting)? If you don't see this, then you won't likely see a benefit from a higher refresh rate or frame rate. I see this even on my 120Hz monitor at 120+ fps.[/quote]

I agree with you on this, but I thought the limit for a healthy human eye is 90 fps.

Lol, I've got mates who think L4D 1+2 run smooth on the Xbox (30 fps). I can't stand it!!!!

#13
Posted 02/25/2010 07:28 PM   
[quote name='Adz 3000' post='1007989' date='Feb 25 2010, 01:28 PM']I agree with you on this, but I thought the limit for a healthy human eye is 90 fps.

Lol, I've got mates who think L4D 1+2 run smooth on the Xbox (30 fps). I can't stand it!!!![/quote]
That is exactly my point. Why waste power on frames the human eye can't see? Are you REALLY getting that many more kills because of it? If that were true, all online games should have a locked framerate.

#14
Posted 02/25/2010 07:41 PM   