GeForce 3D Vision QA Response Latest: 1/18/2010
Hello,

Can someone please tell me where the IR emitter should be placed? Does it have to be close to the monitor, or does it not matter?
The thing is, I use a projector and project the picture onto the ceiling. Can I use NVIDIA 3D Vision like that?

Posted 11/26/2011 05:27 PM   
There should be a clear line of sight between the glasses (i.e. your face) and the emitter. The distance shouldn't be too big either.

You don't have to look at the emitter, but the emitter has to "look" at you, so to speak, i.e. its angle should face towards your head.

I wouldn't try projecting onto the ceiling myself, though in theory that could work too (the emitter uses IR light, which is reflected off the ceiling as well), but you might get some problems because of the distance.

Posted 11/27/2011 04:52 PM   
Grestorn, thank you very much. That was the answer I was waiting for. It is so comfortable watching movies lying on the bed ^_^
I was not sure whether the IR would also work if the emitter is not placed in front of the glasses. Mine would have to be placed below the glasses, close to my feet but facing my head, as you said. I hope it works without problems.
One last question: does the distance between the emitter and the glasses have to be the same as between the ceiling and the glasses? Or does it not matter?

Posted 11/28/2011 06:40 AM   
The distance doesn't have to be the same. The speed of light is high enough that you certainly won't notice the difference ... :)

If the emitter is placed very low, make sure that your face or the bed sheets aren't actually obstructing the line of sight between the emitter and the receiving sensor in the glasses. The sensor is located on the left side of the left lens.

I just checked whether there's a problem when holding the glasses at various angles relative to the imaginary line between the emitter and the glasses' sensor. It worked just fine, even at extreme angles (up to 90°).

Posted 11/28/2011 09:48 AM   
Thank you very much, I should be fine then. You have been of great help.

Posted 11/28/2011 05:32 PM   
[i]We can't commit to a support schedule yet for OpenGL 3D Vision. We are working on it and many other new features. I would like to ask the community, what are the applications that are driving the demand for this feature?[/i]

Most smaller companies that work with CAD and other OpenGL applications still use gaming graphics cards because they cannot afford Quadro cards (which support OpenGL 3D Vision via quad-buffered stereo). If you add quad buffer support to standard 3D Vision, then everyone can start working with 3D! I think that's a very reasonable demand from all 3D companies, except perhaps for Nvidia, because you would sell fewer of the expensive cards =) But instead you would sell more gaming cards, because companies would choose Nvidia over AMD.
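
For reference, here is a minimal sketch of what the quad-buffered stereo path looks like on the OpenGL application side. This is my own illustration, not NVIDIA code: drawScene() and eyeSeparation are hypothetical placeholders, and it assumes a stereo-capable context was requested (e.g. PFD_STEREO on Windows or GLX_STEREO on X11).
[code]
#include <GL/gl.h>

// Hypothetical helper: renders the scene with the camera shifted horizontally.
void drawScene(float eyeOffsetX);

void renderStereoFrame(float eyeSeparation)
{
    // Left eye image goes into the left back buffer.
    glDrawBuffer(GL_BACK_LEFT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawScene(-eyeSeparation * 0.5f);

    // Right eye image goes into the right back buffer.
    glDrawBuffer(GL_BACK_RIGHT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawScene(+eyeSeparation * 0.5f);

    // SwapBuffers()/glXSwapBuffers() then presents both buffers, and the driver
    // keeps the alternation in sync with the shutter glasses.
}
[/code]
As I understand it, this is exactly the part consumer drivers don't expose: without a stereo-capable pixel format, GL_BACK_RIGHT is simply not available to the application.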

Posted 12/15/2011 03:30 PM   
Is there any chance that Optimus will support 3D Vision as well?

Posted 12/20/2011 02:44 AM   
I want to know for sure whether 3D Vision uses more video memory than normal mode. When I'm playing Battlefield 3 at 1080p on Ultra settings without 3D Vision, I get about 120 frames per second on two GTX 580s in SLI. When I switch to 3D Vision, I only get 30-40 frames per second. My VRAM is 1.5 GB. If I upgraded to 3 GB cards, would that bring me back to 60 fps in 3D Vision, or is this a driver issue?

Posted 01/13/2012 09:33 AM   
I am a noob at BF3, and with my single GTX 580 I get nowhere near 120 fps at 1080p on Ultra settings, as expected.

I wouldn't expect to get consistent measurements in multiplayer, so I benchmarked a section early in the single-player campaign, right when you step outside the military vehicle for the first time, with a large vista and troops surrounding you. In 3D mode it was pretty much 20 fps while I stood still and my comrades moved further into the distance. I turned 3D off and the framerate became 43. I'm not completely certain whether vsync is applied in 2D mode the way it is in 3D mode. What surprised me is that on what I assume is a fairly basic mission, memory usage on the graphics card went up to 1500 MB on my 1.5 GB card. These settings are clearly at the limit of the card while playing in 3D.

As soon as I switched off 3D, the memory usage dropped to 1330 MB, which is 170 MB less, so any memory-related problems in 3D would not affect 2D performance. The maximum framerate in 3D mode is 60 fps, and with your system, if you can push around 120 fps in 2D, you would be jumping between 40 fps and 60 fps because of vsync, mostly staying at 60 fps if there were no bottleneck.
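
To make that 40/60 stepping concrete, here is a small sketch of my own (assuming a 120 Hz 3D Vision refresh) of how vsync quantizes the effective framerate: a finished frame can only be shown on a refresh boundary, so the displayed rate snaps to refresh_rate / ceil(refresh_rate / raw_fps).
[code]
#include <cmath>
#include <cstdio>

// Effective framerate with vsync: each frame occupies a whole number of
// refresh intervals, so the displayed rate is refreshHz divided by that count.
double vsyncFps(double refreshHz, double rawFps)
{
    double intervalsPerFrame = std::ceil(refreshHz / rawFps);
    return refreshHz / intervalsPerFrame;
}

int main()
{
    std::printf("%.0f fps\n", vsyncFps(120.0, 70.0)); // 60 fps: the frame is ready within 2 refreshes
    std::printf("%.0f fps\n", vsyncFps(120.0, 50.0)); // 40 fps: the frame needs 3 refreshes -- the 60/40 jumping described above
    return 0;
}
[/code]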

Removing antialiasing cuts the memory footprint by almost 200 MB, but comparing performance with and without AA is not a very good reference.
By replacing your cards with 3 GB ones, you will likely get rid of the bottleneck that is reducing your framerate more than expected.

Personally, in my situation I would remove or reduce AA, as I find the artifacts less disturbing in 3D than in 2D.

Edit: File removed

Thanks to everybody using my assembler it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?

donations: ulfjalmbrant@hotmail.com

Posted 01/13/2012 04:44 PM   
[quote name='Amorphous' date='19 January 2010 - 12:21 PM' timestamp='1263864081' post='983748']
[b]2. Will we see OpenGL 3D Vision support soon?[/b]

Great question. We can't commit to a support schedule yet for OpenGL 3D Vision. We are working on it and many other new features. I would like to ask the community, what are the applications that are driving the demand for this feature?
[/quote]

Hello, NVIDIA. Can you please develop the OpenGL support already?

I posted [url="http://forums.nvidia.com/index.php?showtopic=231268"]my full rant in a new topic[/url].

OpenGL is the portable, open standard for 3D graphics.

DirectX/Direct3D is a proprietary Microsoft API for games on Windows.

It's no good to neglect OpenGL and support only DirectX.

[url="http://blog.wolfire.com/2010/01/Why-you-should-use-OpenGL-and-not-DirectX"]Why you should use OpenGL and not DirectX[/url] - from indie game developer David Rosen, founder of the humble indie bundle.

Posted 06/08/2012 08:23 AM   