[Join The Revolt] Let's make 3D Vision available on 'Optimus' laptops
UPDATE: This thread was meant to raise awareness of the problem, so that more people would understand its nature and maybe start banging on NVIDIA's doors in large numbers. But, reading on, you will see that NVIDIA has become a sort of 'goddess' in this section of the forum, and some people have sworn their lives to protect her. Instead of fighting each other at the start line, we could join forces and influence NVIDIA to make small adjustments to their software.

I encourage you to stay open-minded and read on. We cannot stay silent; there are very important features missing which are essential and easy to implement, yet NVIDIA ignores every single request. If you are equipped with a 4K screen, for instance, or plan on buying one, you might also want to look at this, which is another example of NVIDIA's ignorance:
https://forums.geforce.com/default/topic/844905/-feature-request-nonblurry-upscaling-at-integer-ratios/



Hello,

I would like to request that this technology be made available on laptops which do not have direct HDMI connectivity from the NVIDIA GPU. I do not have access to statistics, but several of my friends, including me, are in this situation.

These are not low-end laptops; these are top-of-the-line laptops made by ASUS and MSI, powerful enough to run all the games we desire.

As a computer scientist, I do not see any reason why this technology shouldn't or couldn't be made available to us. I am not into hardware; I'm rather into security, reverse engineering, crypto, etc. BUT: to me it is clear that this is a purely software problem. I can see obstacles, such as a more difficult estimation of delays, but these could be easily overcome by adding another level of 'profiling' - or please create a wizard with which we could tweak the delay to our liking. I guess you already create such profiles for 3D Vision-enabled devices.

Please do not ignore this request, just like you seem to be ignoring the request to add nearest-neighbor interpolation to the driver, which is a basic need for anyone playing at resolutions lower than native 4K.

All the laptops I've tested support gaming at 120 Hz (tested with Rise of the Tomb Raider). So there is no reason not to support 3D Vision in our configurations.

I'm not asking for 3D Vision to be made available on DLP-Link projectors without buying the 3D Vision kit (though I think that would be great). Please just meet our expectations in the middle.

#1
Posted 03/23/2017 10:05 PM   
This is a user-to-user forum.

I don't suppose you filed a feature request?

Or, talked to chat?

Or, opened a support ticket?

#2
Posted 03/23/2017 10:10 PM   
You see, people have been at this sort of thing for a long time on this forum: [url]https://forums.geforce.com/default/topic/844905/geforce-drivers/-feature-request-nonblurry-upscaling-at-integer-ratios/23/[/url]

#3
Posted 03/23/2017 10:18 PM   
Besides, I want more people to know the rationale behind not supporting 3D Vision on our laptops. Together, maybe we can achieve more.

#4
Posted 03/23/2017 10:19 PM   
As stated, this is a user-to-user forum.

You may as well go to the closest fountain and toss a penny in, for a better chance.

Or you can open some type of "direct" dialogue with Nvidia, so that they can tell you directly: "it will not work", "it can work, but we have no interest in implementing the feature", or "sure, we'll gladly get right on that and have it ready next week".

VR users have this exact same issue and it is one of the reasons that Nvidia has their "VR Ready" label now.

#5
Posted 03/23/2017 11:01 PM   
Like D-Man11 said, if you want to tell/ask NVIDIA something, contact them directly. They are not going to read this thread.

And while I don't want to discourage you from asking them to support a feature, realistically they aren't going to do anything. 3D Vision is not where they are making their money.

There are also technical limitations to what you're asking for. You're going to need an external IR emitter, the display needs to have a fast response time, and the backlight needs to support strobing for LightBoost.

#6
Posted 03/23/2017 11:18 PM   
D-Man, AyyMD, thank you for your responses. I won't bother contacting NVIDIA through their direct communication lines, etc., as we all know very well where that would land. I just wanted to increase your awareness of their ignorance.

AyyMD, as for the limitations you've mentioned:

There are three things one needs to make frame-sequential 3D work:

1) A video stream at a high enough frequency - say 120 Hz, so that each eye gets around 60 Hz.

2) A signal to sync the glasses' shuttering with what is being displayed - say an IR diode, as in 3D Vision. (A diode like this is already built into my DLP-Link projector for that very purpose; no funny 3D Vision dongle required. That dongle actually makes things worse: it imposes a delay.)

3) The already-mentioned glasses. Such glasses were provided with my DLP-Link projector. So if NVIDIA were not so much after money, I wouldn't need to buy the 3D Vision kit at all just to plug its dongle into my USB port to enable the 3D functionality in the driver. IN THE DRIVER - and a driver is SOFTWARE.

A friend of mine also has a 3D Vision kit, a DLP-Link projector, and a desktop PC. He just plugs in the dongle and throws a blanket over it so it doesn't interfere with the DLP-Link glasses. And why? Because DLP-Link technology is superior to the 3D Vision kit. But he still has to plug that thing in for the driver to agree to produce the image.

Now, there is nothing stopping NVIDIA from making this work through the iGPU.

No LightBoost needed.

Please do not spread misconceptions.
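[Editorial note: the timing described in the three points above can be sketched in a few lines. This is an illustration only - the numbers (a 120 Hz display alternating eyes each refresh) come from the post above, and the function names are invented for the sketch, not anything in NVIDIA's driver.]

```python
# Illustrative timing for frame-sequential 3D on a 120 Hz display.
# The driver and emitter do this in hardware; this only shows the arithmetic.

DISPLAY_HZ = 120
FRAME_MS = 1000 / DISPLAY_HZ        # ~8.33 ms between displayed frames

def eye_for_frame(n: int) -> str:
    """Frames alternate eyes: even frames left, odd frames right."""
    return "left" if n % 2 == 0 else "right"

# Each eye only sees every other frame, so its effective refresh is halved.
per_eye_hz = DISPLAY_HZ / 2         # 60.0 Hz per eye

# The sync signal (IR emitter or DLP-Link flash) tells the glasses which
# shutter to open at each refresh; a fixed delay offset is exactly what the
# calibration wizard proposed above would tune.
schedule = [(round(n * FRAME_MS, 2), eye_for_frame(n)) for n in range(4)]
print(per_eye_hz)   # 60.0
print(schedule)     # [(0.0, 'left'), (8.33, 'right'), (16.67, 'left'), (25.0, 'right')]
```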

#7
Posted 03/25/2017 11:55 PM   
vega4 said: D-Man, AyyMD, thank you for your responses. I won't bother contacting NVIDIA through their direct communication lines, etc., as we all know very well where that would land. I just wanted to increase your awareness of their ignorance.

AyyMD, as for the limitations you've mentioned:

There are three things one needs to make frame-sequential 3D work:

1) A video stream at a high enough frequency - say 120 Hz, so that each eye gets around 60 Hz.

2) A signal to sync the glasses' shuttering with what is being displayed - say an IR diode, as in 3D Vision. (A diode like this is already built into my DLP-Link projector for that very purpose; no funny 3D Vision dongle required. That dongle actually makes things worse: it imposes a delay.)

3) The already-mentioned glasses. Such glasses were provided with my DLP-Link projector. So if NVIDIA were not so much after money, I wouldn't need to buy the 3D Vision kit at all just to plug its dongle into my USB port to enable the 3D functionality in the driver. IN THE DRIVER - and a driver is SOFTWARE.

A friend of mine also has a 3D Vision kit, a DLP-Link projector, and a desktop PC. He just plugs in the dongle and throws a blanket over it so it doesn't interfere with the DLP-Link glasses. And why? Because DLP-Link technology is superior to the 3D Vision kit. But he still has to plug that thing in for the driver to agree to produce the image.

Now, there is nothing stopping NVIDIA from making this work through the iGPU.

No LightBoost needed.

Please do not spread misconceptions.



Hmm... a blanket... ROFL... While that "hack" might work, and you have a very VALID point there, if you read up a bit on how the hardware works you will see WHY the limitation exists...

Consumers these days... want everything to JUST "work out of the box"... 3D Vision WAS NEVER ADVERTISED for Optimus laptops, just SPECIALLY BUILT ONES!

While I agree with you, it will never happen in this iteration!

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

#8
Posted 03/26/2017 12:22 AM   
helifax said:if you read and understand a bit better how hardware works, you will see WHY the limitation...

As I've mentioned, I am not into hardware MUCH, although I am into certain fields of Computer Science *A LOT*, as a researcher.

The concepts and technology used inside 3D Vision are very simple - at least the hardware part. Rendering the scenes and converting certain effects, I bet, are much more demanding. As for the HARDWARE part, I believe there is nothing to understand beyond what I've already mentioned.

Please forgive me for being so direct, but if you proclaim such statements with a specific reason in mind, please enlighten me - or us, around here - so we can understand better. Please be specific. I see no point in creating a mysterious aura around the subject.

The only reason I can think of is the delay imposed by additional processing inside the iGPU's memory.

helifax said:
Consumers these days... Want everything to JUST "work out the box"... 3D Vision WAS NEVER ADVERTISED for OPTIMUS laptops, just SPECIALLY BUILT ONES!


I think people would be very pleased even if this did not work out of the box. They would be more than happy to go through a very simple and short calibration wizard, rather than not have it AVAILABLE AT ALL for some silly reason.

So what if 3D Vision was never advertised for Optimus laptops? The Optimus technology - a very basic scheme by which the dedicated GPU hands back to the one inside the CPU - is nothing sophisticated, complicated, or special. To me it is like saying that 3D Vision was not advertised for use in laptops made by HP. The funniest part is that the OPTIMUS branding was advertised by NVIDIA *ITSELF* - now thaaat is a ROTFL.

A customer shouldn't need to know certain very low-level architectural peculiarities of a given laptop. The Optimus thing isn't even advertised by laptop manufacturers anymore; it is a basic concept, used in almost every laptop, that the discrete GPU hands back to the iGPU. A laptop is a multi-purpose computer, after all, *AND* it meets all the *OFFICIAL* requirements for 3D Vision.

Hell, I do not even know if my laptop has what is called Optimus technology. All I know is that it has no direct HDMI output from the GeForce, which is the reason.

I mean, my whole setup meets these:
http://www.nvidia.com/object/3d-vision-system-requirements.html

It is good you see my point.

If you say this won't happen in this iteration, etc., you just seem to have accepted how things work between NVIDIA and its customers. Well, I have not accepted it yet :D Anyone else? ;p

Please also note the issue with nearest-neighbor scaling. We've been waiting for years already.

#9
Posted 03/28/2017 09:50 PM   
Well Hybrid SLI "never" worked well when it was available in 2008-2009 (years might be off, too lazy to confirm) but it was in desktops and then laptops that used motherboards with Nvidia's SouthBridge chips.

Lucid Hydra never worked worth a fuck either. DX12 has had limited success with it.

You can theorize all you want about Nvidia's GPU rendering to Intel's, but you need to look at the pudding and the proof it contains.

The solution would have been for your laptop manufacturer to route the HDMI output port off the Nvidia GPU, as has already been stated.

Your "failing" to contact support is on you, and I've no idea why you keep posting here in a user-to-user forum expecting some kind of result from the few who actively read this sub-forum.

We're all aware of the issues with Optimus; it has been posted multiple times before, and this thread is redundant.

#10
Posted 03/28/2017 10:19 PM   
D-Man11 said:Well Hybrid SLI "never" worked well when it was available in 2008-2009 (years might be off, too lazy to confirm) but it was in desktops and then laptops that used motherboards with Nvidia's SouthBridge chips.


Sure, I recall the days when SLI was born, many years ago. But I wonder why you mention SLI and Lucid Hydra at all. Are you an NVIDIA employee? :D

All we need is the ability to produce a 120 Hz image and sync it with that funny USB dongle. Well... I'll cover mine with a blanket, but that's just me. My laptop does 120 Hz very well :D To be honest, I watch 3D movies on this very laptop using DLP-Link technology, you see. The image is PERFECT :) And yeah, out of curiosity, I've even forced the GeForce to render the video inside VLC instead of the iGPU - that worked too.

D-Man11 said:
We're all aware of the issues with Optimus, it has been posted multiple times before and this thread is redundant.


Actually, I haven't found a single thread with a clear reason, explanation, or resolution for this problem.

D-Man11 said:
I've no idea why you keep posting here in a user to user forum

Well, I'll tell you. A single person has no chance of influencing a conglomerate such as NVIDIA to change its behavior toward its clients, especially if the client has no deep technical knowledge of the internals. Clients read smart-looking explanations posted by people like you or the other guy - you've now even brought up SLI technology, which has nothing to do with the problem at hand, it being a purely SOFTWARE one.

And so... oh well... I'm here to start a REVOLUTION xdd while also having some fun.

#11
Posted 03/28/2017 10:35 PM   
Rendering static images that are already drawn is nothing. I do not see how being able to play a video is in any way relevant.

And why is it that you have not contacted the laptop manufacturer, Intel or Nvidia and touted your CS knowledge to them?

#12
Posted 03/28/2017 10:49 PM   
D-Man11 said:Rendering static images that are already drawn is nothing. I do not see how being able to play a video is in any way relevant.


Well, tell you what - that is all there is to it, no magic :) Sadly.

A frame of video and a frame of a computer game are indistinguishable from the hardware's point of view. Once it has been rendered, there is NO difference. The GPU has to 'render' a video frame, or it has to render a frame produced by a game engine. Well, it doesn't even care.

The results are only sequences of bytes. Remember. Sequences of bytes. No magic behind it.



D-Man11 said:
And why is it that you have not contacted the laptop manufacturer, Intel or Nvidia and touted your CS knowledge to them?


As I've already stated, I'm here to start a REVOLUTION. Have you ever heard of a one-man revolt?

#13
Posted 03/28/2017 10:59 PM   
You seem awfully certain that you know exactly how it all works, and it's all easy.

I'm a lot less sure that everything is so easy. And I write code.


Here is a very quick Google find: Oculus users testing the latency involved in possibly using an Optimus-poisoned laptop for VR.

https://www.reddit.com/r/oculus/comments/30lcpo/my_testing_shows_nvidia_optimus_on_880m_adds/

The Oculus part is not interesting; their latency estimate is. It's on the order of 50 to 80 ms.

At 120 Hz for 3D Vision glasses, that works out to about 8.3 ms per frame. So a latency of 6 to 10 frames would seem to be a deal killer, since 3D Vision has the same problem as VR: synchronization is paramount. If the emitter and screen are out of sync, you get ghosting.
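[Editorial note: to make the arithmetic explicit - a rough sketch only, where the 50-80 ms figures come from the Reddit test linked above, not from any particular laptop:]

```python
# Convert the Optimus copy-path latency estimate into frames of desync
# at 3D Vision's 120 Hz refresh rate.

REFRESH_HZ = 120
frame_ms = 1000 / REFRESH_HZ                 # ~8.33 ms per frame

for latency_ms in (50, 80):
    frames = latency_ms / frame_ms
    print(f"{latency_ms} ms of latency is about {frames:.1f} frames behind")
# 50 ms of latency is about 6.0 frames behind
# 80 ms of latency is about 9.6 frames behind
```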


You might be able to get 3D TV Play to work, since then the sync would be with the actual frame being drawn. You'd still be several frames behind, but that would just add mouse lag, not visual lag.

You don't use 3D Vision glasses with 3D TV Play, you use the DLP glasses.

There is a free-trial, it costs you nothing to try.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#14
Posted 03/29/2017 06:26 AM   
vega4 said:D-Man; AyyMD, thank you for your responses, I won't bother myself contacting NVIDIA through their direct communication lines etc. as we all very well know where this would land. I just wanted to increase your awareness of their ignorance.


I don't think it has to do with ignorance; it's more an issue of where to put funds, time, and effort, and 3D Vision is no longer interesting to them (probably because they're not making any money off it).

It's sad, really. I think most people who have played in 3D on a large projector screen, or even just on a monitor, would agree it's pretty damn immersive - the best graphical effect that exists to enhance the immersiveness of a game. ...And some people think VR will be a big thing, lol. It'll surely die off too once developers realise very few people will pay up for it; there's only an initial interest (it was most likely the same for 3D Vision kits). It doesn't catch on no matter how great it is, because most people just aren't "hardcore" enough to invest - they stick with what they have because it works, it isn't a hassle, etc.

Anyway, if you want 3D Vision, buy a used DLP projector. You'll be amazed at playing games at 100" or larger, and in 3D. The latency is also good, so no ghosting. I got my BenQ for next to nothing (like $70), and new lamps can be bought cheaply on eBay.

Computer: i7 2600K @4.8GHz / Asus Sabertooth P67 Rev3 / 32GB Corsair Vengeance / GTX 980ti / 34" Samsung S34E790C
Projectors: BenQ W700 / BenQ MH741

#15
Posted 03/29/2017 09:26 AM   