[Join The Revolt] Let's make 3D Vision available on 'Optimus' laptops
[quote="bo3b"]I'm a lot less sure that everything is so easy. And I write code.[/quote]
Reminds me of the old days at university as an undergrad. There was a guy who wanted to do a PhD; I wasn't even thinking about doing mine AT THAT TIME. I overheard him saying to some other guy, "Well, you don't want to be a regular coder, right?"... I thought to myself: a regular coder... everyone wants to be a programmer, right? :P Oh man, I was soooooooo wrong.
Now I always teach my students: everything is easy, you see. Everything is simple; it's all only sequences of 0s and 1s. Think low-level, write high-level.
[quote="bo3b"]
The oculus part is not interesting, their latency estimate is. It's on the order of 50 to 80ms.
[/quote]
The latency problem in VR kits is of a different kind than in 3D Vision.
VR systems need much lower latency in order to trick the brain into thinking it's looking at a virtual world that completely surrounds the player wherever he or she looks. That is why latency in a VR display IS important. See?
Now let's get back to the 3D Vision kit. The problem here is of a completely different kind. [b][u]You are confusing the concepts.[/u][/b] Here I was concerned with latency because you need to synchronise the frames being displayed with the shuttering of the glasses, so that each frame reaches the correct eye, left or right. It has nothing to do with delays in VR. The lag between user input and what you see on the screen would be exactly the same as when playing on the laptop itself, and you CAN play on your laptop just fine, right?
When it comes to 3D Vision, the thing is that we would need to sync the frames being displayed with the glasses, and for that we would need to estimate the delays, counting from the moment the frame bytes leave the GPU to when they reach the screen. That is why they certify the screens and so on: so they can better estimate those delays. And I say: give us a simple wizard to tweak that single interval, and we suddenly get 3D Vision support for all devices in all setups.
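To make that "single tweakable interval" concrete, here is a minimal C++ sketch of the idea, under my own assumptions (the loop and the 4.5ms value are made up for illustration; this is not any real 3D Vision API). The only per-setup unknown is the delay between a frame leaving the GPU and it appearing on the panel; the shutter toggle just needs to be offset by that one value:
[code]
// Minimal sketch: time shutter toggles to when frames become visible on
// the panel. The single user-tweakable value is display_delay.
#include <chrono>
#include <iostream>
#include <thread>

using Clock = std::chrono::steady_clock;

int main() {
    const auto frame_period = std::chrono::microseconds(8333); // ~120 Hz
    // The one value a calibration wizard would let the user tweak:
    const auto display_delay = std::chrono::microseconds(4500); // assumed

    bool left_eye = true;
    auto next_flip = Clock::now();
    for (int frame = 0; frame < 10; ++frame) {
        next_flip += frame_period;
        // Fire the shutter toggle when the frame shows up on the panel,
        // not when it leaves the GPU.
        std::this_thread::sleep_until(next_flip + display_delay);
        std::cout << "frame " << frame << " -> "
                  << (left_eye ? "LEFT" : "RIGHT") << " eye open\n";
        left_eye = !left_eye;
    }
}
[/code]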
[quote="vurt72"]
I don't think it has to do with ignorance, it's more an issue of where to put in funds / time / effort, and for 3D Vision it's not interesting to them any longer (probably due to not making any money off it).
[/quote]
I couldn't agree more. But see, we are the customers. We need to think about what we want, not about their income. I don't like being kicked in the ass, do you?
[quote="vurt72"]
Anyways, if you want 3D Vision, buy a used DLP projector. You'll be amazed at playing games at 100" or larger, and in 3D. The latency is also good, so no ghosting. I got my BenQ for next to nothing (like $70 or so); new lamps can be bought on eBay really cheap.
[/quote]
Have you read my previous posts? That is exactly my setup ;] Well, I've got a more expensive DLP, but that's not important. I can't use it, though, due to NVIDIA not supporting indirect output of video frames.
Oh come on, guys... 3D stereoscopy using shutter glasses is a piece of cake, right? I've already explained the concepts. Let the revolt begin! On my signal... GO! ;d Spread the wisdom. Fight for your rights! Fight for your needs!
also:
[url]https://www.gizmodo.com.au/2017/03/zero-latency-2-0-new-levels-in-virtual-reality/[/url]
@bo3b: Oh come on, my glass is always half empty. I learn something new every day. Well, I need to; learning and writing are mostly what I do, at least for now :P
It is good that you wrote, but you were wrong about the latencies. Did you get my point?
My best wishes, bo3b, and thank you for your input in this conversation. Nice avatar.
[quote="vega4"]@bo3b: oh come on my glass is always half empty. I learn every day something now. well I need to. learning and writing is mostly what I do. at least now:P
it is good you wrote - You were wrong about the latencies. did you get my point?
My best wishes Bob thank you for your input in this conversation.nice avatar.[/quote]
Vega4, with this kind of attitude I don't understand how you can learn anything. You seem to be an arrogant, all-knowing prick who doesn't get his limitations. You weren't even capable of understanding bo3b's point.
Please forgive my attitude, but for f***'s sake: first, I think you deserve it; second, I'll underline my points once more:
1) THE LAG IMPOSED BY THE iGPU IS *CONSTANT*. I play at 120Hz on my DLP projector at 1280x720 without any noticeable lag. It would work THE SAME had NVIDIA enabled the 3D signal (well, there would be an FPS drop).
If you connect a couple of pipes together and get water rushing through from the same source, would that change the flow at the end?
2) The DLP-Link projector syncs frames by itself.
3) We could use a wizard to tweak the delay value for the 3D Vision kit's IR emitter, in case one had no opportunity to use DLP-Link instead.
Besides, in my case Direct Mode communication between the GeForce and the iGPU works 100% fine; there is no need for the operating system to do any heavy lifting, and I'm well below the mentioned range.
Is anything unclear now? What am I supposed to learn? Oh well.
Imagine this scenario: the GeForce renders frames for the left AND right eye; these flow to the iGPU in Direct Mode; there is a CONSTANT DELAY imposed by the iGPU. In the case of DLP-Link we don't even care, as it results only in mouse lag. In the case of the funny 3D Vision IR emitter, we tweak the delay.
All in all, the lack of a direct HDMI output from the GeForce may result in mouse lag at best, in the case of a DLP-Link device.
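If the pipes analogy is unclear, here is a toy model of it in plain C++ (the 2ms copy cost is a number I made up; the point is only that a constant stage shifts arrival times without changing frame spacing):
[code]
// Toy model of the "pipes" argument: a fixed extra copy stage delays every
// frame by the same constant, so the spacing between frames (the effective
// refresh rate) is unchanged. Purely illustrative numbers.
#include <cstdio>

int main() {
    const double period_ms = 1000.0 / 120.0; // 120 Hz output
    const double igpu_copy_ms = 2.0;         // hypothetical constant iGPU hop
    double prev_arrival = igpu_copy_ms;      // frame 0's arrival time
    for (int frame = 1; frame <= 5; ++frame) {
        const double leaves_dgpu = frame * period_ms;
        const double arrives_panel = leaves_dgpu + igpu_copy_ms;
        std::printf("frame %d arrives at %.2f ms (spacing %.2f ms)\n",
                    frame, arrives_panel, arrives_panel - prev_arrival);
        prev_arrival = arrives_panel;
    }
}
[/code]
Every spacing prints as 8.33 ms: the constant hop cancels out of the frame-to-frame interval.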
Vega, you fuckwit. You're like a dog with a frisbee; let go, FFS.
If you have a Ferrari and a Lada parked next to each other in the same garage, you can't use the Ferrari engine to power the Lada!
Hmmm, why is that? I hear you ask.
It's because the Ferrari engine isn't physically connected to the Lada's wheels.
Just like how your HDMI port isn't physically connected to your NVIDIA graphics card.
This ain't going to get fixed!
You bought the wrong tool for the job. Put your hand in your pocket and swallow your pride.
At least now you'll know what not to do next time.
That's it. End of. Now sling your hook.
@GibsonRed: Thank you sincerely for your comment; it made me smile a lot, especially the 'letting go' part :P :D
You see, in computer science there is a huge difference between the hardware part of a system and the software part.
Here, in this particular situation, the Ferrari engine (the NVIDIA GPU) has a complete physical connection to the Lada's wheels (the external display): electrons, bits, bytes if you will, flow through these very well documented components, programmed using very well documented APIs.
And so it is perfectly possible to modify a software driver to allow this to work.
Looks like my revolt did not get traction. I'm leaving.
[quote="vega4"]and so it is perfectly possible to modify a software driver to allow for this to work.[/quote]So what are you waiting for? Figure out what's stopping it from working and fix it...
You were told from the beginning that this is a user-to-user forum... users trying to help other users. Requesting stuff may work in other sections of the forum, but around here any requests fall on deaf ears, or should I say blind eyes...
We're just 3D Vision users trying to help each other out and keep 3D Vision alive, whether it's through fixing games, Profiles, finding hacks and tweaks, etc...
Probably just so that he can tell everyone that he did his best and it's all on Nvidia and has nothing to do with Intel, Windows or Asus.
lol, at first he was saying that Nvidia needed to render to Intel's buffer. He has since edited his post and changed his tune. That's comical.
If he were any kind of teacher, as he states, he would be open to learning the reason and contacting Nvidia. He probably blames the automaker for his car running poorly, but has never changed the oil or had a tune-up. He complains that it struggles going uphill but goes great downhill, so obviously there's something wrong with the way they made the car.
No idea why he refuses to contact the different entities involved and somehow thinks that touting that he's a CS scientist is going to impress us into making it work for him. Too funny.
[quote="D-Man11"]
lol, at first he was saying that Nvidia needed to render to Intel's buffer. He has since edited his post and changed his tune. That's comical.[/quote]
I was supposed to be out, and I do not want to start a hate war, but after feeling sorry for you, this was the only logical sentence in your post, so I thought I would respond to it.
Before you make a public laughingstock of yourself, please at least read some docs on the technology you are referring to. Yes, I did write that, and I haven't changed my tune, nor did I edit my post to change that specific part, at least not on purpose.
Almost all new laptops seem to come with Optimus, which was designed by NVIDIA. This seems to cause various problems for developers:
a) Latency, because the discrete GPU is only used to render. [u]The frame then needs to be copied over to the buffer of the Intel on-chip GPU in order to be sent to the HDMI output, which adds latency.[/u]
b) Since the HDMI output works off the Intel on-chip GPU, older (Optimus chip/iGPU) pairs *USED TO* be limited to 60Hz, which is [u]no longer the case[/u].
c) Then go and read about Optimus Direct and Extended modes.
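For anyone unfamiliar with the path described in (a), here is a schematic C++ sketch of it. The types are stand-ins I made up for illustration, not any real driver API; the point is simply the extra per-frame copy hop between the two GPUs:
[code]
// Schematic of the Optimus frame path: the discrete GPU only renders; each
// finished frame is copied into a buffer owned by the Intel iGPU, which
// drives the HDMI output. All types here are hypothetical stand-ins.
#include <cstdint>
#include <cstdio>
#include <vector>

struct Frame { std::vector<uint32_t> pixels; };

struct DiscreteGpu {
    Frame render(int n) {  // the GeForce does the heavy rendering
        return Frame{std::vector<uint32_t>(1280 * 720, n)};
    }
};

struct IntegratedGpu {
    Frame scanout;                  // buffer the HDMI port scans out
    void receive(const Frame& f) {  // the extra Optimus copy hop
        scanout = f;                // a constant per-frame copy cost
    }
    void present(int n) { std::printf("frame %d out over HDMI\n", n); }
};

int main() {
    DiscreteGpu dgpu;
    IntegratedGpu igpu;
    for (int n = 0; n < 3; ++n) {
        Frame f = dgpu.render(n); // 1) render on the discrete GPU
        igpu.receive(f);          // 2) copy into the iGPU's buffer
        igpu.present(n);          // 3) iGPU scans out over HDMI
    }
}
[/code]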
Out I am. I thought I would raise your awareness of the problem, so that more people would be aware of its nature and maybe start banging on NVIDIA's door in large numbers. But it seems NVIDIA has become a sort of 'goddess' around here, and people like D-Man11 have sworn their lives to protect it.
Please stay in peace and continue helping each other.
vega4
A victim of conceit.
Computer science degrees are a dime a dozen here in the Bay Area. I don't see how this would wow anyone but an ignorant person.
Really, put your degree to work and fix it yourself.
[quote="BlueSkyDefender"]vega4
A victim of conceit.
Computer Science Degrees are a dime a dozen here in the bayarea. I don't see how this would wow anyone but, a ignorant person.
But, really put your Degree to work and fix it your self.[/quote]
Reverse engineering this would be too time-consuming. It would also require an update after each NVIDIA update. I used to be into that kind of fun for years while developing a bot for WoW, and I'm actually the main person behind the development of a bot for Star Citizen. I most probably could. I have neither the time nor the intention. There are multiple projects I'm into on a daily basis, from simple ones to military grade. Most of the time I know what I'm saying. Join the revolt or stay put.
@vega4: I'm always interested in learning new things too, and I apologize for writing you off so early. Your comment about coding being easy is deeply insulting to a professional programmer like myself, so I assumed you were just another troll. Here's my current project: [url]https://github.com/bo3b/3Dmigoto[/url] I defy anyone to claim this is 'easy'.
I'm not clear on what your experiment was that makes you believe that Optimus runs at 120Hz. I've never heard of that, but don't have that equipment and cannot test it directly. If you are talking about an actual experiment, and not a high-level concept of how it 'should' work, I'd like to know more.
Are we talking about throughput, or latency? From my earlier and possibly out of date reading of Optimus, the problem was throughput.
It's also worth noting that NVidia locks out laptops from being able to do 3D on the internal screen. No idea why, but it's not possible to even do Discover red/cyan mode on the internal display. You need to connect an external monitor.
So, it's entirely possible that Optimus restrictions are some arbitrary NVidia lock, like that, or the 720p cap.
Optimus and G-Sync are mutually exclusive. Just buy a laptop with G-Sync and you won't have this problem. I see that most high-end gaming laptops are moving toward G-Sync and away from the awful Optimus.
The interesting experiment is to run something like Tomb Raider through Optimus to a 3D TV or projector, and see if you are getting sufficient performance. Tomb Raider supports Side-by-Side format, and does not require 3D Vision. There are a handful of other games that do the same.
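To illustrate why Side-by-Side sidesteps the whole driver question: the game packs both eyes into one ordinary frame, so nothing in the Optimus chain needs to know it is stereo, and the TV or projector does the unpacking. A minimal buffer-layout sketch in C++ (no real graphics API; sizes and fill values are arbitrary):
[code]
// Pack two half-width eye images into one ordinary SBS frame; to the
// display chain this is just a normal 2D image. Illustrative only.
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <vector>

std::vector<uint8_t> pack_sbs(const std::vector<uint8_t>& left,
                              const std::vector<uint8_t>& right,
                              int half_w, int h) {
    std::vector<uint8_t> out(2 * half_w * h);
    for (int y = 0; y < h; ++y) {
        // left eye fills the left half of each row, right eye the right half
        std::copy(left.begin() + y * half_w, left.begin() + (y + 1) * half_w,
                  out.begin() + y * 2 * half_w);
        std::copy(right.begin() + y * half_w, right.begin() + (y + 1) * half_w,
                  out.begin() + y * 2 * half_w + half_w);
    }
    return out;
}

int main() {
    const int half_w = 640, h = 720; // a 1280x720 SBS frame
    std::vector<uint8_t> left(half_w * h, 0xAA), right(half_w * h, 0xBB);
    const auto frame = pack_sbs(left, right, half_w, h);
    std::printf("SBS frame: %zu bytes\n", frame.size());
}
[/code]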