GTX 970 4K 3D Vision gaming success with LG UB850V Passive TV
Note that the entire 4K image (3840x2160) must be processed for line-interleaved, SBS, and TAB 3D formats, even though half the data is thrown away. VERY inefficient, but this is the way current 3D display formats operate...
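To illustrate the packing, here is a minimal sketch (Python/NumPy, not the driver's actual code path): both eye views are rendered at the full 3840x2160, then every other row of each is simply thrown away when the line-interleaved frame is built.
[code]
import numpy as np

W, H = 3840, 2160

# Stand-ins for the two full-resolution eye renders (RGB, 8-bit).
left = np.zeros((H, W, 3), dtype=np.uint8)
right = np.zeros((H, W, 3), dtype=np.uint8)

# Pack into one line-interleaved frame for the passive display.
packed = np.empty_like(left)
packed[0::2] = left[0::2]    # even rows go to one eye's polarisation
packed[1::2] = right[1::2]   # odd rows go to the other eye

# Both eyes were rendered in full, but only half of each survives.
rendered_rows = 2 * H
displayed_rows = H
print(f"rendered {rendered_rows} rows, displayed {displayed_rows} "
      f"({rendered_rows - displayed_rows} rows discarded per frame)")
[/code]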
Uhm, so I tried it with my laptop again in another store with the old driver override. Normal 3D Vision doesn't work at all, so I had to force it with side-by-side in GTA. The viewing angle was terrible and it looked bad and blurry at 1080p, but I guess it was 3D. Is the viewing angle supposed to be this horrible with the LG OLED55E6V? My old TV looks better when I test 3D on my laptop with it. Can a bad viewing angle be caused by my laptop? I thought viewing angle was display dependent.
Can I even get a decent representation of how the 3D would perform with a 770M laptop on this TV? It worked well enough on my old TV at 1080p, so why not on the new one at 1080p? I don't understand.
I haven't got the 3D TV yet, but I did some testing using TriDef top/bottom and my existing PC monitor.
First of all, I loaded up a game in 2D mode at 1440p, and at 60 fps the GPU usage was about 45%. I then used TriDef in top/bottom mode, and at 60 fps the GPU usage was also 45%... so it seems that with TriDef top/bottom you lose half the vertical resolution but don't suffer a performance hit? Of course there will be the significant hit of 4K (giving 3840x1080 per eye), but not an additional hit from 3D, unless I am misunderstanding something here.
For the sake of argument, I created a custom resolution of 2560x720 in the Nvidia control panel (to mirror 3840x1080) and set this in game. GPU usage basically halved to around 20%, but of course, as mentioned, I would either get black bars or a distorted image when upscaled to 16:9.
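To make the pixel arithmetic behind that observation explicit (a rough sketch, not a measurement), this is how many megapixels per frame the GPU has to render in each mode:
[code]
# With top/bottom (TAB) packing each eye is drawn at half vertical
# resolution, so the total matches a single 2D frame, which fits the
# unchanged GPU usage reported above.
def mpix(w, h):
    return w * h / 1e6

modes = {
    "2D @ 2560x1440":                                   mpix(2560, 1440),
    "TAB 3D @ 2560x1440 (2 x 2560x720)":                2 * mpix(2560, 720),
    "TAB 3D @ 3840x2160 (2 x 3840x1080)":               2 * mpix(3840, 1080),
    "Full-res-per-eye 3D @ 3840x2160 (2 x 3840x2160)":  2 * mpix(3840, 2160),
}
for name, mp in modes.items():
    print(f"{name:50s} {mp:6.2f} Mpix/frame")
[/code]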
[quote="SoundBlstr"]From my understanding, there's no way around the pretty severe performance hit, because the PC effectively has to draw twice the frames for your left/right eyes. However, like you indicated, I'm not sure if the math is exactly 50% of 4K 2D, since it seems like the vertical resolution should be cut in half. I'd have to know more about the technology though to know if the graphics card actually takes advantage of this fact, or whether the scene is still rendered at full res and just sent to the monitor every other line.
Either way, from what I'm seeing, any change in resolution translates into that being your [i]final[/i] res, so changing to 3840x1080 would just give you a letterboxed final picture. Since my display is currently working, I'm hesitant to do too much testing with custom resolutions for fear that I'd mess something up.
Like I mentioned in my full post in the other thread ([url]https://forums.geforce.com/default/topic/943900/3d-vision/search-information-about-edid-for-3dvision-on-lg-oled-4k-55ef950v-55ef9500-/6/?offset=88#5069058[/url]), after installing the EDID, I don't get the same options in the Nvidia control panel under 'change resolution'. Specifically, the "Output color format" and "Output dynamic range" fields are disabled. Since "Output color format" was critical to getting a non-ghosted 3D picture, I'm concerned that if I changed resolutions it might reset those settings, and I'd be unable to set them back without messing with the drivers.
I just did an informal test in Witcher 3 at 4K (nearly maxed out settings):
3D on - 16-18 avg FPS
3D off- 30-34 avg FPS
So, that seems to be about a 50% hit.
[/quote]
Which EDID did you use? I'm not able to change "Output color format" and "Output dynamic range". I bought the LG C6 TV and the ghosting is pretty bad with the color bug. I'm almost thinking about refunding the TV if I can't get it to work.
[quote="sebastatu"]Which EDID did you use? I'm not able to change "Output color format" and "Output dynamic range". I bought the LG C6 TV and the ghosting is pretty bad with the color bug. I'm almost thinking about refunding the TV if I can't get it to work.[/quote]
I used this one:
https://drive.google.com/file/d/0BxmDW2IVWe1pTGtJQ1hSM2EzUUk/view
However, I didn't install the EDID until I had first adjusted the "Output color format" and "Output color depth" using the default driver. See here for the order I did things:
[url]https://forums.geforce.com/default/topic/943900/3d-vision/search-information-about-edid-for-3dvision-on-lg-oled-4k-55ef950v-55ef9500-/6/?offset=88#5069058[/url]
I used Trine 2 to make sure 3D was working before even messing with the EDID. That game (along with Trine 1, Trine 3 and Shadwen) allows you to output line-interleaved 3D WITHOUT installing the EDID or otherwise dealing with the Nvidia driver. I'd recommend making sure that works first (demos are available through Steam if you don't own those games).
Here's a brief recap of how things went down for me:
- run Trine 2 in line interlaced (swapped) mode - bad ghosting on red/blue
- adjust TV settings, Nvidia control panel settings and switch HDMI cable (more details in post linked above) then run Trine 2 again...this time 3d worked well without ghosting
- installed EDID fix...now all games worked
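If you want to sanity-check a downloaded EDID binary before installing it as an override, a short script like this works (a sketch; it assumes the file is a raw EDID dump, and the file name is only a placeholder). It checks the fixed 8-byte EDID header and that every 128-byte block sums to 0 mod 256, which is how EDID checksums are defined.
[code]
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def check_edid(path):
    data = open(path, "rb").read()
    # Base block plus optional extension blocks, each 128 bytes.
    if len(data) % 128 != 0 or data[:8] != EDID_HEADER:
        raise ValueError("not a valid EDID dump")
    for i in range(0, len(data), 128):
        if sum(data[i:i + 128]) % 256 != 0:
            raise ValueError(f"bad checksum in block {i // 128}")
    print(f"{path}: {len(data) // 128} block(s), header and checksums OK")

# check_edid("lg_interleaved_edid.bin")  # placeholder file name
[/code]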
Core i7-6700K, 32GB RAM, GeForce RTX 2080 Ti, 2016 LG OLED55C6P, Oculus Rift, Windows 10
[quote="Conan481"]Awesome. I'll try it when I hook up my 65" tomorrow. However I noticed that the swap usually happens exactly after 30 min for me. So maybe you're just not playing long enough. Either way, hopefully your setting is a fix.[/quote]
To follow up on this earlier issue, I've now played several games for hours at a time without the eye swap happening. I'm pretty sure that setting did the trick for me.
Core i7-6700K, 32GB RAM, GeForce RTX 2080 Ti, 2016 LG OLED55C6P, Oculus Rift, Windows 10
I have a little update on my set (55UC970V). I neglected it for quite a while and today I decided to use it for a little bit for gaming.
I also updated to the latest firmware (5.05.55).
I did a measurement against my projector and found that the LG now has only 15ms more lag than my projector.
I didn't have the chance to do a fixed measurement of my projector, but according to reviews it is 32ms, so this takes the LG to a quite acceptable 47ms for the Expert profile.
The Game profile gave me the exact same results, so there is absolutely no reason anymore to use the Game profile and lose image quality.
UB850V and UB950V sets should have the same firmware available (5.05.55).
Intel i7 8086K
Gigabyte GTX 1080Ti Aorus Extreme
DDR4 2x8gb 3200mhz Cl14
TV LG OLED65E6V
Avegant Glyph
Windows 10 64bits
Because when you use Game mode you lose the possibility to properly adjust image color and quality.
I was never satisfied with the color in the Game profile, and it is impossible to calibrate it.
On Expert you have all of this, plus some other image processing options that can be useful.
Intel i7 8086K
Gigabyte GTX 1080Ti Aorus Extreme
DDR4 2x8gb 3200mhz Cl14
TV LG OLED65E6V
Avegant Glyph
Windows 10 64bits
That is what I'm saying. I measured input lag in both profiles and it is the same. On my set it does not hurt 3D at all, and it shouldn't on any set. You don't watch 3D movies in Game mode, right?
Intel i7 8086K
Gigabyte GTX 1080Ti Aorus Extreme
DDR4 2x8gb 3200mhz Cl14
TV LG OLED65E6V
Avegant Glyph
Windows 10 64bits
I took the plunge and bought the LG OLED55E6V, and I got it working fine. I have the red ghosting problem where you can't set chroma 4:4:4, but I bought a new HDMI 2.0b cable that should solve the problem, right? I'm currently using a DisplayPort 1.2 to HDMI cable.
I love how much less ghosting I have now in comparison with my old TV. Thanks a lot, all of you, I am very grateful.
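For context on why the cable/adapter matters here, a rough link-rate check (a sketch assuming the standard CTA-861 4K60 timing of 4400x2250 total at a 594 MHz pixel clock): 4K60 with 8-bit RGB/4:4:4 needs close to the full 18 Gbps of an HDMI 2.0 link, while an HDMI 1.4-class link (10.2 Gbps) can't carry it, so the driver falls back to 4:2:0. Passive DisplayPort-to-HDMI adapters generally top out at HDMI 1.4 rates, so a native HDMI 2.0 output with a good cable should indeed allow 4:4:4.
[code]
# CTA-861 timing for 3840x2160 @ 60 Hz: 4400x2250 total, 594 MHz pixel clock.
pixel_clock_hz = 4400 * 2250 * 60

# TMDS carries 3 channels at 10 bits per 8-bit symbol.
tmds_gbps = pixel_clock_hz * 3 * 10 / 1e9

print(f"4K60 8-bit RGB/4:4:4 needs ~{tmds_gbps:.1f} Gbps "
      f"(HDMI 1.4 tops out at 10.2 Gbps, HDMI 2.0 at 18.0 Gbps)")
[/code]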
[quote="Metaloholic"][quote="ChronicHedgehog"]I have a LG 55EF9500 for sale if anyone's interested. I'd love to keep it but something about it triggers my migraines. Probably going to switch to a dual projector setup with passive filters. Pickup only in Philadelphia, PA. PM for details.[/quote]
I can't say anything else but prepare to crap your pants saying HOLY JEESUS
[/quote]
Ha, ha. Yes, I'm looking forward to it, but I'll be building it slowly. Need to sell the TV first.
4K 55" LG 55EF9500 3D OLED TV for sale! Pickup ONLY in Philadelphia, PA. PM for details.
By the way, did you guys try using GeDoSaTo to render at 1080p and upscale to 4K? It's the most efficient way to use 4K passive 3D TVs. But GeDoSaTo only works with DX9 games. Some DX11 games, however, support internal resolution scaling natively in their graphics settings. If only all of them did... or if only Nvidia's DSR supported a 0.5x mode...
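To put a number on that efficiency claim (arithmetic only, not a GeDoSaTo config): a passive 4K panel can only deliver 3840x1080 to each eye anyway, so rendering at 1920x1080 and upscaling costs roughly a quarter of the pixels of a native per-eye 4K render.
[code]
native_per_eye   = 3840 * 2160   # full-res-per-eye 3D render
upscaled_per_eye = 1920 * 1080   # render target before upscaling to 4K
panel_per_eye    = 3840 * 1080   # lines a passive 4K panel shows each eye

print(f"native 4K render per eye:  {native_per_eye / 1e6:.2f} Mpix")
print(f"1080p render per eye:      {upscaled_per_eye / 1e6:.2f} Mpix "
      f"({native_per_eye // upscaled_per_eye}x fewer)")
print(f"panel output per eye:      {panel_per_eye / 1e6:.2f} Mpix")
[/code]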
Why do you think that the Game profile loses image quality? I think that profile only reduces input lag. Am I wrong?
4K3D on passive LG OLED 4K TV 65C6V, GTX 1080 Ti, Win 8.1 64 Pro, i7-7700, 3D-Vision 2 on Benq LW61-LED PJ. HTC Vive. Panasonic Z-10000 3D Camcorder
Yes, you are right ))
4K3D on passive LG OLED 4K TV 65C6V, GTX 1080 Ti, Win 8.1 64 Pro, i7-7700, 3D-Vision 2 on Benq LW61-LED PJ. HTC Vive. Panasonic Z-10000 3D Camcorder