So for the past year I have been testing HDR games with 3D fixes, and while the games look amazing in 2D with HDR enabled, the picture becomes washed out and dark as soon as 3D is activated. I play on a passive LG OLED, so I use TaB for 3D. Before I activate 3D on the TV I can see that the HDR is present on the split-screen TaB image and it looks great, but the moment I turn the TV's TaB 3D mode on, the picture is ruined. It is not a TV settings issue; something is clearly wrong (the HDR image is incredible in 2D with the same TV settings). Unfortunately it seems to me the issue is a hardware (TV) problem, and the TV's 3D processing is simply not equipped to handle HDR and the extra color data. Maybe LG never expected HDR + 3D to ever be a thing, since Blu-rays are either one or the other and never both... perhaps it is actually impossible to have both?
It is truly a shame if it is just not possible, as anyone who has seen HDR implemented well knows it is a genuine game changer. I can only imagine what Shadow of the Tomb Raider would look like in full HDR 3D. Forced to choose, I will ALWAYS choose 3D over HDR... but a part of me dies inside at having to give up those amazing HDR graphics.
I wonder if it is possible with 3D Vision? Probably no HDR 3D Vision monitors exist, do they? I have always used HDTVs as my monitor, so I am limited to 3DTV Play or 3DMigoto's checkerboard/TaB methods of playing in 3D. More and more PC games are including HDR, but I am not sure we are going to be able to benefit at all...
Use a native interlaced signal and HDR will look great. It might be that it gets disabled when you push the 3D button. I never tested it that way, but I can confirm that it looks great with 3D when using the native resolution.
Intel i7 8086K
Gigabyte GTX 1080 Ti Aorus Extreme
DDR4 2x8 GB 3200 MHz CL14
TV: LG OLED65E6V
Windows 10 64-bit
Have you tried Helifax's great Mass Effect Andromeda mod? It is the only 3D Vision mod I have found that supports HDR and 3D on my E6 passive OLED (also with joker18's latest EDID mod, not TaB mode). The enhanced dynamic range is quite apparent (at least to me), and the 3D is great - I really enjoyed the experience and hope the developers return to Andromeda...
By the way, Shadow of the Tomb Raider 3D only works correctly with my setup in exclusive full-screen mode, DX11 only (DX12 currently crashes my game), normal 3D Vision mode (not SBS). The HDR setting is disabled/greyed out, for some unknown reason.
Shadow of the Tomb Raider only has HDR in DX12 mode. Why? I don't know. I heard Kaldaien complaining about it in his Special K thread on Steam.
What I wonder is (if it's the same case as in ROTTR): why is there no 3D Vision mode in DX12? Isn't there a 3D Vision Direct mode for DX12 or something? Automatic works, at the very least.
[quote="joker18"]Use native interlaced signal and HDR will look great. Might be that it is disabled when you push the 3d button. I never tested it that wa but I can confirm that it looks great with 3D when using native resolution.[/quote]
Oh interesting, so a native interlaced signal works just fine and 3D HDR is the same improvement as 2D HDR? In that case I would need to use the EDID override, correct? I tried that on my old passive Sony TV when I first discovered 3D, and I noticed the ghosting was much worse than with the checkerboard/TaB method, so I removed the EDID at the time.
Maybe it's time I try it with my LG. The only problem is I don't want to play every 3D game at native 4K, since I play a lot of AAA titles that are way too demanding even for a 1080 Ti and would be down to 20 FPS etc. But now you have my curiosity piqued, so I need the EDID override for my TV and to set Nvidia to 3DTV Play to output an interlaced 4K image with true HDR? (It's been a while since I tested all of that stuff; I've been using TaB for 90% of fixes.)
How many games support both HDR and 3D?
Crap, I tried to play FF XV in 3D Vision and am getting terrible performance.
Dropping down from highest settings to high at least gave me 10-20 FPS at worst.
It was down to 3 FPS using highest.
I'm enabling all the NVIDIA graphics features, which are probably very demanding.
I easily use up all of my 8 GB of VRAM.
The game is also using 22 GB of system RAM.
I think this is the first time I've been above 16 GB of memory usage.
Thanks to everybody using my assembler; it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?
Battlefield 1, Call of Duty WW2, Mass Effect Andromeda.
These games also have a resolution scale setting; I usually lower that to 50-55 and the game still looks good and runs much better.
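As a rough sanity check on those numbers (my own arithmetic, not something stated in the posts above, and assuming the slider scales each axis, as Frostbite titles like BF1 and Andromeda do): 50% of 3840x2160 renders internally at 1920x1080, only a quarter of the native pixels, and 55% renders at roughly 2112x1188, about 30% of the pixels. That is why the frame-rate gain is so large while the upscaled output can still look reasonably sharp.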
I just tried BF1 and got decent performance, 30+ FPS v-synced, using ultra settings on my RTX 2080.
I'm having trouble with the EDID override: the image is not interlaced as it should be, and it appears that the left and right images have been blended together somehow. When I don't use the EDID override I can turn HDR on and off in the Windows display properties. HDR is currently enabled all the time, even in games not supporting HDR, as well as in Windows overall.
Thanks for pointing me in the right direction, guys. I was able to confirm HDR works by just switching the 3DMigoto output to reverse interlaced and setting the resolution to the display's native 4K. This means I don't have to use the EDID override and be locked into 4K, and I can keep things flexible: play demanding games in 1440p/60 TaB, some in 4K TaB, and any HDR titles in 4K HDR interlaced. The best of all worlds.
This is definitely going to pay dividends as more and more PC titles launch with HDR support. Now we just need cards that can do 8K/60 for our 4K/60 3D needs :)
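For anyone wanting to reproduce this, the output mode in most 3DMigoto-based fixes is set in the d3dx.ini shipped with the fix. The excerpt below is only a sketch of the common 3DVision2SBS custom-shader setup; the exact section names and mode numbers (the "x = 7" here is purely illustrative) differ between fix versions, so check the comments in your own d3dx.ini before changing anything.
[code]
; Hypothetical d3dx.ini excerpt - key names and mode numbers are illustrative, not canonical.
[Present]
; Run the bundled custom shader that converts 3D Vision output to other layouts
run = CustomShader3DVision2SBS

[Constants]
; Output mode read by the custom shader. Typical fixes offer side-by-side,
; top-and-bottom (TaB) and line-interlaced layouts, plus "reverse" variants
; that swap the eyes. For HDR on a passive 4K panel, pick the reverse
; line-interlaced mode and run the game at the native 3840x2160 so each
; output line lands on exactly one panel row.
x = 7
[/code]
The practical takeaway from the posts above is simply that interlaced output only behaves correctly at the panel's native resolution, while TaB remains the fallback for games too heavy to run at 4K.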
No one has performed a formal assessment of 3DMigoto line-interlaced (or TaB) versus the EDID override, including an image quality comparison and performance differences. In the past, when I did a little informal testing, I thought the EDID override approach looked slightly better, and I did not notice any significant performance difference.
From a usability perspective, the EDID method enables automatic 3D - so the user never needs to touch the TV remote for 3D Vision gaming.
LG's 2016 E6 or C6 OLED series (the last year LG made 3D TVs) - but they are becoming very difficult to find now (almost all stock is gone).
NVIDIA really needs to create a new generation of 3D Vision glasses supporting interfaces like 1080p/120 Hz over HDMI (adopted by LG's subsequent E/C models after abandoning 3D)...
I think that SuperDepth 3D might also be a viable alternative for HDR in some games (as well as better FPS if needed), though I'm in no way certain of it.
My assumption comes from this post by BlueSkyDef, the author of SuperDepth 3D:
[url]https://forums.geforce.com/default/topic/1061791/3d-vision/rtx-2080-incoming-/post/5877300/#5877300[/url]
[quote="BlueSkyDef"]There are holes in starman arguments. I am the guy who plays at 4k60 3D + HDR. I benefit from SLI and a faster card. I even would benefit from just sidegrade, to be honest, due to HDR performance improvements and the separation of the int & floating point pipelines on the new card that will help with my shader. People forget at around 4k 5-10fps makes a difference. when trying to stay in the 42-60 range of playable.
SLI due 4k and My large gaming library around half would benefit from The performance boost that would come from SLI and at 4k it's needed.
I am not limited by 3D Vision's since I play using my own 3D Shader on almost all games I can get it running.
I also have an HDR 3D screen with no 3D Vision support. So my situation is different. So understanding that generalizations can bite you in the ass if you trying to prove a point.
Ya, Nvidia is charging a lot of money. Even I didn't just go buy any of the new cards on release day or preorders. Just Because I would benefit more from SLI than a sidegrade for the here and now. Still didn't mean I went out and got a 2nd 1080ti. Even if SLI would be cheaper for me too. I got my 1080ti for around 470. I am sure I can get a 2nd one for more or less the same.
But, SLI Bad.... ohhhhh...... Not in my case. It will still come out cheaper than a single 2080ti. It would be cheaper for me to sidegrade and get a 2080 and keep the same performance in normal rasterized games and a boost in games that have HDR + a boost in FPS in my shader.
It's strange..... I have options. I don't care for RayTracing right now. But, it had to be reintroduced at some point. RayTracing has been around for a long ass time, it's not new. Support and games will come in time. But, this would not be the reason I would buy this card. DLSS is something that I am interested in. Since in some games that I have not played yet have heavy requirements at 4k. DLSS, in this case, would help out with overall FPS. In FFXI it will blow my single 1080ti out of the water only when it comes to this game because of DLSS.
So I am still trying to make up my mind if I should GO SLI or just SideGrade. This is not easy since I see the benefit with both.[/quote]