Thanks for the info, D-Man!
The 21:9 concern was related to not having SBS/TB through 3Dmigoto, which I use to play at higher resolution and refresh rate combinations than 3D Vision would allow. No HMD in the mix.
I thought TriDef's frame-sequential output was tied to AMD (AMD HD3D) cards at some point. Maybe I'm just plain wrong, or maybe that changed.
I had software for 3D Blu-rays in the past, but I lost that solution when I moved from Win7 to Win10. A couple of years ago, dual-boot was the only PC-based solution I found for 3D Blu-rays. That was enough of a hassle that I reinstated a PS3 exclusively for 3D playback.
3DTV Play / TriDef 3D
EVGA GTX 1070 (x2 SLI)
Win 10 Pro
i5-3570k @ 4.2GHz
8GB RAM
Optoma UHD51A
If you are using a 3D Vision monitor with TriDef, you need Nvidia's stereoscopic drivers active to drive the USB emitter/glasses.
Otherwise, 3D-capable displays will work with any supported format available in TriDef.
Of course, 3Dmigoto is built around Nvidia's driver heuristics, so AFAIK you could not use the community fixes with TriDef.
I haven't used TriDef in several years because the license is tied to a Radeon GPU that I have, so I'm uncertain about 3Dmigoto's compatibility with it.
There are trials for both PowerDVD and WinDVD, although someone recently mentioned that the CyberLink trial no longer offers 3D playback; it must be purchased first? Dunno
TriDef's licence is tied to the computer. I don't know exactly which component they detect to differentiate between machines, but I have changed graphics cards from my old Radeon HD7970 to an R9 300 (when it died), and then to my current Nvidia GTX 1070 without issues.
I have a full TriDef licence though.
A few years ago there was an AMD-subsidized offer for TriDef licences at half price (for Radeon users only, of course).
I have not tried TriDef's frame-sequential mode with my Nvidia card. I used it a couple of times on my AMD card (lots of AMD driver bugs with HD3D; new drivers kept breaking the output over and over).
If I remember the reports correctly, on Nvidia systems TriDef could not access the 3D output directly, so the 3D Vision driver had to be active and rendering 3D at the same time as TriDef (possibly creating huge headaches over which driver was doing what).
I still use TriDef for some games, but I render in side-by-side full-res (3840x1080).
For Blu-ray playback, I also abandoned software players very early on.
I'm using an external Sony Blu-ray 3D player. I also noticed that my internet service provider's TV decoder has a Blu-ray player, and it does support Blu-ray 3D as well, but does not detect the 3D tags of mp4 and mkv files over the network. (Free SAS, France, modem Freebox v6 "Revolution")
Passive 3D forever
110" DIY dual-projection system
2x Epson EH-TW3500 (1080p) + Linear Polarizers (SPAR)
XtremScreen Daylight 2.0
VNS Geobox501 signal converter
I've got the UHD51A and an HDMI 2.0b cable connected to a GTX 1070 on Windows 10. I've been unsuccessful getting this 1080p/120Hz solution to work so far.
In reading back through this thread, I'm unclear whether this should work with 3DTV Play or whether I need a 3D Vision kit. I get the "resolution not compatible with 3D Vision" error for 1920x1080 @ 120Hz. It tells me to use 720p@60 or 1080p@24 as usual.
Will 1920x1080@120 even work with 3DTV play? Is an EDID override necessary?
3DTV Play is limited to 1080p @ 24Hz. You need the 3D Vision kit to unlock 1080p @ 120Hz (or just the emitter, but it's not sold separately; if you're lucky you'll find one cheap on eBay). The emitter (pyramid) serves as a license dongle for unlocking 3D Vision / Generic CRT mode.
[quote="Dazzle233"]Is an EDID override necessary?[/quote]
Yes, in addition to the 3D Vision kit you need an EDID override. Look on page 10; I've posted the EDID of the Optoma UHD 40/50, which might also work for the UHD51A. The UHD 40/50 use Generic CRT mode by default, which is exactly the same as 3D Vision (just another name). Nobody can guarantee that this will work for the UHD51A, but if you don't try you never know. At least the UHD51A seems very similar to the UHD50, so chances might be high. You would be the first to find out - nobody has tried before.
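For anyone preparing such an override: when hand-editing an EDID dump (e.g. one exported from CRU), it's easy to break the per-block checksum, which must be recomputed after any change. A minimal sanity-check sketch in Python; the structural rules (fixed 8-byte header, 128-byte blocks, each block summing to 0 mod 256) come from the VESA EDID spec, the function name and file handling are just illustrative:

```python
# Sanity-check an EDID binary dump before using it as an override.
# Each 128-byte block must sum to 0 mod 256; the base block starts
# with a fixed 8-byte header.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def check_edid(data: bytes) -> bool:
    # EDID data comes in whole 128-byte blocks.
    if len(data) == 0 or len(data) % 128 != 0:
        return False
    # Base block must begin with the magic header.
    if data[:8] != EDID_HEADER:
        return False
    # Every block carries its own checksum in the last byte,
    # chosen so the whole block sums to 0 mod 256.
    for i in range(0, len(data), 128):
        if sum(data[i:i + 128]) % 256 != 0:
            return False
    return True
```

Usage would be as simple as `check_edid(open("override.bin", "rb").read())` before pointing the driver at the file.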
ASUS ROG Strix GeForce GTX 1080 | Core I7-7700K | 16GB RAM | Win10 Pro x64
Asus ROG Swift PG278Q 3D Vision Monitor
Optoma UHD 40 3D Vision Projector
Paypal donations for 3D Fix Manager: duselpaul86@gmx.de
Thanks for the lightning fast reply, Paul! Good to know the details...bad to know more money needs to be spent :/. I'll see what I can find on ebay.
Does the version of the kit matter? 3D Vision vs. 3D Vision 2?
Thanks!
Sorry, I don't know if that matters (I only have Kit 2), but maybe other people here can tell if there are technical differences between the 3D Vision 1 and 2 emitters. If the pyramid is the same in both, it wouldn't matter (at least it's very likely). I only know that the 3D Vision glasses differ between Kit 1 and 2, but that wouldn't matter as you can use cheap DLP glasses anyway. In fact, for my UHD 40 only DLP glasses work. I tried my 3D Vision glasses with the Optoma UHD 40 and they weren't synced properly. With DLP glasses it works perfectly.
The only difference between 3D Vision 1 and 2 is the glasses. The emitter is exactly the same.
There are always some on eBay, and if you're lucky you'll manage to get some cheap. I just paid £40 on eBay for a full 3D Vision kit with glasses plus an extra pair of glasses. That's a version 1 kit though; V2 glasses tend to be more expensive.
Have you been able to at least get the projector working in 2d desktop mode at 1080p/120Hz?
Gigabyte RTX2080TI Gaming OC, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D Vision kit, Xpand X104 glasses, Corsair HX1000i, Win 10 Pro 64 / Win 7 64
https://www.3dmark.com/fs/9529310
Another bullet bitten....I just bought the 3D Vision 2 kit new. Won't be here until mid week.
I have gotten 1080p @ 120Hz in 2D with no problems. 3840x2160 @ 60Hz, the projector's native resolution, has been giving me issues though. 4096x2160 @ 60Hz is smooth with no issues and makes for lovely 2D content. I play ultrawide as much as possible, forcing 3440x1440. When I do that, some games send the projector a 3840x2160 signal and others send 4096x2160.
When the projector gets a 3840x2160 signal from the PC it tends to split the screen vertically at about 33%/66%. It's weird...the left half of the screen is on the right side and gets cut off at about 1/3 of the screen width. The other 2/3 of the screen is on the left side.
When it gets 4096x2160 I have absolutely no issues. Grid 2 and Gears of War Ultimate Edition were a couple I tried that signaled 4096x2160 @ 60Hz. That resolution functions more like the native resolution of the projector so far. Looks awesome when it works.
For dx11 games 3dmigoto has a resolution override that you could try.
For dx9 try chiri's resolution override or DXOverride.
Or you can edit your EDID so that only 3440x1440 shows as an available resolution, use that override only when you need it, then simply revert to the original EDID when not needed.
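If it helps, the 3Dmigoto route is just an ini edit. A minimal sketch of the relevant section of d3dx.ini (section and key names recalled from 3Dmigoto's stock config, so verify against the comments in your own copy):

```ini
; d3dx.ini - ask 3Dmigoto to override the game's resolution
; with the ultrawide mode regardless of what the game requests
[Device]
width = 3440
height = 1440
```

Drop the edited d3dx.ini next to the game executable along with the usual 3Dmigoto DLLs, as for any community fix.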
FYI, some Observations...
1. So my 8-year-old DLP projector has started developing dead mirror pixels, showing up as white and black dots. I replaced the DMD chip ~4 years ago, but a replacement from a same-ish batch has started producing the same problem, leading to another replacement last week. That's 3 bad chips in 8 years.
FYI, for anyone replacing DMD chips: there are different versions which are compatible. The later the revision, the better it copes with lifetime fatigue and resists bad "pixels".
E.g. original DMD chip = 1280-6038B
rev1 = 1280-6039B
...
latest revision compatible with the projector = 1280-6439B
So, this is an FYI to anyone replacing a defective DMD chip: research and get the latest revision for a marginally higher price, lest you get the same original and end up stuck with the same problem a little while down the line.
2. If the projector fails, aside from the DMD it is very likely to be the power supply board's capacitors. Desoldering them and fitting all-new electrolytic capacitors will more likely than not fix the problem.
3. As mentioned elsewhere, IMHO, 720p/800p 4x DSR to 1440p/1600p produces far superior results to native 1080p @ 120Hz with standard AA (MSAA etc.). An added advantage is that DSR is universally supported, and games look stunning in combination with a strong sharpening filter if using ReShade / SweetFX, or if the game has one built in.
I'm finding my 2x 1080s in SLI just about enough for 1600p 3DV gaming.
Of course, the only time the 1080p result is better is if 1080p is also 4x DSR'd to 3840x2160. But unless you have 2x 2080 Tis in SLI, it's not realistic to play modern games at that resolution in 3DV and expect 60 FPS.
Thus, I find myself in a very awkward situation: my 8-year-old technology produces better overall results than modern cutting-edge technology, when combined with available/affordable hardware that realistically enables play at a smooth 60 FPS. Strange...
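To put rough numbers on the trade-off in point 3, the pixel arithmetic below (purely illustrative, performance claims are the post's own) shows why 4x DSR from 800p costs about 2x the shading work of native 1080p, while 4x DSR from 1080p costs 4x:

```python
# Pixels shaded per eye for the resolutions discussed above.
# 4x DSR renders at 2x width and 2x height, then downscales.

def pixels(w, h):
    return w * h

native_800p  = pixels(1280, 800)
dsr_1600p    = pixels(2560, 1600)   # 4x DSR from 800p
native_1080p = pixels(1920, 1080)
dsr_4k       = pixels(3840, 2160)   # 4x DSR from 1080p

# 4x DSR always costs exactly 4x the native pixel count...
print(dsr_1600p / native_800p)      # 4.0
# ...1600p DSR is ~2x the shading work of native 1080p...
print(dsr_1600p / native_1080p)     # ~1.98
# ...and 4K DSR is 4x native 1080p - hence the 2080 Ti SLI remark.
print(dsr_4k / native_1080p)        # 4.0
```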
4. I have personally been after the "4K" 0.66" DMDs with native 2716x1528 @ 120Hz and RGBRGB wheels, such as perhaps the Optoma UHD65 (though not as pricey). Presumably none exist yet that are tested at 2716x1528 @ 120Hz? :)
Certainly TI says their chip is intrinsically designed that way:
[url]https://e2e.ti.com/support/dlp/f/94/t/727474[/url]
[url]http://www.ti.com/product/DLP660TE[/url]
It's bizarre that manufacturers don't allow the native 2716x1528 resolution at 120Hz for gamers etc. without hacks - it costs them nothing but would be a huge feature.
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
Hi RAGEdemon,
Sorry to hear that your DMD has started producing dead pixels. I know it's not a cheap alternative, but maybe you can give the Optoma UHD 40/50 a chance (1080p@120Hz).
[quote="RAGEdemon"]3. As mentioned elsewhere, IMHO, 720p/800p 4x DSR to 1440p/1600p produces far superior results than native 1080p @ 120Hz with standard AA (MSAA etc). An added advantage is that DSR is universally supported, and games look stunning in combination with a high sharpening filter if using ReShade / SweetFX, or if the game has one built-in.[/quote]
May I ask if you have already seen 1080p @ 120Hz 3D in action on a DLP projector, or where you got the information that 720p @ 1440p DSR looks that much better than native 1080p @ 120Hz?
Of course, edges are a little smoother when calculating the image at 1440p DSR internally. But the overall image looks much clearer and sharper at native 1080p @ 120Hz. Before the Optoma UHD 40 I had a 720p projector (BenQ W710ST) and also used 1440p DSR with it, but it never looked satisfying to me.
Believe me when I say that 1080p@120Hz is a great improvement over 720p DSR. I also tried 4K and 1440p DSR with the Optoma UHD 40, and 4K DSR looks stunning of course. But I must say you don't lose that much sharpness when using 1440p DSR with it. At least on a monitor this DSR setting would look much worse.
I don't know if you are interested in all the other aspects of a modern 4K projector, but in my opinion the Optoma UHD 40 is the best all-rounder. Text is superbly sharp in 4K, browsing the internet on a projector is finally fun, and of course you can play in 4K HDR, which is sometimes the only choice for some games (like God of War on PS4).
I'm very happy with this projector and would never want to go back to 720p DSR.
Thanks for the reply Pauldusler.
I compared it to my brother's Asus 144Hz 1080p gaming monitor.
I compared (different houses so not side-by-side):
4x DSR 1600p + sharpening on the 800p projector
Vs.
1080p native + 4xMSAA etc.
I felt this was a somewhat fair comparison as, performance-wise, 1600p is generally around the same as 1080p + 4xMSAA.
Perhaps it was my 4m distance to the screen vs. normal monitor viewing distance, but the monitor's AA was not acceptable to me - it was uneven in the games in which it even worked; I had been spoiled by 4x DSR. I also tried turning on sharpening while using the monitor in an effort to be more objective, but unfortunately it looked horrible without DSR, simply because it only amplified the jaggies.
I have no doubt that 4K with HDR etc on the UHD40 looks absolutely amazing. Nor do I doubt for a second that 1080p DSR to 4K @120Hz blows 800p DSR/1600p out of the water.
Unfortunately, after the testing, I find myself uninterested in:
a. 1080p + MSAA solution.
b. And 1080p + DSR to 4k @120Hz...
...simply due to the performance needed to muster 60 FPS:
After being spoiled by DSR, I know I can't play any modern game at the UHD40's 4K DSR @ 120Hz at 60 FPS - at least not without 2x 2080 Tis - or else I would jump on the UHD40 or similar in an instant. The best I can manage with SLI 1080s right now is ~1600p, which is OK for an 800p projector, but would be amazing on a native 2716x1528 at an affordable price :)
I guess everyone has their own personal taste :)
Jumped from the BenQ 1070 to the Optoma UHD40.
I agree with Paul, the jump is big going from 720P DSR (2540x2160) to native 1080P.
I replayed several games and I could notice much more detail.
With sim racing, keeping focused on the clipping points became easier.
Running with a GTX 980, so performance-wise it is an improvement because I barely use DSR now.
Because the quality is still better than 720P with DSR.
Found myself wandering in Flotsam for a good hour looking at all those details.
Also, the projector came with a function called "ultra detail".
I am mostly skeptical of extra functions and leave them off.
But turning this one up felt like a built-in ReShade in the projector; I like it.
Though if you want to do 4K DSR, that is indeed very performance-heavy.
I agree that everyone has their personal taste :)
Still very happy with the jump to the UHD40, and just want to say thanks again to this community.
Without this community 3D would be dead, and I would never have bought the UHD40, which I got mainly because it can do 1080P 3D :)
Thanks for the reply frank.binnendijk,
When you said "the jump is big going from 720P DSR ([color="green"]2540x2160[/color]) to Native 1080P", did you mean "2560x1440", i.e. 4x DSR?
Also, about "ultra detail" - how well does it work with 3DV? It seems to be a simple sharpening filter, which is exactly what I want - are there different levels of power? 3Dmigoto + ReShade is a PITA to get working together; this feature alone might push me to seriously consider this projector :)
My new DMD chip ought to arrive in a couple of weeks...
In the meantime, I have started a thread here inquiring about .66" 120Hz capability for anyone else interested... I think D-Man11 was at one point.
[url]https://www.avsforum.com/forum/68-digital-projectors-under-3-000-usd-msrp/3020898-any-confirmed-120hz-input-66-2716x1528-projectors-dispelling-no-3d-myth.html[/url]
When you said "the jump is big going from 720P DSR (2540x2160) to Native 1080P", did you mean "2560x1440", i.e. 4x DSR?
Also, about "ultra detail" - how well does it work with 3DV? It seems to be a simple sharpening filter, wich is exactly what I want - are there different levels of power? 3DMigoto + Reshade is a PITA to get to work together; this feature alone might push me to seriously consider this projector :)
My new DMD chip ought to arrive in a couple of weeks...
In the meantime, I have started a thread here inquiring about .66" 120Hz capability for anyone else interested... I think D-Man11 was at one point.
The 21:9 concern was related to not having SBS/TB through 3D migoto, which I use to play in higher resolution and refresh rate combinations than 3D Vision would allow. No HMD in the mix.
I thought TriDef's frame sequential output was tied to AMD (AMD HD3D) cards at some point. Maybe I'm just plain wrong, or maybe that changed.
I had software for 3D Blu Ray's in the past, but I lost that solution when I moved from Win7 to Win10. A couple of years ago, dual-boot was the only PC-based solution I found for 3D Blu-rays. That was enough of a hassle to reinstate a PS3 exclusively for 3D playback.
3DTV Play / TriDef 3D
EVGA GTX 1070 (x2 SLI)
Win 10 Pro
i5-3570k @ 4.2GHz
8GB RAM
Optoma UHD51A
Otherwise, 3D capable displays will work with any supported format available on TriDef.
Of course 3Dmigoto is based on Nvidia's driver heuristics, so you could not use community fixes AFAIK.
I haven't used TriDef in several years because the license is tied to a Radeon Gpu that I have, so I'm uncertain about 3Dmigotos compatibility with it.
There are trials for both PowerDVD and WinDVD, although someone recently mentioned that the Cyberlink trial no longer offers 3D playback, it must be purchased first? Dunno
I have a full Tridef licence though.
A few years ago there was an AMD subsidized offer for Tridef licences at half price (for Radeon users only of course).
I have not tried Tridef's frame sequential mode with my nvidia card. I used it a couple of times on my AMD card (lots of AMD driver bugs with HD3D bugs, new drivers kept breaking the output over and over)
If I remember the reports correctly on an Nvidia systems, Tridef could not access the 3D output directly, so the 3D Vision driver had to be active and rendering 3D at the same time as Tridef, (possibly creating huge headaches with which driver doing what).
I still use Tridef for some games, but I render in Side by Side Full-res (3840x1080).
For BluRay playback, I also abandoned software players very early on.
I'm using an external Sony BluRay 3D player. I also noticed that my internet service provider's TV decoder has a BluRay player, and it does support BluRay 3D as well, but does not detect the 3D tags of mp4 and mkv files over the network. (Free SAS, France, modem Freebox v6 "Revolution")
Passive 3D forever
110" DIY dual-projection system
2x Epson EH-TW3500 (1080p) + Linear Polarizers (SPAR)
XtremScreen Daylight 2.0
VNS Geobox501 signal converter
In reading back through this thread, I'm unclear whether this should work with 3DTV Play or whether I need a 3D Vision kit. I get the "resolution not compatible with 3D Vision" error for 1920x1080 @ 120Hz; it tells me to use 720p@60 or 1080p@24 as usual.
Will 1920x1080@120 even work with 3DTV Play? Is an EDID override necessary?
3DTV Play / TriDef 3D
EVGA GTX 1070 (x2 SLI)
Win 10 Pro
i5-3570k @ 4.2GHz
8GB RAM
Optoma UHD51A
Yes, in addition to the 3D Vision kit you need an EDID override. Look on page 10; I've posted the EDID of the Optoma UHD 40/50, which might also work for the UHD51A. The UHD 40/50 use Generic CRT mode by default, which is exactly the same as 3D Vision (just another name). Nobody can guarantee that this will work for the UHD51A, but if you don't try you never know. At least the UHD51A seems to be very similar to the UHD 50, so chances should be good. You would be the first to find out; nobody has tried before.
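For anyone curious what an override actually changes: an EDID is just a block of bytes you can decode yourself to check whether a 120Hz mode is advertised. Here's a rough Python sketch of decoding one 18-byte Detailed Timing Descriptor per the VESA EDID layout; the sample bytes below are made up for illustration (a plausible 1080p@120Hz reduced-blanking mode), not taken from the actual UHD 40/50 file:

```python
def parse_dtd(dtd: bytes):
    """Decode resolution and refresh rate from an 18-byte EDID
    Detailed Timing Descriptor (VESA EDID 1.3 layout)."""
    # Bytes 0-1: pixel clock in units of 10 kHz, little-endian.
    pclk = int.from_bytes(dtd[0:2], "little") * 10_000
    # Bytes 2-4: horizontal active/blanking (low 8 bits + split high nibbles).
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    h_blank  = dtd[3] | ((dtd[4] & 0x0F) << 8)
    # Bytes 5-7: vertical active/blanking, same packing scheme.
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    v_blank  = dtd[6] | ((dtd[7] & 0x0F) << 8)
    # Refresh rate = pixel clock / total pixels per frame.
    refresh = pclk / ((h_active + h_blank) * (v_active + v_blank))
    return h_active, v_active, round(refresh, 2)

# Hypothetical DTD: 285.54 MHz clock, 1920x1080 active, 160/64 blanking.
dtd = bytes([0x8A, 0x6F, 0x80, 0xA0, 0x70, 0x38, 0x40, 0x40] + [0] * 10)
print(parse_dtd(dtd))  # (1920, 1080, 120.0)
```

If the override EDID you flash doesn't contain a descriptor like this for 1080p@120, the driver won't offer the mode, which is essentially what the "resolution not compatible" error is telling you.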
ASUS ROG Strix GeForce GTX 1080 | Core I7-7700K | 16GB RAM | Win10 Pro x64
Asus ROG Swift PG278Q 3D Vision Monitor
Optoma UHD 40 3D Vision Projector
Paypal donations for 3D Fix Manager: duselpaul86@gmx.de
Does the version of the kit matter? 3D Vision vs. 3D Vision 2?
Thanks!
There are always some on eBay, and if you're lucky you'll manage to get one cheap. I just paid £40 on eBay for a full 3D Vision kit plus an extra pair of glasses. That's a version 1 kit though; V2 glasses tend to be more expensive.
Have you been able to at least get the projector working in 2d desktop mode at 1080p/120Hz?
Gigabyte RTX2080TI Gaming OC, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64
https://www.3dmark.com/fs/9529310
I have gotten 1080p @ 120Hz in 2D with no problems. 3840x2160 @ 60Hz, the projector's native resolution, has been giving me issues though. 4096x2160 @ 60Hz is smooth with no issues and makes for lovely 2D content. I play ultrawide as much as possible, forcing 3440x1440. When I do that, some games send the projector a 3840x2160 signal and others send 4096x2160.
When the projector gets a 3840x2160 signal from the PC it tends to split the screen vertically at about 33%/66%. It's weird...the left half of the screen is on the right side and gets cut off at about 1/3 of the screen width. The other 2/3 of the screen is on the left side.
When it gets 4096x2160 I have absolutely no issues. Grid 2 and Gears of War Ultimate Edition were a couple I tried that signaled 4096x2160 @ 60Hz. That resolution functions more like the native resolution of the projector so far. Looks awesome when it works.
For DX9, try Chiri's resolution override or DXOverride.
Or you can edit your EDID so that only 3440x1440 shows as an available resolution, use it only when you need it, then simply revert to the original EDID when not needed.
1. So my 8-year-old DLP projector has started developing dead mirror pixels, showing as white and black dots. I replaced the DMD chip ~4 years ago, and a replacement from the same-ish batch, fitted last week, has started producing the same problem. That's 3 bad chips in 8 years.
FYI, for anyone replacing DMD chips: there are several compatible versions. The later the revision, the better it copes with lifetime fatigue and resists bad "pixels".
E.g. Original DMD chip = 1280-6038B
rev1 = 1280-6039B
.
.
latest revision which is compatible with the projector is 1280-6439B
So, this is an FYI to anyone replacing a defective DMD chip: research and get the latest revision for a marginally higher price, lest you get the same original version and end up with the same problem a little while down the line.
2. If the projector fails, aside from the DMD it is very likely to be the power supply board's electrolytic capacitors. Desoldering them and fitting all-new capacitors will more likely than not fix the problem.
3. As mentioned elsewhere, IMHO 720p/800p 4x DSR to 1440p/1600p produces far superior results to native 1080p @ 120Hz with standard AA (MSAA etc.). An added advantage is that DSR is universally supported, and games look stunning in combination with a strong sharpening filter via ReShade/SweetFX, or a built-in one if the game has it.
I'm finding my 2x 1080s in SLI just about enough for 1600p 3DV gaming.
Of course, the only time the 1080p result is better is when 1080p is itself 4x DSR'd to 3840x2160. But unless you have 2x 2080 Tis in SLI, it's not realistic to play modern games at that resolution in 3DV and expect 60FPS.
Thus, I find myself in a very awkward situation: my 8-year-old technology produces better overall results than modern cutting-edge technology, given what available/affordable hardware can realistically drive at a smooth 60FPS. Strange...
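To put some back-of-the-envelope numbers on that trade-off, here's a quick pixel-throughput comparison of the modes being discussed, assuming frame-sequential 3D (60 FPS per eye, so 120 rendered frames per second). The mode labels are just mine for illustration:

```python
# Shaded pixels per second at 120 frames/s (60 FPS per eye in
# frame-sequential 3D) for each rendering mode discussed above.
modes = {
    "800p native":    (1280, 800),
    "1600p (4x DSR)": (2560, 1600),
    "1080p native":   (1920, 1080),
    "4K (4x DSR)":    (3840, 2160),
}
throughput = {name: w * h * 120 for name, (w, h) in modes.items()}
for name, px in throughput.items():
    print(f"{name:16s} {px / 1e9:.2f} Gpx/s")
```

The 4K DSR load comes out at roughly double the 1600p DSR load (and 4x native 1080p), which lines up with SLI 1080s topping out around 1600p while 4K DSR at 60FPS would want something like 2x 2080 Tis.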
4. I have personally been after the "4K" 0.66" DMDs with 2716x1528 native @ 120Hz and RGBRGB wheels, such as perhaps the Optoma UHD65 (though not as pricey). Presumably none of these have been tested at 2716x1528 @ 120Hz yet? :)
Certainly TI says their chip is intrinsically designed that way:
https://e2e.ti.com/support/dlp/f/94/t/727474
http://www.ti.com/product/DLP660TE
It's bizarre that manufacturers don't expose the native 2716x1528 resolution at 120Hz for gamers etc. without hacks; it would cost them nothing but would be a huge feature.
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
Sorry to hear that your DMD has started producing dead pixels. I know it's not a cheap alternative, but maybe you can give the Optoma UHD 40/50 a chance (1080p@120Hz).
May I ask whether you have actually seen 1080p@120Hz 3D in action on a DLP projector, or where you got the information that 720p with 1440p DSR looks that much better than native 1080p@120Hz?
Of course edges are a little smoother when the image is calculated internally at 1440p DSR. But the overall image looks much clearer and sharper with native 1080p@120Hz. Before getting the Optoma UHD 40 I had a 720p projector (BenQ W710ST) and also used 1440p DSR with it, but it never looked satisfying to me.
Believe me when I say that 1080p@120Hz is a great improvement over 720p DSR. I also tried 4K and 1440p DSR with the Optoma UHD 40, and 4K DSR looks stunning, of course. But I must say you don't lose that much sharpness when using 1440p DSR with it; on a monitor, at least, that DSR setting would look much worse.
I don't know if you are interested in all the other aspects of a modern 4K projector, but in my opinion the Optoma UHD 40 is the best all-rounder. Text is superbly sharp in 4K, browsing the internet on a projector is finally fun, and of course you can play in 4K HDR, which is sometimes the only choice for certain games (like God of War on PS4).
I'm very happy with this projector and would never want to go back to 720p DSR.
I compared it to my brother's Asus 144Hz 1080p gaming monitor.
I compared (different houses so not side-by-side):
4x DSR 1600p + sharpening on the 800p projector
Vs.
1080p native + 4xMSAA etc.
I felt this was a somewhat fair comparison as performance wise, 1600p is generally around the same as 1080p + 4xMSAA.
Perhaps it was my 4m distance to my screen vs. normal monitor viewing distance, but the monitor's AA was not acceptable to me: it was uneven in the games in which it even worked, and I had been spoiled by 4x DSR. I also tried turning on sharpening while using the monitor, in an effort to be more objective, but unfortunately it looked horrible without DSR, simply because it only amplified the jaggies.
I have no doubt that 4K with HDR etc on the UHD40 looks absolutely amazing. Nor do I doubt for a second that 1080p DSR to 4K @120Hz blows 800p DSR/1600p out of the water.
Unfortunately, after the testing, I find myself uninterested in:
a. 1080p + MSAA solution.
b. And 1080p + DSR to 4k @120Hz...
...simply due to the performance needed to muster 60 FPS:
After being spoiled by DSR, I only know that I can't play any modern game at the UHD40's 4K DSR @ 120Hz at 60FPS, at least not without two 2080 Tis, or else I would jump on the UHD40 or similar in an instant. The best I can manage with SLI 1080s right now is ~1600p, which is OK for an 800p projector, but would be amazing with a 2716x1528 native panel at an affordable price :)
I guess everyone has their own personal taste :)
I agree with Paul, the jump is big going from 720P DSR (2540x2160) to Native 1080P.
I replayed several games and could notice much more detail.
With sim racing, keeping focused on the clipping points became easier.
Running with a GTX 980, so performance-wise it is an improvement because I barely use DSR now, and the quality is still better than on 720P with DSR.
Found myself wandering in Flotsam for a good hour looking at all those details.
The projector also came with an "ultra detail" function.
I am mostly skeptical of extra functions and leave them off.
But turning this one up felt like a built-in ReShade in the projector; I like it.
Though if you want to do 4K DSR, that is indeed very performance-heavy.
I agree that everyone has their own personal taste :)
Still very happy with the jump to the UHD40, and I just want to say thanks again to this community.
Without this community 3D would be dead, and I would never have bought the UHD40, which I chose mainly because it can do 1080P 3D :)
When you said "the jump is big going from 720P DSR (2540x2160) to Native 1080P", did you mean "2560x1440", i.e. 4x DSR?
Also, about "ultra detail": how well does it work with 3DV? It seems to be a simple sharpening filter, which is exactly what I want. Are there different strength levels? 3DMigoto + ReShade is a PITA to get working together; this feature alone might push me to seriously consider this projector :)
My new DMD chip ought to arrive in a couple of weeks...
In the meantime, I have started a thread here inquiring about .66" 120Hz capability for anyone else interested... I think D-Man11 was at one point.
https://www.avsforum.com/forum/68-digital-projectors-under-3-000-usd-msrp/3020898-any-confirmed-120hz-input-66-2716x1528-projectors-dispelling-no-3d-myth.html