Please add driver/software support for HDMI 2.0 (or at least 1.4b) to 3DTV Play for the GTX 980, which already has the HDMI 2.0 hardware. Note that a number of UHD TVs capable of 1080p60 3D exist, including both my HDMI 2.0 passive LG 55UB9500 and my HDMI 1.4b+ passive Sony 65X900A (capable 1.4b displays have been available for almost a year). Also note that my AMD 7970 provides full-frame 1080p30 3D, while my AMD R9-290X (with Catalyst 14.12) actually provides full HDMI 1.4b support - full-frame 1080p 60 fps 3D - when using TriDef's Ignition set to HDMI mode (HD3D)! Games like CoD Advanced Warfare look great with the 290X and Ignition, but Ignition does not support some new games like Dragon Age Inquisition. Please update 3DTV Play to reflect current display industry trends and capabilities.
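For anyone wondering whether full-frame 1080p60 3D even fits within HDMI 1.4b, here is a rough back-of-the-envelope check in Python. It assumes the standard CEA-861 1080p60 timing (2200x1125 total pixels per frame) and the commonly cited ~340 MHz TMDS clock ceiling for HDMI 1.4 - treat these as approximations, not spec quotes:

```python
# Rough pixel-clock estimate for frame-packed 1080p60 3D over HDMI.
# Assumes CEA-861 1080p60 timing (2200 x 1125 total) and HDMI 1.4's
# commonly cited ~340 MHz TMDS clock ceiling (both are assumptions here).
H_TOTAL, V_TOTAL, FPS = 2200, 1125, 60

clock_2d = H_TOTAL * V_TOTAL * FPS   # plain 1080p60: 148.5 MHz
clock_3d = clock_2d * 2              # frame packing doubles the frame: 297 MHz

print(f"1080p60 frame-packed 3D needs ~{clock_3d / 1e6:.1f} MHz, "
      f"vs ~340 MHz available in HDMI 1.4")
```

If those numbers hold, the format is a bandwidth fit for a 1.4b link; the missing piece is driver support.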
Guys, nobody from Nvidia reads these threads. YOU HAVE to complain to Nvidia directly. Open a support ticket, and then you can post links to relevant threads. I already complained months ago and opened a ticket, but got the usual BS Nvidia reply. The more people complain, the better the chance of it getting addressed.
Thanks for the suggestion. I already linked a Customer Care support request to this discussion thread to formalize this technology update request. I also talked directly to phone support before purchasing the 980. Note that AMD can now support 1080p 60 fps full-frame 3D via HDMI - others can use this information in their support tickets as well...
[quote="Conan481"]Guys nobody from Nvidia reads these threads. YOU HAVE to complain to nvidia directly. Open a support ticket and then you can post some links to relevant threads. I've already complained months ago and opened a ticket but got the usually BS nvidia reply. The more people complain, the better the change of it getting addressed.[/quote]
That doesn't work. I opened a ticket with them over a month ago and they never replied.
Also, we need more than just HDMI 2.0 support. We also need side-by-side support like TriDef has.
With side-by-side 3D support, you can feed a 3D signal at 4K resolution to a Samsung active 3D 4K TV and the TV will produce a 3D picture at 4K resolution (1920x2160 per eye - the highest-resolution 3D you can get on any display, television or monitor, on the market).
Only TriDef will allow you to play in 3D at twice 1080p resolution on a Samsung 4K TV, thanks to its support for side-by-side mode at 4K resolution over HDMI 2.0. I'm not content with just getting 1920x1080 full-frame 3D via HDMI on my 4K TV when the TV can accept a 4K side-by-side 3D signal and deliver a 1920x2160 3D picture (that's over 4 million pixels per eye - a huge improvement over 1080p's 2 million).
So even if NVIDIA does update 3DTV Play to allow 1920x1080 @ 60 Hz over HDMI, it still won't be enough. I want to achieve the same 3D resolution I can with TriDef, and the only way that happens is NVIDIA updating 3DTV Play to output a side-by-side 3D signal at 4K resolution over HDMI. I didn't buy a 4K TV just to watch 3D content at 1080p resolution, but I am pretty content being able to watch it at twice 1080p.
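To make the comparison concrete, here is a minimal Python sketch of the per-eye pixel counts for the packing modes discussed in this thread (the mode names are informal labels, not HDMI spec terminology):

```python
# Per-eye resolution for common stereo packing modes at a given signal resolution.
def per_eye(width, height, mode):
    """Halve the signal along whichever axis the packing mode splits."""
    if mode == "side-by-side":   # left/right halves, one per eye
        return width // 2, height
    if mode == "top-bottom":     # top/bottom halves, one per eye
        return width, height // 2
    if mode == "frame-packed":   # two full frames per stereo pair
        return width, height
    raise ValueError(mode)

for (w, h), mode in [((1920, 1080), "frame-packed"),  # full-frame 1080p 3D
                     ((3840, 2160), "side-by-side"),  # TriDef on a 4K set
                     ((3840, 2160), "top-bottom")]:
    ew, eh = per_eye(w, h, mode)
    print(f"{w}x{h} {mode:12s} -> {ew}x{eh} per eye ({ew * eh / 1e6:.1f} MP)")
```

Output: 1920x1080 per eye for frame packing (2.1 MP), 1920x2160 per eye for side-by-side at 4K (4.1 MP), and 3840x1080 per eye for top-bottom at 4K (4.1 MP).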
Here is sample footage of Watch Dogs being played using TriDef at 4K resolution in 3D side-by-side mode:
https://www.youtube.com/watch?v=2dMVAdhWzwY
I don't know why NVIDIA is content with letting its competitor outclass them like this. They must be banking on most NVIDIA users being ignorant of the existence of TriDef, or thinking TriDef is only for AMD GPUs.
Xizer, side-by-side would be awesome for 4K TV as well.
So what resolution do you have to render the game at? 4K? Most computer systems would really struggle to get an acceptable frame rate at 4K res. I doubt my 970 SLI would be good enough for something like Watch Dogs. I'd rather have 1080p frame-packed for performance, and I'm looking at passive 4K, which would only do 1080p to each eye anyway.
What kind of specs are you running, and how is the ghosting with the Samsung active 3D? I've seen many active 3D sets and they all ghost more than I'd like.
Just scanned the TriDef forums, which I haven't done in a while (I already own TriDef), and it appears that TriDef is finished. No new profiles for many months, no mods responding to questions, etc. TriDef is most likely dead, and we are stuck with Nvidia. 3D Vision has effectively been relegated to crappy CM mode and would be officially dead if not for some of the great guys on here doing fixes for us.
I was hoping that Far Cry 4 would get a TriDef profile, but again, no new TriDef software or profiles in the past 6 months.
Shame.
Note that HDMI 2.0 compliance for 3DTV Play should enable interleaved/FPR 4K display on my passive LG, where each eye receives 3840x1080 pixels at 60 fps.
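For anyone unfamiliar with how a passive FPR panel gets to those numbers, here is a toy Python/NumPy sketch of row interleaving: the panel's polarizing film steers even rows to one eye and odd rows to the other, so a 3840x2160 panel shows each eye a 3840x1080 image (illustrative only - the TV does this in hardware):

```python
import numpy as np

# Toy model of passive FPR output on a 3840x2160 panel: the left image lands
# on even rows, the right image on odd rows, so each eye sees 3840x1080.
left = np.zeros((1080, 3840), dtype=np.uint8)        # left-eye frame (black)
right = np.full((1080, 3840), 255, dtype=np.uint8)   # right-eye frame (white)

panel = np.empty((2160, 3840), dtype=np.uint8)
panel[0::2] = left    # even rows -> left eye, through the polarizer
panel[1::2] = right   # odd rows  -> right eye

print(panel.shape, "panel;", left.shape, "seen per eye")
```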
Yes, unfortunately TriDef is struggling - the reason for my recent interest in 3DTV Play. If you search the TriDef forums, you might notice that I have provided a number of user profiles over time.
Still not sure why 4K has any serious draw for people. If it's TV distance, the extra pixels aren't visible, and it just makes your hardware struggle.
If it's monitor distance, I can understand the desire for > 1080p. For Watch Dogs in particular, though, there is no chance I'd want to play it in CM when we have a true-3D fix for it. Trading halos and text glitching for higher resolution is not particularly interesting.
Xizer's video there is using TriDef CM.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
Those of us who aren't blind prefer higher resolution and a few artifacts to lower-than-console-peasant resolutions with no artifacts (seriously, even the console peasants aren't playing at 720p resolution anymore - 720p resolution is now BELOW what the console peasants are gaming at - think long and hard about that one).
720p resolution is literally unplayable once you've become accustomed to higher resolutions. I will sacrifice just about every graphical effect before I sacrifice resolution. I can absolutely see the difference between 720p and 1080p, and between 1080p and 4K, and I can spot it from far away.
Go visit an optometrist, get your vision fixed, and you will be raging at low resolutions just like the rest of us. Or you could continue going through the world with subpar vision and enjoy the 'ignorance is bliss' that comes with it. In some ways I find myself envious of those with poor vision: you never have to be bothered by compression artifacts (which are positively RAMPANT in Internet video and HDTV broadcasts), you can buy cheaper displays because you cannot tell a quality display from a budget one, and your dating life is easier since you can have much lower standards for the visual appearance of your mate.
@Xizer: There is no call for insults. Assuming that everyone should think exactly as you do is a mistake. Everyone is different, everyone has different preferences.
My vision is superb. I have corrected 20/20 vision with contacts, and I can also tell the difference between 4K screens and 1080p screens, and I just don't care. When we get down to Oculus Rift resolution, I do care.
More to the point- you have yet to grasp the concept of pixel-arc-seconds. This is the third time we've talked about this, and you seem uninterested in learning anything new.
For other readers interested in learning, and people finding this thread, the wikipedia discussion gives some great background on the angular resolving power of the human eye. [url]http://en.wikipedia.org/wiki/Visual_acuity[/url]
Here is a good graph from CarltonBale, a well respected home theater writer: [url]http://s3.carltonbale.com/resolution_chart.html[/url]
And a home theater discussion that is decent: [url]http://referencehometheater.com/2013/commentary/4k-calculator/[/url]
Depending upon factors like TV size and viewing distance, no human has the ability to discern those pixels.
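For readers who want to plug in their own screen size and seating distance, here is a small Python calculator built on the ~1 arcminute 20/20 threshold from the visual acuity article linked above (the threshold is a rule of thumb, and the example figures are just illustrations):

```python
import math

def arcmin_per_pixel(diagonal_in, h_pixels, distance_in, aspect=(16, 9)):
    """Angular width of one pixel, in arcminutes, seen from distance_in."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)  # panel width from diagonal
    pitch = width_in / h_pixels                    # pixel pitch, inches
    return math.degrees(2 * math.atan(pitch / (2 * distance_in))) * 60

# Example: 65" 16:9 TV viewed from 5 feet (60 inches).
print(f"1080p: {arcmin_per_pixel(65, 1920, 60):.2f} arcmin/px")  # ~1.69
print(f"4K:    {arcmin_per_pixel(65, 3840, 60):.2f} arcmin/px")  # ~0.85
```

Values above ~1 arcminute are resolvable by a 20/20 eye; values below it generally are not, which is why the answer flips depending on the exact size and distance.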
Edit: It's also worth noting that frame-sequential 720p is actually 2x the resolution, because you get two full screens for every frame, one for each eye. The slight angle between your eyes gives you an effective doubling of resolution. At my viewing distance to the projector, I can see the pixels in 2D; in 3D, I cannot. Don't be blinded by looking only at the numbers.
If you want to play in crappy CM at 4K with no SLI, knock yourself out. From my perspective, that is an absolutely terrible experience, but if you like it, that's what matters- not convincing someone else you are right.
[quote="Conan481"]Xizer. Side by Side would be awesome for 4K TV as well.
So what resolution to you have to render the game at? 4K? Most computer systems would really struggle with getting an acceptable frame rate at 4K res. I doubt my 970 SLI would be good enough for something like watchdogs. I'd rather have 1080p frame packed for performance and I'm looking for Passive 4K which would only do 1080p to each eye anyway.[/quote]
Agree - even 980 SLI would struggle with some games, since two complete 4K frames would need to be rendered, even if interleaved or packed side-by-side for display.
You might consider an LG or Sony passive 4K UTV (set to gaming mode) for outstanding 1080p 3D display. My prior picture quality/ghosting standard was my Panasonic VT25 active plasma (this thicker, first-gen Panasonic design had better ghosting performance than its successors, by the way, and around 20 ms input lag), but both of my newer passive UTV displays provide better 3D display (with more input lag).
Until 3DTV Play adds HDMI 1.4b support (2.0 preferred), gamers with 4K UTVs might also consider getting an AMD R9-290X (I suggest the Arctic hybrid cooler mod to reduce noise - see http://www.tridef.com/forum/viewtopic.php?f=11&t=4278) and Ignition, to experience a truly immersive 3D gaming environment in a large number of games. 1080p60 looks great on a large screen with a game like Advanced Warfare (see [url]http://www.tridef.com/forum/viewtopic.php?f=9&t=4884&p=23850&hilit=advanced+warfare#p23850[/url]). Since Ignition (and 3D Vision) no longer supports recent games like Shadow of Mordor, but the broader NVIDIA community does, I purchased a GTX 980 and 3DTV Play to enjoy recent games. It is very difficult to go back to 24 fps (I prefer resolution to smooth motion). 3DTV Play should really be upgraded to match 2015 industry standards - and 3D Vision should start supporting new games again...
[quote="Xizer"]Those of us who aren't blind prefer higher resolution and a few artifacts to lower-than-console-peasant resolutions with no artifacts (seriously, even the console peasants aren't playing at 720p resolution anymore - 720p resolution is now BELOW what the console peasants are gaming at - think long and hard about that one).
720p resolution is literally unplayable once you've become accustomed to higher resolutions. I will sacrifice just about every graphical effect before I sacrifice resolution. I can absolutely see the difference between 720p and 1080p, and between 1080p and 4K, and I can spot it from far away.
Go visit an optometrist, get your vision fixed, and you will be raging at low resolutions just like the rest of us. Or you could continue going through the world with subpar vision and enjoy the 'ignorance is bliss' that comes with it. In some ways I find myself envious of those with poor vision: you never have to be bothered by compression artifacts (which are positively RAMPANT in Internet video and HDTV broadcasts), you can buy cheaper displays because you cannot tell a quality display from a budget one, and your dating life is easier since you can have much lower standards for the visual appearance of your mate.[/quote]
WOW!
What an arsehole!
The truth hurts. Don't get butthurt just because I tell it like it is. You're only punishing yourself if you don't take my advice to get your eyes checked.
This is the response I get every time I suggest someone may need to visit an optometrist after they say something ignorant ("The human eye can't even see 4K details!!!1" "This YIFY encode looks great to me!" "Pffft, console games look just fine, PC graphics are placebo!").
Nothing seems to trigger the human defense response more than someone informing you that there is something wrong with you. The inability to take constructive criticism from your fellow human beings will be your downfall, my friends. You should learn to accept others' advice if you want a happier life.
[quote="bo3b"]
Depending upon factors like TV size and viewing distance, no human has the ability to discern those pixels.
If you want to play in crappy CM at 4K with no SLI, knock yourself out. From my perspective, that is an absolutely terrible experience, but if you like it, that's what matters- not convincing someone else you are right.[/quote]
I don't know how many times we have to go over this.
1) SLI does work.
2) I have a 65" display at a viewing distance of 5'. I absolutely can discern more pixels than 1920x1080.
The thing is, this is a pretty chill board. And talking down to people isn't really how people like to roll here. You can have an opinion, and it might even be the "correct" opinion, but no need to make it personal. Sometimes a post just needs to go through a "softener" machine so it doesn't read so harshly. The same point can be made, but without the condescension.
I just recently bought a 4K 55" 3D TV, and I own 720p and 1080p projectors too. Here are my thoughts.
Image quality difference between a native 720p 120Hz 3D Vision projector and a 4K TV with TriDef (real 3D, not fake)?
Pretty big, but not as drastic as Xizer makes it out to be. A 720p render resolution causes games to have lots of shimmer, crawling and aliasing. However, upping the render resolution - downsampling from 1440p or 1080p to 720p - makes the difference much smaller. Still there, but a surprisingly small difference at normal viewing distance. In my case that was 3-3.5 meters from a 108" screen and 2 meters from the 4K TV.
IQ difference between native 720p and native 1080p projectors?
Currently I can see what 1080p 3D looks like with my 1080p projector, but it's limited to 24Hz only. The difference between 720p (game rendered at 1080p) and native 1080p seems similar to the 1080p-to-4K difference. It's there, but surprisingly subtle. I would upgrade to a 1080p 120Hz projector if there were any, though. Or a 60Hz one with HDMI 2.0, which would support 1080p 60Hz 3D gaming (hypothetically, if Nvidia supported it with 3DTV Play).
Well, which one do I prefer for gaming?
No contest really. I'll choose the 720p projector any day of the week. It's just that much more immersive to play on a big, ghost-free image. It requires heavy AA if running at native 720p, or downsampling, in my opinion. For 4K and 3D, 980 SLI was not enough. Games like Black Flag ran better at 1080p (downsampled to 720p on the projector) with a single 980 than 980 SLI at 4K. The 4K TV and one 980 GPU are going back to the store soon. 20/20 vision.
And on topic: yes, HDMI 2.0 support would be more than welcome!