If 3D Vision could output 4K resolution @ 60 Hz, would one get a 3D picture from a 4K TV equal to 1080p@1
For people like me who already own a 3DTV Play license, the EDID mod is a very viable workaround for NVIDIA's failure to update a badly aging product. After 5 years, 3DTV Play really needs an update, including the ability to run a 60 Hz desktop (for game keyboard/controller input) asynchronously alongside 24 Hz devices like active 1080p HDTVs, and full HDMI 2.0 support.
If NVIDIA truly wanted to focus on profit (selling high-end GPUs), they should be promoting passive 4K TV gaming. With current technology, a 4K 3D TV is the most resource-intensive setup any 3D Vision user can run, more demanding than even Surround setups (4x 1080p worth of pixels versus 3x 1080p).
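Rough numbers behind that claim - back-of-the-envelope arithmetic only, assuming two full-resolution views are rendered per stereo frame before any display-side packing:

```python
# Back-of-the-envelope pixel counts per rendered stereo frame
# (two full-resolution views before any display-side packing).
views = 2  # left eye + right eye

uhd_3d      = 3840 * 2160 * views      # 4K 3D
surround_3d = 3 * 1920 * 1080 * views  # 3x 1080p Surround 3D
single_3d   = 1920 * 1080 * views      # single 1080p 3D

print(f"4K 3D:       {uhd_3d:>12,} px")       # 16,588,800
print(f"Surround 3D: {surround_3d:>12,} px")  # 12,441,600
print(f"1080p 3D:    {single_3d:>12,} px")    #  4,147,200
print(f"4K 3D is {uhd_3d // single_3d}x a single 1080p stereo frame")  # 4x
```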
Unfortunately, with a passive display half of that rendered data is thrown away during interlace formatting, because current GPU processing is rigidly locked to a square pixel grid in display space. With smarter display processing (a first step toward the foveated rendering planned for future-generation HMDs), the GPU could render ONLY the pixels that actually reach the screen in interlaced, SBS, and TB formats, rather than data that gets thrown away.
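A minimal sketch of the waste being described, assuming both eye views are rendered at full resolution and then packed into a row-interleaved frame for a passive (film-pattern-retarder) panel. The function name and the row parity are illustrative, not anything from NVIDIA's driver:

```python
import numpy as np

def pack_row_interleaved(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Pack two full-resolution views into one line-interlaced frame.

    Even rows come from the left view, odd rows from the right view
    (polarization order varies by panel). Half of each rendered view
    is simply discarded.
    """
    assert left.shape == right.shape
    packed = np.empty_like(left)
    packed[0::2] = left[0::2]    # keep the left view's even rows
    packed[1::2] = right[1::2]   # keep the right view's odd rows
    return packed                # left's odd rows and right's even rows are wasted

# Example with dummy 4K frames (2160 rows, 3 color channels):
h, w = 2160, 3840
left  = np.zeros((h, w, 3), dtype=np.uint8)
right = np.full((h, w, 3), 255, dtype=np.uint8)
frame = pack_row_interleaved(left, right)

rendered  = 2 * h * w  # pixels rendered across both views
displayed = h * w      # pixels that survive interleaving
print(f"discarded fraction: {1 - displayed / rendered:.0%}")  # 50%
```

Half of every rendered view never reaches the glass - exactly the work a display-aware renderer could skip.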
Ideally, NVIDIA's HDMI 2.0 update to 3DTV Play would include more efficient GPU processing (and aliasing reduction) when subsampled formats like SBS, TB, and interlaced modes are used for display - this matters far more on 4K displays, where it would meaningfully reduce GPU load...
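To make the subsampling point concrete, here is a sketch of half-width side-by-side packing, contrasting naive column decimation with a simple two-tap average as a stand-in for the kind of aliasing reduction mentioned above (top-bottom is the same idea along rows). The function and the filter are illustrative assumptions, not 3DTV Play's actual pipeline:

```python
import numpy as np

def pack_sbs_half(left: np.ndarray, right: np.ndarray,
                  antialias: bool = True) -> np.ndarray:
    """Pack two full-width views into one half-width side-by-side frame.

    With antialias=False, every other column is simply dropped (cheap but
    prone to shimmer). With antialias=True, adjacent column pairs are
    averaged - a crude stand-in for proper filtered downsampling.
    """
    def half_width(view: np.ndarray) -> np.ndarray:
        if antialias:
            wide = view.astype(np.uint16)
            return ((wide[:, 0::2] + wide[:, 1::2]) // 2).astype(view.dtype)
        return view[:, 0::2]  # naive decimation: half the columns are wasted

    return np.concatenate([half_width(left), half_width(right)], axis=1)

# Example with dummy 4K views:
h, w = 2160, 3840
left  = np.random.randint(0, 256, (h, w, 3), dtype=np.uint8)
right = np.random.randint(0, 256, (h, w, 3), dtype=np.uint8)
sbs = pack_sbs_half(left, right)
print(sbs.shape)  # (2160, 3840, 3): same pixel budget as a single 2D frame
```

If the driver rendered each eye at the packed (half) resolution in the first place, with proper filtering, the same frame could be produced for roughly half the shading work.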