I've seen that they don't have DVI, but with a DisplayPort-to-DVI adapter, can I assume they're compatible with NVIDIA 3D Vision?
I'm nearly positive they are 3D Vision capable. There's no reason why they wouldn't be.
i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703
Asus Maximus X Hero Z370
MSI Gaming X 1080Ti (2100 mhz OC Watercooled)
8700k (4.7ghz OC Watercooled)
16gb DDR4 3000 Ram
500GB SAMSUNG 860 EVO SERIES SSD M.2
It's never safe to assume, but it is listed on the website, so it should be OK:
https://www.geforce.com/hardware/desktop-gpus/geforce-rtx-2080/specifications
I think Ragedemon pointed to this link a short while ago when I asked a similar question. I didn't want to assume that everything was still supported, especially with such a new architecture.
GTX 1070 SLI, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64https://www.3dmark.com/fs/9529310
[quote="lou4612"]I'm pretty sure the 3D Vision Drive Software is in the latest driver update[/quote]It sure is, but it would be a heavy blow if a new hardware release removed the one feature that keeps many 3D Vision users with Nvidia, not only for the upcoming hardware generation but for older ones too. Nvidia has supported previous hardware generations in new driver releases, and I hope they continue to do so. No offense, but simply noting that the old software is still included doesn't really answer the OP's question, does it?
@rustyk21 btw, what's so new about that architecture? IMO it's not much more than a shrink; actually not even that, considering the dies are roughly 73% (xx80) / 60% (xx80 Ti) larger while moving from 16 nm to 12 nm, yet deliver only about 30% more performance, i.e. 1080 Ti ≈ 2080. Isn't that a shame? The 1080 Ti improved far more over its predecessor. We still have to wait and see about ray-tracing performance, but the technique itself isn't new; I remember playing Wolfenstein (the first version, I believe) with a ray-tracing patch.
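The die-size percentages above can be sanity-checked with a little arithmetic. The die areas used below are commonly cited figures for GP104/TU104 (xx80) and GP102/TU102 (xx80 Ti); treat them as assumptions rather than official specs:

```python
# Rough sanity check of the "bigger die, modest gains" argument.
# Die areas (mm^2) are commonly cited figures, assumed here, not official specs.
die_mm2 = {
    "GTX 1080 (GP104, 16 nm)":    314,
    "RTX 2080 (TU104, 12 nm)":    545,
    "GTX 1080 Ti (GP102, 16 nm)": 471,
    "RTX 2080 Ti (TU102, 12 nm)": 754,
}

def growth(new, old):
    """Percentage increase of `new` over `old`."""
    return 100.0 * (new / old - 1.0)

xx80_growth = growth(die_mm2["RTX 2080 (TU104, 12 nm)"],
                     die_mm2["GTX 1080 (GP104, 16 nm)"])
xx80ti_growth = growth(die_mm2["RTX 2080 Ti (TU102, 12 nm)"],
                       die_mm2["GTX 1080 Ti (GP102, 16 nm)"])

print(f"xx80 die growth:    {xx80_growth:.0f}%")   # ~74%, close to the quoted 73%
print(f"xx80 Ti die growth: {xx80ti_growth:.0f}%") # ~60%, matching the quoted figure
```

So the quoted percentages hold up, assuming those die areas are right; whether the extra area spent on RT and tensor cores was worth it is the actual debate.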
[quote="Teikol"][quote="lou4612"]I'm pretty sure the 3D Vision Drive Software is in the latest driver update[/quote].....
@rustyk21 btw, what's so new about that architecture? IMO it's not much more than a shrink; actually not even that, considering the dies are roughly 73% (xx80) / 60% (xx80 Ti) larger while moving from 16 nm to 12 nm, yet deliver only about 30% more performance, i.e. 1080 Ti ≈ 2080. Isn't that a shame? The 1080 Ti improved far more over its predecessor. We still have to wait and see about ray-tracing performance, but the technique itself isn't new; I remember playing Wolfenstein (the first version, I believe) with a ray-tracing patch.[/quote]
I'm not talking about what constitutes a die shrink; agreed, that's debatable. It's only an enhanced process on an existing node.
I also agree the performance comparison is a shame; I'd love it if this outperformed a 1080 Ti by 50%, but sooner or later a different approach is required. Look at the IPC/core-count issues confronting CPUs; that path is going nowhere.
My point is that the addition of tensor cores etc. makes this a radically different processor from Pascal.
Given that Nvidia is clearly pushing in a different direction, I was just pointing out that it's unwise to assume legacy features are still supported.
What we don't really know is whether Nvidia will support 3D Vision and Compatibility Mode under DX12 and Vulkan.
We do know that both DX12 and Vulkan allow native stereoscopic support to be built into the game itself, since both SDKs provide implementations for it.
Hopefully our 3D Vision monitors don't become worthless once new games use either renderer :(
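For anyone curious what "native stereoscopic support" means in practice: the engine renders each frame twice with laterally offset cameras and asymmetric (off-axis) projection frusta, so objects at the convergence distance land at zero parallax. This sketch shows only the underlying geometry, not any actual DX12/Vulkan SDK calls; the function name and parameters are illustrative:

```python
import math

def stereo_frusta(fov_y_deg, aspect, near, convergence, eye_sep):
    """Asymmetric (off-axis) frustum bounds for a left/right eye pair.

    Returns two (left, right, bottom, top) tuples in glFrustum-style
    near-plane coordinates. Each eye's camera is also translated
    laterally by -/+ eye_sep/2 before rendering.
    """
    half_tan = math.tan(math.radians(fov_y_deg) / 2.0)
    top = near * half_tan
    bottom = -top
    # Half-width of the view at the convergence (zero-parallax) plane.
    a = aspect * half_tan * convergence
    b = a - eye_sep / 2.0
    c = a + eye_sep / 2.0
    scale = near / convergence
    left_eye  = (-b * scale, c * scale, bottom, top)   # camera at -eye_sep/2
    right_eye = (-c * scale, b * scale, bottom, top)   # camera at +eye_sep/2
    return left_eye, right_eye

# Example: 60-degree vertical FOV, 16:9, ~6.5 cm eye separation,
# converging 5 m into the scene.
l, r = stereo_frusta(fov_y_deg=60, aspect=16 / 9, near=0.1,
                     convergence=5.0, eye_sep=0.065)
```

The two frusta are mirror images of each other, which is exactly what a 3D Vision driver used to construct automatically behind the scenes; with DX12/Vulkan the game itself has to do this.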
https://www.3dmark.com/compare/fs/14520125/fs/11807761#
Gaming Rig 1
i7 5820K 3.3ghz (Stock Clock)
GTX 1080 Founders Edition (Stock Clock)
16GB DDR4 2400 RAM
512 SAMSUNG 840 PRO