Upgrade options to maximize the life of official 3D Vision-ready hardware/PC + backups
While we have a separate thread dedicated to gathering ideas and feedback on what can be done to support 3D gaming beyond 2020, this thread is for advice on upgrade options so our official 3D Vision-ready hardware can a) last as long as possible and be a bit future-proof, and b) have enough backups if one part fails.
I have 2 emitters, 3 pairs of glasses, PG278Q and PG278QR monitors, Optoma UHD50 and BenQ HT2050 projectors, an X299 platform with a 7820X CPU, a GTX 1080 Ti, and 32GB RAM. Building the fastest PC possible and having backups is my goal for now. I'd like to gather some ideas on the following upgrades:
1 - Two 2080 Tis in SLI - I checked a lot of SLI reviews and benchmarks. It seems older DX11 titles don't gain much at all, while newer DX12 titles scale fairly well because they're optimized for it. Even if money is no object, the problem is we'll be stuck with an old driver and DX11 titles anyway, unless we figure out how to fix DX12 games somehow. Beyond DX12 is out of the question. The only reason I'm leaning towards SLI is that if one 2080 Ti dies, I have a second as backup (and can sell my 1080 Ti). What do you guys think?
2 - CPU - Cascade Lake-X CPUs will be released in June at Computex according to rumors, and may well be the last CPUs for the X299 platform. My 7820X can't do x16/x16 SLI because it only has 28 PCIe lanes. Speed-wise a 9900K would be ideal, but I'd need to change motherboards, and the 9900K can't do x16/x16 SLI either. Does a faster CPU even matter for 3D Vision, given the bottleneck? What do you guys think?
3 - Monitor - I have the PG278QR as my daily 3D display and the older PG278Q as backup. The problem is the PG278Q (older model) has a vertical scanline issue in 3D and it's very noticeable. If my PG278QR dies, I'm back to scanlines. I'm debating whether I should get another PG278QR or the newer curved PG27VQ. Someone posted pics somewhere here that showed it has slightly less ghosting. Any feedback from PG27VQ users?
DX9 was relevant for a long time; DX10 was short-lived; and DX11 has stuck around alongside DX12 titles. I have a feeling DX12 will be short-lived too. Once DX13 comes and games have no DX11 mode anymore, we're screwed :(
It would be interesting to know how much a dedicated PhysX GPU brings to the table in newer games that actually use Nvidia's PhysX SDK. We know that Unity and Unreal Engine 4 both offer it as an option to developers.
FPS gains can be hit and miss, depending on the game engine, but Volnaiskra demonstrated at one time that they can be "significant":
http://www.volnapc.com/all-posts/how-much-difference-does-a-dedicated-physx-card-make [color="orange"]<---a must read imho[/color]
(Compare the Arkham Origins "SLI Titans + 650" vs "Titan + Titan dedicated PhysX" results.)
[quote="Volnaiskra"]Wow. With a Titan taken off SLI duties and devoted to PhysX, the performance went through the roof. Not so much in the max framerate department, strangely, but who cares - the minimum and average fps is way up (by 60% and 42%, respectively).[/quote]
But keep in mind, there are many games that do not use Nvidia's PhysX SDK. A lot of games use the Havok engine for physics computations alongside whichever game engine they are using. (Havok is just one of many ways to implement physics within a game.)
Another thing to consider, as pointed out by Captain0007, is PCIe bandwidth scaling with multiple GPUs. If you run SLI with an additional PhysX GPU, will you drop to x16/x8/x8, and if so, will this be an issue?
Necropants at one time stated [url=https://forums.geforce.com/default/topic/907141/3d-vision/any-point-in-having-a-dedicated-physx-card-now-thinking-witcher-3-/post/4770585/#4770585]that it did not seem to be an issue.[/url]
But keep in mind, the closer a dedicated PhysX GPU is to your main GPU (both architecture- and memory-wise), the better the gain. Just tossing any old GPU in there could yield very limited gains.
It was discussed [url=https://forums.geforce.com/default/topic/907141/3d-vision/any-point-in-having-a-dedicated-physx-card-now-thinking-witcher-3-/3/]here[/url] previously; if anyone is interested in this topic, perhaps continue it there or start a new thread.
One newer game using Nvidia's APEX/PhysX SDK is Hellblade: Senua's Sacrifice, which also has a VR option. It would be interesting to know if a dedicated PhysX GPU benefits VR as well. Of course, Nvidia's CEO showed that PhysX can run very well on a single GPU when he demonstrated FlameWorks, but that was an "optimized" demo.
But if a game uses Nvidia's HairWorks, FlameWorks, WaveWorks, FaceWorks, or FleX, perhaps a dedicated PhysX GPU might be well worth including in a 2019 3D Vision legacy build.
I wouldn't rely on SLI at this point. It's become less and less reliable, and fewer and fewer games support it, hacked or otherwise.
In fact, some games like Tekken 7 and the Resident Evil 2 remake have issues when using SLI with 3D Vision.
It's probably a good idea for a backup card though.
I haven't bothered to run any tests to see if I've reached the point where the lanes become saturated, and obviously I can't until I inevitably pick up a 2080 Ti (I'm really torn about this; Nvidia doesn't deserve any more of my money).
i7-4790K CPU 4.8Ghz stable overclock.
16 GB RAM Corsair
ASUS Turbo 2080TI
Samsung SSD 840Pro
ASUS Z97-WS3D
Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)
Obutto R3volution.
Windows 10 Pro x64 (Windows 7 dual boot)
[quote="Captain0007"]3 - Monitor - I have the PG278QR as my daily 3D display and the older PG278Q as backup. The problem is the PG278Q (older model) has a vertical scanline issue in 3D and it's very noticeable. If my PG278QR dies, I'm back to scanlines. I'm debating whether I should get another PG278QR or the newer curved PG27VQ.[/quote]
If you look at Necropants' sig, you could get another PG278Q for surround:
[quote="Necropants"]3D Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)[/quote]
But perhaps the PG27VQ or another PG278QR might work. Typically 3D Vision Surround requires that all 3 monitors are identical.
That's if you are even interested in 3D Vision Surround.
But of course, if you find yourself loving 3D Vision Surround, you might end up needing to get an additional 3 backup monitors :P
We tested a few games a year or two back with a Titan X Pascal and a 1080 Ti, with one acting as a dedicated PhysX card, and there was not really any difference to be noted. We tried at least the Metro Last Light or 2033 benchmark and some other games with PhysX. The measurements weren't scientific or hardcore, but we felt it was really just a waste of time and effort, and by no means could I say it's worth anyone's time or money to invest in another card as a dedicated PhysX card.
CoreX9 custom watercooling (Volkswagen Polo radiator)
I7-8700k@4.7
TitanX pascal with shitty stock cooler
Win7/10
Video: Passive 3D full HD 3D @ 60Hz/channel, Denon X1200W / HC5 x 2, Geobox 501 -> eeColor boxes -> polarizers/Omega filters, custom-made silver screen
Occupation: Entrepreneur. Painting/surfacing/construction
Interests/skills:
3D gaming,3D movies, 3D printing,Drums, Bass and guitar.
Suomi - FINLAND - perkele
I don't think it's a great idea to drive 3D Surround at high resolutions. The effective resolution in stereo is massive. Do the math. You need an utter monster, and then it's iffy with the CPU bug at 1440p and up. Maybe with 2080 Ti SLI... don't rely on it for new games.
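To put rough numbers on "do the math": a back-of-the-envelope sketch of the raw pixel load, ignoring overdraw, AA, and CPU-side cost. The resolutions are just the ones discussed in this thread, and the helper function is purely illustrative:

```python
# Back-of-the-envelope pixel throughput: 3D Vision renders every frame twice
# (once per eye), so stereo surround multiplies the raw pixel load.

def pixels_per_second(width: int, height: int, fps: int, eyes: int = 1) -> int:
    """Pixels the GPU must shade per second (ignores overdraw, AA, etc.)."""
    return width * height * fps * eyes

# 3D Surround at 5760x1080, 60 fps per eye
surround_3d = pixels_per_second(5760, 1080, 60, eyes=2)

# A single 1440p monitor in 2D at 60 fps, for comparison
single_2d = pixels_per_second(2560, 1440, 60)

print(f"{surround_3d:,} px/s")          # 746,496,000 px/s
print(surround_3d / single_2d)          # ~3.4x the load of 2D 1440p
```

Even as a rough figure, that ~3.4x multiplier over 2D 1440p is why a card that's comfortable on a single screen can fall over in stereo surround.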
I'd wager it's the most punishing task you can put a gaming machine through.
I tend to only use 3D Surround on older games (Borderlands 2, etc.) at 1440p because of it.
If I wanted a dedicated 3D Surround machine I would have kept my 1080p screens; honestly, I don't think there's a setup that can reliably drive surround above a clean 1080p in 3D Vision.
Don't get me wrong, 3D Surround is still, imo, the ultimate way to experience 3D Vision. But if, like me, you practically require 60fps with high graphics settings, you should probably reconsider. It probably won't ever really be for me now in most scenarios, due to future cards being locked out. Helifax will swear by it; he also has the gear and the preference (also running 5760x1080).
When I do a completely fresh build I will revisit a lot of newer games, but I wouldn't bother without 2080 Ti SLI with good scaling.
As for PhysX, as you can see I chose to dump it a long time ago. It's very situational, and I didn't really notice a big difference. In Batman: Arkham Knight I still dedicate a card to it though, because that game is a mess.
3D Surround at 5760x1080 with a single 2080 Ti is very doable at a stable 60FPS.
In some games you might not run "Ultra" and get 60FPS, so you might need to dial the settings down a bit.
The CPU and the mobo chipset also count ;)
I will probably buy an extra 2080 Ti to SLI with the current one for the future.
My order of preference has always been: 3D Surround > 2D Surround > 3D Single Screen > 2D Single Screen.
I hardly ever play games in 3D on a single screen at home (I do it on my laptop when I'm on the road).
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
Yeah, I probably need a new CPU/motherboard at this point. But basically, yeah, if you have a lot of money to throw around, sure. As it stands, I always have to make too many graphical compromises at 1440p to be happy. 5760x1080 is likely more than fine with a good build. The OP is running 1440p though.
[quote="Metal-O-Holic"]We tested a few games a year or two back with a Titan X Pascal and a 1080 Ti... by no means could I say it's worth anyone's time or money to invest in another card as a dedicated PhysX card.[/quote]
That's what I found in most of the benchmarks too. Far Cry 5 in some benchmarks had lower fps with 2080 Ti SLI vs a single card.
[quote="D-Man11"]But of course if you find yourself loving 3D Vision Surround, you might end up needing to get an additional 3 backup monitors :P[/quote]
Lol, that would be a massive investment. When I got my PG278QR, I tested it and kinda tried dual monitors with the PG278Q; with 3D still activated across both monitors, frame rates were a slideshow.
Thanks all for the info. D-Man11, that was an interesting read on the PhysX results. The main reason I'm thinking of going SLI is not higher fps but having a second backup 2080 Ti. With no new driver updates and no new SLI profiles for newer games, it's pointless otherwise. For current games a single 2080 Ti is more than enough.
My current 1080 Ti is also a beast in itself and could work as a backup. By the time both cards die we might be well into second-gen high-FOV, high-resolution VR headsets as an alternative. All this money could be invested in a VR system instead. And maybe our community comes up with a way to get 3D working on newer GPUs. A lot of unknowns ATM.
I have a Vive Pro with the Wireless Adapter and it's absolutely the most immersive gaming. The SDE and pixel density are just enough to keep you immersed without killing the system (the original Vive was way too pixelated for me).
And what about the CPU? I think the only reason I'd need a new one is having more PCIe lanes. Or does it make a meaningful difference in 3D Vision gaming? Any ideas? The gaming benchmarks I've read mostly cover the 9900K's platform; I can find little on X299 platforms to get a clear picture.
There's plenty of stock for both PG278QR and PG27VQ monitors. And the 2080 Ti will be around for some time, as there's no news of a 3080 Ti in sight. So for now I think it's best to wait and see. Maybe we can get 2080 Tis at a deep discount once the new generation is released; maybe a PG27VQR is released and we can get the PG27VQ at a discount (the PG278QR was the same price before it but is now $200 lower); maybe before April 2020 Nvidia adds support for the 3080 Ti within the 418 driver branch (lol, wishful thinking); or maybe we find a hack to get the 3D drivers working on newer branches for newer GPUs.
But it's a good idea to have this discussion ahead of time and get some ideas and advice on what can be done to keep 3D gaming going for as long as possible.