[quote="BlueSkyDef"]I have seen some DLSS Images. Now I have some doubt about this feature.
I am starting to think It will not be compatible with 3D Vision or VR. Simply because the left and right images will not be trained to ignore left and right divergences. If it does work it will be an overall image with lower depth. Due to the AI removing Depth Cues that our brains are attuned to. It depends where this is done in the pipeline. All indication show it will be at the end of the line.[/quote]This is great, valid concern - although, I think NVIDIA’s current interest in VR technology may help it focus how DLSS handles multiple concurrent viewpoints. For some of the emerging wide FOV HMDs, they have suggested doubling viewpoints to 4.
GibsonRed - I am envious. My pre-order is not scheduled to arrive until November(?). For best overclocking (and lowest noise), the stock cooler really should be replaced with hybrid/water cooling (if your PC case has room).
Whyme466, I have used EK blocks on my 980 Ti and CPU.
I'm not convinced they are worth it after my last three builds were water cooled.
I probably will eventually, but this thing is boosting to between 1980 MHz and 2080 MHz on its air cooler.
This card is a monster, though. It's twice as fast as my water-cooled, overclocked 980 Ti (with the CPU overclocked and water cooled as well).
And that's with the 2080 Ti at standard clocks and air cooled, with an air cooler on the CPU at standard clocks.
Performance is great; it's just the price that is ridiculous.
Shadow of the Tomb Raider plays silky smooth at 4K max settings. I still don't like using AA, though. It makes the image look softer, blurrier and slightly washed out. I hope DLSS AA looks a lot better.
I think I was lucky getting mine so early, as not many people seem to have them yet.
GibsonRed - glad you received your card. Sounds like you are having NO issues with 3D Vision and 2080Ti drivers for Tomb Raider (or other games).
AA with line-interlaced displays can give mixed results, since AA only benefits the horizontal axis. The vertical axis does not benefit from AA - in fact, it really should be filtered with an axis-asymmetric bandwidth-limiting filter BEFORE the line-interlaced formatting (for the EDID display method) or TaB (3DMigoto display method). However, this asymmetrical filtering is never performed in practice for any of the lossy 3D data-compressing formats like SBS.
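To make the idea concrete, here is a minimal numpy sketch of that asymmetric pre-filter followed by row interleaving. It is only an illustration of the signal-processing idea - the 1-2-1 vertical kernel and the even/odd row assignment are my own assumptions, and no driver or 3DMigoto path actually inserts this step:
[code]
# Toy illustration: vertically band-limit each eye BEFORE throwing away every
# other row via line interleaving. Horizontal detail is left untouched.
# Assumptions (not how any real driver does it): a simple 1-2-1 vertical kernel,
# and even rows from the left eye / odd rows from the right eye.
import numpy as np

def vertical_lowpass(img, kernel=(0.25, 0.5, 0.25)):
    """Filter along the vertical axis only."""
    k = np.asarray(kernel, dtype=np.float32)
    padded = np.pad(img.astype(np.float32), ((1, 1), (0, 0), (0, 0)), mode="edge")
    return k[0] * padded[:-2] + k[1] * padded[1:-1] + k[2] * padded[2:]

def line_interleave(left, right):
    """Even rows from the left view, odd rows from the right view."""
    out = np.empty_like(left)
    out[0::2] = left[0::2]
    out[1::2] = right[1::2]
    return out

# left/right: H x W x 3 eye views (random data stands in for rendered frames)
left = np.random.rand(1080, 1920, 3).astype(np.float32)
right = np.random.rand(1080, 1920, 3).astype(np.float32)

interlaced = line_interleave(vertical_lowpass(left), vertical_lowpass(right))
[/code]
The point is simply that each eye ends up with half the vertical sample rate, so without the pre-filter the discarded rows alias instead of being averaged away.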
By the way, I have removed the stock cooler and installed an Arctic hybrid cooler on my last 3 GPUs (and used Arctic multi-fan air coolers on the 2 preceding GPUs). I have been very pleased with all the mods, especially the hybrid coolers. They always significantly reduced GPU noise, and the hybrid coolers lowered operating temperatures by at least 10 °C.
On the VR front, HTC/VIVE still has not shipped the wireless transmitter/receiver for my Vive Pro. It was formally “released” on Tuesday. I have asked them for status, but they are not committing to a shipping date beyond end of September (which is now)...
[quote="RAGEdemon"]NVLink benchmarks...
https://www.youtube.com/watch?v=84OkcOYmXOk[/quote]
I wish I could get excited over benchmark scores when, most of the time, developers don't even bother to support multi-GPU configurations. Until this changes (if it ever does), it remains a waste of time. *looks at sig.*
i7-4790K CPU 4.8Ghz stable overclock.
16 GB RAM Corsair
ASUS Turbo 2080TI
Samsung SSD 840Pro
ASUS Z97-WS3D
Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)
Obutto R3volution.
Windows 10 pro 64x (Windows 7 Dual boot)
I'm not convinced of the demise of SLi just yet; IMO, the predictions of its death have been greatly exaggerated.
I usually play games a long time after initial release, though, when they have been fully patched, the performance issues and bugs have been ironed out, and they are much cheaper. At that point, there is almost always a working SLi profile. The only game I played that didn't have SLi support with great scaling was Doom 2016.
Invariably, someone always brings up the "fact" that Unreal Engine doesn't support SLi - well, it does, and has done so for quite some time. Unity as well - mostly when data from the previous frame isn't required (i.e., no temporal AA, etc.).
Indeed, you have to configure SLi profiles and perform some testing, but to me personally this is no different from getting 3DV to work using hacks, or finding optimal graphics settings in games... I don't mind at all; in fact, I enjoy it to a degree. The unfortunate truth is that the vast majority of PC gamers don't even go into the game options to adjust their settings; they just expect things to work out of the box, including SLi... I find this absolutely bizarre.
Maybe others who have had bad luck with SLi play games that I don't play. I have played a LOT of games in my day, and if the choice is suffering <60fps or spending double to get ~65% more performance in maybe 65% of games, then I'd drop the money in a heartbeat - the value of a smooth, amazing 3DV experience of a great piece of art in the little free time that I have means far more to me than the portion of the money I work hard to earn, so I can spend some of my free time on my hobby.
I personally won't be buying this generation (my 1080 SLi can play most games in 3DV at my preferred resolution of 1600p just fine), unless maybe I can get a second-hand 2080Ti cheap a few months from now... It doesn't help when you know you are being taken advantage of by a greedy company; but in my humble opinion, one has to step back and look past the price tag, even if it is vomit-inducing, and focus on whether the value it will bring might outweigh that nauseous feeling :)
At the end of the day, every single person here (yes, even StarMan) would buy a 2080Ti SLi setup if it cost $1. The reason we don't is simply and only the price - and price is very subjective depending on whose lens one looks through, an upper-class millionaire or a starving Rwandan child - it's useless to argue about the price of anything in general; we all have our own value systems...
Perhaps an apt analogy: Isn't it strange how on the road, anyone going slower than us is a slow-mo, and everyone going faster than us is a maniac? :)
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
[quote="BlueSkyDef"]I have seen some DLSS Images. Now I have some doubt about this feature.
I am starting to think It will not be compatible with 3D Vision or VR. Simply because the left and right images will not be trained to ignore left and right divergences. If it does work it will be an overall image with lower depth. Due to the AI removing Depth Cues that our brains are attuned to. It depends where this is done in the pipeline. All indication show it will be at the end of the line.
I don't think my shader would be affected as long my shader right after DLSS. But, if it's after even my shader will suffer this problem.
Just trying to realistic when it comes to features and compatibility.
Yes, the cards right now are too expensive for today's tech. In turn also not powerful enough for tomorrow tech.
But, performance for the hear and now on existing titles it has that.[/quote]
Man, that would be a major disappointment if DLSS did not work well with 3D Vision; it's the feature I am most interested in as a 3D gamer. I always thought PC should have its own, better version of 4K checkerboarding - I really liked that temporal filtering feature Watch Dogs 2 had. It looked better than checkerboarding to me, and I ended up preferring it, with maxed-out settings, over true 4K with reduced graphical settings. I was annoyed Ubisoft never used the tech again; Wildlands and other titles really could have used it! (Rough sketch of the checkerboard idea below.)
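For anyone curious what checkerboard rendering amounts to, here is a toy numpy sketch. Real implementations reproject the previous frame with motion vectors and blend carefully; this is just the bare alternate-half-the-pixels-per-frame idea, with random data standing in for the renderer:
[code]
# Toy checkerboard reconstruction: each frame only "renders" the pixels on one
# half of a checker pattern, and the missing half is filled from the previous
# frame. No motion vectors / reprojection here - just the basic idea.
import numpy as np

def checker_mask(h, w, phase):
    yy, xx = np.mgrid[0:h, 0:w]
    return (yy + xx + phase) % 2 == 0

def reconstruct(rendered, mask, previous):
    """Keep this frame's rendered checker pixels, fill the holes from last frame."""
    out = previous.copy()
    out[mask] = rendered[mask]
    return out

h, w = 540, 960
history = np.zeros((h, w, 3), dtype=np.float32)
for frame in range(4):
    mask = checker_mask(h, w, phase=frame % 2)              # alternate the checker each frame
    rendered = np.random.rand(h, w, 3).astype(np.float32)   # stand-in for the half-cost render
    history = reconstruct(rendered, mask, history)
[/code]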
I was also disappointed to see that DLSS requires the game to have TAA in order to work; that rules out a couple of the games I was most interested in using it with. So even if it works fine with 3D Vision, I worry it won't be implemented in as many titles as one would hope. We'll see after this initial push...
Unfortunately, I just had to cancel my 2080 Ti preorder, which was going to ship later this week. I discovered that Arctic, after years of product evolution to create a great, reliable hybrid cooler, has DISCONTINUED its Accelero Hybrid cooler, and I cannot find stock anywhere. I am now planning to wait for EVGA to release their hybrid version of the 2080 Ti...
I wish I could take that preorder, hehe. So I was reading that ray tracing will only be on DX12 - is that true? If so, I guess we won't be seeing real-time ray tracing in 3D anytime soon. Well, actually, I guess SOTTR would be the only place you could see it once their RTX patch comes out. I can only hope Red Dead 2 has built-in 3D like GTA V, and maybe that title will have some RTX features; I just hope it doesn't take 2 years to get to PC.
Does DLSS only work on DX12? I hope not. I don't see why the tensor cores can't do their thing on DX11, but now I am worried all the new bells and whistles will be unavailable for us 3D users. I'm sure I would have turned ray tracing off for better performance in 3D anyway, but I'd like to have the option. The RTX lighting demo in Metro Exodus was impressive... hopefully they can optimize it so it doesn't kill FPS.
There is a DLSS benchmark of FFXV. Not sure if it's publicly available. Does anyone know what API it uses? They also plan to add some ray tracing features, IIRC. Whatever happens, I hope they don't scrap the DX11 renderer.
It appears that this generation of GPUs (at least 2080 Ti) may have some type of power limit that makes water block and hybrid cooling less important, beyond noise reduction. See [url]https://m.youtube.com/watch?v=ORJfhlTAgfU[/url].
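If anyone wants to check whether the power limit (rather than temperature) is what is capping boost on their own card, nvidia-smi can report both. A quick sketch, assuming nvidia-smi is on the PATH (the query field names can vary slightly between driver versions):
[code]
# Report power draw vs. the enforced power limit, plus temperature and the
# current graphics clock, for each GPU. Assumes nvidia-smi is installed and on
# the PATH; field names are those listed by nvidia-smi --help-query-gpu.
import subprocess

FIELDS = "power.draw,enforced.power.limit,temperature.gpu,clocks.gr"

out = subprocess.run(
    ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout.strip()

for index, line in enumerate(out.splitlines()):
    draw, limit, temp, clock = (v.strip() for v in line.split(","))
    print(f"GPU {index}: drawing {draw} W against a {limit} W enforced power limit, "
          f"{temp} C, {clock} MHz graphics clock")
[/code]
If the card sits pinned at the enforced power limit while well below its temperature target, better cooling mainly buys quieter fans rather than higher clocks.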
Gents, so it looks like Titan RTX is incoming...
[img]https://images.anandtech.com/doci/13668/Titan_RTX_Car2_678x452.jpg[/img]
https://www.anandtech.com/show/13668/nvidia-unveils-rtx-titan-2500-top-turing
Slightly faster than a 2080Ti, but with over double the VRAM. Likely identical gaming performance, but a huge benefit for professional applications, apparently at a reduced cost compared to equivalent purely professional cards...
I would just like to humbly point out that this is on a 12nm process. All manufacturers - AMD, NVIDIA, Intel - will be moving to the equivalent of a 7nm process in 2019. In my humble opinion, that would be a respectable enough increase in performance and reduction in price to warrant waiting until then - unless one has deep pockets :)
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
I did say from the start that it was going to be out for Christmas.
Though the price seems to be another joke once again, one that I did not expect.
Luckily, I have other things to do than spend my cash on too-expensive GPUs.
CoreX9 Custom watercooling (Volkswagen Polo radiator)
I7-8700k@4.7
TitanX pascal with shitty stock cooler
Win7/10
Video: Passive 3D full HD 3D @ 60Hz/channel, Denon X1200W / HC5 x 2, Geobox 501 -> eeColor boxes -> polarizers/Omega filters, custom-made silver screen
Occupation: Entrepreneur. Painting/surfacing/construction
Interests/skills:
3D gaming,3D movies, 3D printing,Drums, Bass and guitar.
Suomi - FINLAND - perkele
[quote="RAGEdemon"]Gents, so it looks like Titan RTX is incoming...
[img]https://images.anandtech.com/doci/13668/Titan_RTX_Car2_678x452.jpg[/img]
https://www.anandtech.com/show/13668/nvidia-unveils-rtx-titan-2500-top-turing
Slightly faster than a 2080Ti, but with over double the VRAM. Likely identical gaming performance, but a huge benefit for professional applications, apparently at a reduced cost compared to equivalent purely professional cards...
I would just like to humbly point out that this is on a 12nm process. All manufacturers - AMD, NVIDIA, Intel - will be moving to the equivalent of a 7nm process in 2019. In my humble opinion, that would be a respectable enough increase in performance and reduction in price to warrant waiting until then - unless one has deep pockets :)[/quote]
The actual 80 Ti card, right on cue as I predicted:
https://forums.geforce.com/default/topic/1084843/the-geforce-lounge/the-rise-and-fall-of-bugattigreedia/
I'm Mooncheese over here:
[url]https://www.overclock.net/forum/225-hardware-news/1714994-nvidia-com-titan-rtx-announced-11.html#post27744810[/url]
i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703
Are you getting the Founders Edition? If so, part of the cooler is glued; do a Google search for removing the stock cooler.
Gigabyte Z370 Gaming 7 | 32GB RAM | i9-9900K | Gigabyte Aorus Extreme Gaming 2080TI (single) | Game Blaster Z | Windows 10 x64 build #17763.195 | Define R6 Blackout Case | Corsair H110i GTX | SanDisk 1TB (OS) | SanDisk 2TB SSD (Games) | Seagate EXOs 8 and 12 TB drives | Samsung UN46c7000 HD TV | Samsung UN55HU9000 UHD TV | Currently using ACER PASSIVE EDID override on 3D TVs | LG 55
https://www.3dmark.com/compare/fs/14520125/fs/11807761#
CPU: Intel Core i7 3770K @ 3.50GHz
MB: Asus P8Z77-V DELUXE
RAM: 32.0GB Dual-Channel DDR3 @ 799MHz (10-10-10-27)
VGA: Asus Strix GTX 1070 2x SLI
DISPLAY: Asus ROG PG278QR
OS: Windows 10 Home 64-bit