Core X9, custom watercooling (Volkswagen Polo radiator)
i7-8700K @ 4.7GHz
Titan X Pascal with shitty stock cooler
Win7/10
Video: Passive 3D, Full HD 3D @ 60Hz/channel, Denon X1200W / HC5 x 2, GeoBox 501 -> eeColor boxes -> polarizers/Omega filters, custom made silver screen
Occupation: Entrepreneur. Painting/surfacing/construction
Interests/skills:
3D gaming, 3D movies, 3D printing, drums, bass and guitar.
Suomi - FINLAND - perkele
They don't really moderate this forum unless called upon, because we all act like grown-ups and nothing ever moderation-worthy really happens here... or so I thought. I stand corrected ;-)
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
Wow. Looks like Starman is having a full on breakdown on this thread!
What a mentalist. He clearly has schizophrenia!
Imagine the shame in being recognised as another forum user purely by your own vileness! :D
Ahhhh Mr Starman, your reputation precedes you.
So according to him if we buy the new Turing gfx cards, we are the cancer of the gaming community and if you buy two of the old gen it’s even worse?
I said I’d sell the card on eBay as a flippant, off-the-cuff remark. You went full retard. You’re basically condoning how business works in a commercialised world.
He sticks to pointing that out rather than addressing my other points as he has no counter argument.
How can buying new gfx cards make people ‘part of the problem’ in the gaming community? What a true fool.
Tell you what starman, I’ll sell you my old, ‘hand me down’ gfx cards when I’m done with them if you can’t afford the new ones yourself.
You’re a vile, spoilt, trolling bitch of an inbred fuckwit who can’t see past his own delusions!
Everything you’ve posted basically translates as ‘I can’t afford one, so I’ll put down everyone else who does, when they justify their reasons I’ll just make ad hominem attacks against their character and focus in on details but miss the point entirely’
It truly is quite comical!
Wake up or at least stick to your word and don’t post again. Do everyone a favour and GO AWAY!
Nobody wants you here, literally no one. Can’t you see that?
If everyone is against you, maybe you should have the intelligence to take a step back and consider you might, actually, be in the wrong!
I think at some point I will do some tests on FPS in 3D for SLI versus non-SLI. It's been a while since anyone did it and I do want to get some new data to help me decide on what my upgrade path should be.
DX12 is obviously an anomaly at this point, especially as we currently have no way of fixing DX12 games. That's why I'm hoping that going forwards, mgpu will be more closely integrated into the big game engines.
I also expect that unless everyone gets completely fed up, we will find a DX12 solution at some point as well.
A big negative was when UE4 shipped without any SLI support, as it's widely used for VR and other modern titles, but of course SLI support was added to later versions of UE4.
In my opinion, the extra demands of VR make it more likely that a parallel rendering approach is going to be needed now more than ever, even when foveated rendering and eye tracking eventually take off.
That's why although the jury is still out, I'm optimistic that mgpu won't go away. The fact that Nvidia created NVLink and AMD are talking about PCIe4-based multi-card rendering makes me think that although SLI support is sketchy, the concept is here to stay for the long term.
I'm aware that the driver for these technologies was not gaming, but I think there is a need/demand in the consumer market too. In some ways it's analogous to the way that CPUs are slowly moving to more cores rather than IPC.
I just noticed that vulcan/starman has a link to some firestrike results in his signature. He was getting 30k with a 1080ti, I'm getting 40k with my 1070s in SLI: https://www.3dmark.com/3dm/13657632
At the time when I purchased them, I was aware of the way that 3D Vision scales really well with SLI, hence the reason I bought two cards. It was a gamble, but for the same money it would outperform (most of the time) what was the single fastest card at that point. Actually, I don't think the 1080 Ti was even available at that point.
I'm not sure the same thing would be true now, but then as others have pointed out, if you actually already have the fastest single card, where do you go from there? You buy another and accept that the law of diminishing returns applies (rough numbers sketched below).
Anyway, I'm not a soothsayer so let's see what happens with the next round of cards.
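Just to put some numbers on the diminishing-returns point (the baseline frame rate and the scaling efficiencies below are made-up figures for illustration, not benchmarks), a second card only helps in proportion to how well a given title scales:

[code]
#include <cstdio>

// Illustrative only: the baseline FPS and the per-title scaling efficiencies
// are assumed numbers, not measurements. The point is simply that a second
// card never doubles the frame rate, and poorly scaling titles gain little.
int main() {
    const double singleCardFps = 60.0;                    // hypothetical single-GPU result
    const double efficiencies[] = {0.9, 0.7, 0.5, 0.0};   // 1.0 would be perfect 2x scaling
    for (double e : efficiencies) {
        double sliFps = singleCardFps * (1.0 + e);
        std::printf("scaling %3.0f%% -> %5.1f FPS (x%.2f)\n",
                    e * 100.0, sliFps, sliFps / singleCardFps);
    }
    return 0;
}
[/code]

Even a well-scaling title falls short of 2x, which is why the value of a second card depends so heavily on what you paid for it.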
Gigabyte RTX2080TI Gaming OC, i7-6700K ~ 4.4GHz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D Vision kit, Xpand X104 glasses, Corsair HX1000i, Win 10 Pro 64 / Win 7 64, https://www.3dmark.com/fs/9529310
Well Vulkan launched without Multi-GPU support, but that has changed with Strange Brigade.
AMD has implemented supporting drivers, no idea about Nvidia.
https://www.overclock3d.net/news/software/strange_brigade_is_the_first_game_to_support_the_vulkan_api_with_multi-gpu_systems/1
Keep in mind, Nvidia will not have NVLink available on any of the RTX 2070 GPUs.
2080 reviews have apparently also been delayed to the 19th...
[url]https://videocardz.com/newz/nvidia-changes-geforce-rtx-2080-reviews-date-to-september-19th[/url]
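On the Vulkan multi-GPU point above: explicit multi-GPU in Vulkan (the route Strange Brigade takes, rather than a driver-side SLI profile) goes through device groups, which the engine has to enumerate and manage itself. A minimal sketch, assuming a Vulkan 1.1 instance has already been created (illustration only, not taken from any particular engine):

[code]
#include <vulkan/vulkan.h>
#include <vector>
#include <cstdio>

// Sketch: list the physical device groups the driver exposes. A group with
// physicalDeviceCount > 1 is what an explicit multi-GPU renderer would
// create a single logical device across.
void listDeviceGroups(VkInstance instance) {
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);

    std::vector<VkPhysicalDeviceGroupProperties> groups(groupCount);
    for (auto& g : groups) {
        g.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
        g.pNext = nullptr;
    }
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (uint32_t i = 0; i < groupCount; ++i) {
        std::printf("group %u: %u GPU(s), subsetAllocation=%u\n",
                    i, groups[i].physicalDeviceCount,
                    (unsigned)groups[i].subsetAllocation);
    }
}
[/code]

Whether the driver exposes two cards as one group depends on the driver and how the cards are linked, so this alone says nothing about game support.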
The biggest wait for a generation, the biggest wait from reveal to release, the biggest wait for reviews, the biggest price increase per tier... I wish it could be the biggest performance increase too :p.
I kind of get his majesty starman's point of view about people buying new cards: if people buy these high-priced cards without any real-world data, it's only sending one message. Yes, we are ready to pay ridiculous sums, or even more, for new GPUs. But starman was way out of line with his personal, abusive posts. I would not get offended, but some might, and I think most of us don't want that kind of language here.
In the end it's our personal choice to buy or not to buy something, and we all have a right to our opinions, and by being wrong and talking about things we learn best.
You reap what you plant.
I preordered the RTX 2080 and then, when I saw how pre-owned GTX 1080 Ti prices were going down, I cancelled my preorder and will probably get a GTX 1080 Ti that is still in warranty.
Intel i7 8086K
Gigabyte GTX 1080Ti Aorus Extreme
DDR4 2x8gb 3200mhz Cl14
TV LG OLED65E6V
Avegant Glyph
Windows 10 64bits
Someone posted slides from Jensen's GTC presentation, comparing relative performance of 980/Ti, 1080/Ti, and 2080/Ti with NO RTX and NO AI DLSS:
[img]https://i.imgur.com/wptnvaG.jpg[/img]
I was literally just reading this.
Good article here...
https://www.overclock3d.net/news/gpu_displays/nvidia_reveals_rtx_2080_and_rtx_2080_ti_performance_data_at_gtc_japan_2018/1
The 1070 was slightly faster than the 980 Ti, so it looks like the 2070 could be faster than the 1080 Ti after all.
They’ve also announced a lot of new games with ray tracing and DLSS support too.
Maybe they are reading the forums.
Shame nvidia can’t make a fucking graph properly. I’d love nvidia to pass that graph, with no units, to my old maths teacher.
So these cards are going to be beasts then. Glad I preordered now!
Personally I'd be surprised (in a pleasant way) if the 2070 ends up being faster than the 1080TI in current games.
The graphs are deliberately vague, but you can read into them what you want. It seems consistent that the 2080 is roughly a 1080 Ti +5-10% and the 2080 Ti is roughly +40-50%... That's like for like, without the extra features.
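Purely as an illustration of what those eyeballed percentages would mean in practice (the baseline frame rate below is an arbitrary assumption, not a measurement):

[code]
#include <cstdio>

// Illustration of the rough ranges quoted above, nothing more. The 1080 Ti
// baseline FPS is a made-up number; the multipliers are the eyeballed
// "+5-10%" (2080) and "+40-50%" (2080 Ti).
int main() {
    const double gtx1080TiFps = 60.0;  // hypothetical baseline
    std::printf("RTX 2080    : %.0f - %.0f FPS\n",
                gtx1080TiFps * 1.05, gtx1080TiFps * 1.10);
    std::printf("RTX 2080 Ti : %.0f - %.0f FPS\n",
                gtx1080TiFps * 1.40, gtx1080TiFps * 1.50);
    return 0;
}
[/code]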
Obviously DLSS looks like a potential performance changer if supported.
Ray tracing is an attempt to put the other cores to good use and will eventually, in future cards, lead to a step change in rendering quality, but I expect it to hurt performance significantly, especially on the lower tier cards, depending on how it's implemented. They might have sliders like they did with PhysX, i.e. low/med/high.
All that said, we still don't actually know until the benchmarks come out, then we can make an informed decision. I still might try and get a couple of 1080TIs, but we'll see, there are many (many) factors at play here.
I'm disappointed I missed the 2080TI preorder, but only because the plan was to cancel if the reviews were underwhelming and now I don't have that option.
I have no doubt they are better in every conceivable way, but the price tags are hard to swallow, we'll see :-)
I'm not too bothered about the arguments around preordering and what constitutes a mainstream GPU compared to previous GPUs. The questions I ask myself are:
1. How big is the market for $1000 GPUs anyway? Those that get in upfront either don't care about the cost or are worried about history repeating itself and crypto mining driving prices up. Once the initial surge dries up, Nvidia will have to adjust prices if demand drops.
2. Who cares what they actually call the cards? To me, whether it's mainstream or high end is purely derived from the price/performance, not the naming convention/nomenclature.
I agree with all your points.
There was an interview with DICE employees, at the after-party of the RTX launch ceremony, where they said that they had the Turing cards for two weeks before they showed the presentation.
They were previously doing ray tracing on 4x Voltas, which don’t have hardware ray tracing capabilities.
The Battlefield ray tracing presentation wasn’t even using the tensor cores to do the denoising; that was being done on the main part of the GPU, which will greatly affect performance.
They also said, and this is the most important bit, that raytracing will have lots of options, including being able to set a different resolution for ray tracing than the rest of the game.
I.e. you could game in 4K whilst having ray tracing run at 1080p (see the rough numbers below).
The proof’s in the pudding, but it sounds promising.
For this much money it better ‘just work’, as Jensen so eloquently put it. Just work indeed.
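A quick back-of-the-envelope on why a separate ray tracing resolution matters so much (the rays-per-pixel figure is purely illustrative, not something DICE or Nvidia quoted):

[code]
#include <cstdio>

// Back-of-the-envelope: shading at 4K while tracing rays at 1080p cuts the
// number of ray-traced pixels per frame by 4x. The rays-per-pixel value is
// an arbitrary illustrative number.
int main() {
    const long long raster4K = 3840LL * 2160LL;   // shaded pixels per frame
    const long long rt1080p  = 1920LL * 1080LL;   // ray-traced pixels per frame
    const int raysPerPixel   = 2;                 // hypothetical

    std::printf("RT work at 4K    : %lld rays/frame\n", raster4K * raysPerPixel);
    std::printf("RT work at 1080p : %lld rays/frame\n", rt1080p * raysPerPixel);
    std::printf("reduction        : %.1fx\n", double(raster4K) / double(rt1080p));
    return 0;
}
[/code]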
That's the key. If it's very customizable it will be a lot better than just "turn on -> lose 75% of your fps". I hope there are some decent quality presets that don't hurt performance much.
Now we have to wait and see if ray tracing doesn't come to DX11. If it's only DX12... we'll need a new wrapper.
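For what it’s worth, DXR is exposed through D3D12 only, and an application checks for it roughly like this (a minimal sketch assuming you already have an ID3D12Device; it says nothing about how a wrapper would hook into it):

[code]
#include <windows.h>
#include <d3d12.h>

// Minimal sketch: ask an existing D3D12 device whether ray tracing is exposed
// at all. DXR lives in D3D12 only, which is why a DX11 title can't simply
// flip it on.
bool SupportsRaytracing(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5)))) {
        return false;
    }
    return options5.RaytracingTier != D3D12_RAYTRACING_TIER_NOT_SUPPORTED;
}
[/code]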
Gigabyte Z370 Gaming 7, 32GB RAM, i9-9900K, Gigabyte Aorus Extreme Gaming 2080TI (single), Game Blaster Z, Windows 10 x64 build #17763.195, Define R6 Blackout case, Corsair H110i GTX, SanDisk 1TB (OS), SanDisk 2TB SSD (games), Seagate Exos 8 and 12 TB drives, Samsung UN46C7000 HD TV, Samsung UN55HU9000 UHD TV. Currently using ACER PASSIVE EDID override on 3D TVs. LG 55
CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: MSI GeForce RTX 2080Ti Gaming X Trio
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com