RTX 2080 incoming...
Man, stop talking about gimmicks. It makes me mad as fuck when people use that word. It's a word that magicians use.
If you design a product that does a certain thing, it's not a fucking gimmick, it just works that way.
Ray tracing is not a gimmick, it's a way to handle lighting. If it were a gimmick it wouldn't actually work, it would just make you think it's working.
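The point that ray tracing actually computes lighting rather than faking it can be shown with a toy example. This is a hypothetical pure-Python sketch of the core idea (one ray, one sphere, Lambert shading), not anything from NVIDIA's actual RTX pipeline:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return distance along a unit-direction ray to the sphere, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c  # direction is unit length, so the quadratic's a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(hit_point, center, light_dir):
    """Lambert (diffuse) term: brightness follows the surface normal, no faking."""
    normal = [p - c for p, c in zip(hit_point, center)]
    n_len = math.sqrt(sum(n * n for n in normal))
    normal = [n / n_len for n in normal]
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

# One ray straight down the z axis at a unit sphere 5 units away.
t = ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
hit = (0, 0, t)
brightness = shade(hit, (0, 0, 5), (0, 0, -1))
```

A real renderer fires one such ray per pixel (plus bounces), which is exactly why it needs dedicated hardware to run in real time.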

I'm fucking furious, please stop... LOL

GibsonRed, I understand you. I would like to believe in Sony but I just don't have the balls... but you are right; from what I've read
and seen, Sony is sharp. I hope you get a long life out of it!

CoreX9 Custom watercooling (Volkswagen Polo radiator)
i7-8700K@4.7
Titan X Pascal with shitty stock cooler
Win7/10
Video: Passive 3D full HD 3D@60Hz/channel, Denon X1200W / HC5 x 2, GeoBox 501 -> eeColor boxes -> polarizers/Omega filters, custom-made silver screen
Occupation: Entrepreneur. Painting/surfacing/construction
Interests/skills:
3D gaming,3D movies, 3D printing,Drums, Bass and guitar.
Suomi - FINLAND - perkele

Posted 09/20/2018 04:32 AM   
I agree that the price of the 2080 Ti is too high for the benefit we will get out of it, which is around 10 fps in 3D.
My two 1080 Tis are probably worth a little more than one 2080 Ti. It would be total imbecility for me to sell my cards for one 2080 Ti and get lower fps in some games (the ones which support SLI).
I'll stay with my cards and play SLI games in 4K; games which don't support SLI I'll play in 1440p at 50-60 fps.
If I had one 1080 Ti I would probably keep it and play in 1440p. The price-benefit calculation shows it's just not worth it. Jumping from a 980 Ti is totally worth it.
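The price-benefit reasoning above boils down to dollars per extra frame. A quick sketch (the $1200 and 10 fps figures come from this post; the 40 fps gain for a 980 Ti jump is a made-up illustrative number):

```python
def dollars_per_extra_fps(price, fps_gain):
    """Cost of each additional frame per second an upgrade buys."""
    if fps_gain <= 0:
        raise ValueError("upgrade gains no frames")
    return price / fps_gain

# ~$1200 for a 2080 Ti, ~10 extra fps in 3D over a 1080 Ti (per the post)
upgrade_cost = dollars_per_extra_fps(1200, 10)   # 120.0 dollars per fps
# Jumping from a 980 Ti, where the gain is far larger (illustrative 40 fps)
older_jump = dollars_per_extra_fps(1200, 40)     # 30.0 dollars per fps
```

On those numbers the 980 Ti owner pays a quarter as much per frame gained, which is why the same card can be "worth it" for one upgrader and not another.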

Posted 09/20/2018 07:07 AM   
I play in 3D@1080p, so for me there is no benefit to upgrading. And remember that the ray tracing supported by the new cards is inverse ray tracing; games will also need to support it.

This is the first upgrade I'm passing on; since my NV1 video card I've always had the latest.

Gigabyte Z370 Gaming 7 32GB Ram i9-9900K GigaByte Aorus Extreme Gaming 2080TI (single) Game Blaster Z Windows 10 X64 build #17763.195 Define R6 Blackout Case Corsair H110i GTX Sandisk 1TB (OS) SanDisk 2TB SSD (Games) Seagate EXOs 8 and 12 TB drives Samsung UN46c7000 HD TV Samsung UN55HU9000 UHD TV. Currently using ACER PASSIVE EDID override on 3D TVs. LG 55

Posted 09/20/2018 08:05 AM   
The 2080 should in fact be a 2070, and the price should not be more than €300. Nvidia does not know what else to do to take people's money. It doesn't cause them the slightest embarrassment to wildly speculate and deceive.

- Windows 7 64bits (SSD OCZ-Vertez2 128Gb)
- "ASUS P6X58D-E" motherboard
- "Gigabyte RTX 2080 Gaming OC"
- "Intel Xeon X5670" @4000MHz CPU (20.0[12-25]x200MHz)
- RAM 16 Gb DDR3 1600
- "Dell S2716DG" monitor (2560x1440 @144Hz)
- "Corsair Carbide 600C" case
- Labrador dog (cinnamon edition)

Posted 09/20/2018 02:09 PM   
"Gimmick: a trick or device intended to attract attention, publicity, or business."

I think this describes very well an aspect of how Nvidia conducts business, with "technologies" that never get real traction among developers or the public, in part due to Nvidia itself moving on to a new gimmick every generation of cards and not giving proper support and attention after the initial hype fades. The device or trick can be fully working; what makes it a gimmick is that it is primarily designed for publicity and attention.

Answer this: what percentage of games support SLI, or tessellation, in 2018? Those are just two examples of fully working but GIMMICK technologies that had much publicity yet little impact in the long run. In fact, we can see a parallel between ray tracing and PhysX. NVIDIA bought Ageia to get an edge on ATI, and PhysX, a fully working technology, was supposed to be the future of gaming; it pushed a lot of video card sales. But how many games actually use it? This ray tracing looks like déjà vu to me.

I wish it gets mass adoption, just like I wish every game had tessellation, HairWorks, 3D Vision Ready status, SLI support, PhysX, etc. etc...

And by the way, ATI does the same thing. The reason that I have remained an NVIDIA customer is because out of the two, to me it's the better choice.

CPU: Intel Core i7 3770K @ 3.50GHz
MB: Asus P8Z77-V DELUXE
RAM: 32.0GB Dual-Channel DDR3 @ 799MHz (10-10-10-27)
VGA: Asus Strix GTX 1070 2x SLI
DISPLAY: Asus ROG PG278QR
OS: Windows 10 Home 64-bit

Posted 09/20/2018 02:34 PM   
If money is no object I can see the point of getting a 2080 Ti, especially for 4K gaming, but the 2080 looks dubious at best. I wonder what the benchmarks for the 2080 using ray tracing will be? One tech site on YouTube claims they had a developer lined up to show a beta of ray tracing, and it was pulled at short notice due to pressure from outside forces... now why would that be!

Posted 09/20/2018 03:45 PM   
PhysX is actually used in a lot of game engines, but these days it is utilised on the CPU, now that CPUs are much faster.
There was an Nvidia engineer having a rant about it a few months ago.

PhysX is on consoles, phones, game engines, you name it.


https://techreport.com/news/27910/nvidia-physx-joins-the-free-source-party

Looks like people like to hate without knowing all the facts.

Also, how can ray tracing be a gimmick when it's used in CGI?
It's the best way of doing things, and in the long run it'll be easier for developers to implement, rather than taking longer like most other Nvidia tech.

DLSS is free for developers. All they have to do is send high-resolution images of their games to the Nvidia supercomputer, and the rest is done by Nvidia for free.

People moaning about the cards forget that performance will increase over time with driver updates, just like it did with previous generations of cards.

I personally can’t wait to try out ray tracing. Those battlefield puddles and windows sure look good!

Posted 09/20/2018 03:51 PM   
PhysX is used in some games, just like some games do use tessellation, and some other games have SLI profiles out of the box. I never said that ZERO games used it; I said MOST DO NOT: the percentage of adoption is low. And that statement remains true. Stating facts is not "hating".

I don't hate NVIDIA; I buy their products when they offer value. This generation, at least so far, is falling short, and if you think ray tracing is the next best thing you'll see in future games, then go ahead and support it with your wallet. I see a vicious circle here.

If you're willing to pay such a premium, $1500 for reflections in 2 or 3 games, then it's great that you can afford it. I can afford it too, but it's not worth it yet. This is the reason I usually upgrade every other generation. I remember paying a premium back in the days of the Athlon 64, so it would be "future ready". Except that once 64-bit OSes matured and actually became standard, my "future ready" processor could not run them decently, as it was obsolete. The premium was wasted: when I bought it, no software really supported it, and when the software caught up, the processor was incapable even compared to cheaper options of that generation...

Live and learn.

CPU: Intel Core i7 3770K @ 3.50GHz
MB: Asus P8Z77-V DELUXE
RAM: 32.0GB Dual-Channel DDR3 @ 799MHz (10-10-10-27)
VGA: Asus Strix GTX 1070 2x SLI
DISPLAY: Asus ROG PG278QR
OS: Windows 10 Home 64-bit

Posted 09/20/2018 03:59 PM   
The ray tracing is inverse; the new 2080 cards couldn't even render the Juggler demo from the Amiga, which is 30 years old, in real time. Also, my 1080 Ti is faster than the 2080 in almost every benchmark. This is the reason why the NDA was up only a day before the video card was released.

Gigabyte Z370 Gaming 7 32GB Ram i9-9900K GigaByte Aorus Extreme Gaming 2080TI (single) Game Blaster Z Windows 10 X64 build #17763.195 Define R6 Blackout Case Corsair H110i GTX Sandisk 1TB (OS) SanDisk 2TB SSD (Games) Seagate EXOs 8 and 12 TB drives Samsung UN46c7000 HD TV Samsung UN55HU9000 UHD TV. Currently using ACER PASSIVE EDID override on 3D TVs. LG 55

Posted 09/20/2018 04:30 PM   
https://www.youtube.com/watch?v=jyRnPgZO09k?t=27m6s

Misleading Results at 27:06 and the Conclusion at 29:00 are where the meat is; timestamps are in the video description.

zig11727 said:I play in 3D@1080P so for me there is no benefit to upgrade and remember that the ray tracing support by the new cards are inverse ray-tracing also games will need to support it.

This is first upgrade I'm passing on since my NV1 video card always have the latest.


This is also the first time I'm not upgrading. I've gone from 580M SLI to 680M SLI in 2011-2012 (M18xR1, M18xR2), to a single 780 Ti in 2013, to 780 Ti SLI in 2014, to a single 980 Ti in 2015, to a single 1080 Ti in 2017. I was so looking forward to a nice 50% bump; I would have even accepted a $900 price tag given the situation, but 25-30% for $1200? You're straight smoking crack, NGreedia.

No more BugattiGreedia for me.

Don't buy this shit, people; if enough rich morons buy them then, as Steve concludes in the video above, "we're doomed".

How people can get punched in the face and say "Thank you sir, may I have another?!" is completely beyond me.

Send a strong message to BugattiGreedia: if this trend continues, it will be the end of PC gaming.

This isn't hyperbole.

If this pricing trend continues 90% of PC gaming enthusiasts will be forced to turn to console gaming.

I'm not going back to 30 FPS.

Vote with your wallets people.

And I don't want to hear shit about BugattiGreedia needing the money, they are sitting on record profits with the mining boom:

https://www.extremetech.com/gaming/269135-nvidia-profits-skyrocket-to-record-highs

You're NOT investing in innovation by buying this shit. Ray tracing is basically the new GameWorks; it's NOT going to be widely implemented. This is a marketing gimmick, a truly slick marketing gimmick at that, but in the end, just a marketing gimmick.

You ARE signalling to BugattiGreedia that their new core demographic is the .01% economic elite.

If you're a PC gamer and are not the .01% economic elite that gets in line every year for a new $1k iPhone just because they can, THEN DO NOT BUY THIS SHIT.

If you care about the future of PC gaming, DO NOT BUY THIS SHIT.

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 09/20/2018 04:32 PM   
https://www.youtube.com/watch?v=WDrpsv0QIR0

CPU: Intel Core i7 3770K @ 3.50GHz
MB: Asus P8Z77-V DELUXE
RAM: 32.0GB Dual-Channel DDR3 @ 799MHz (10-10-10-27)
VGA: Asus Strix GTX 1070 2x SLI
DISPLAY: Asus ROG PG278QR
OS: Windows 10 Home 64-bit

Posted 09/20/2018 07:12 PM   
AdoredTV 2080 and 2080 Ti Review Roundup and Analysis
https://www.youtube.com/watch?v=cp9KApvrNVw

Was RTX Rushed? (I've come to the exact same conclusion as Joker in regards to why RT was withheld from the SOTTR launch)
https://www.youtube.com/watch?v=G40nmaoqK5M&t=0s

Good Old Gamer Review
https://www.youtube.com/watch?v=zklCyX2OxXg&t=0s

UFD Tech "Don't Buy Them!"
https://www.youtube.com/watch?v=FDnIoWb44ds&t=0s

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 09/20/2018 10:35 PM   
Nvidia shares fall after Morgan Stanley says the performance of its new gaming card is disappointing

Morgan Stanley says the gaming performance of Nvidia’s latest graphics card is below its expectations.
"As review embargos broke for the new gaming products, performance improvements in older games is not the leap we had initially hoped for," analyst Joseph Moore says.


https://www.cnbc.com/2018/09/20/nvidia-falls-after-morgan-stanley-calls-new-gaming-card-disappointing.html

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 09/21/2018 03:40 PM   
Starman, give it a rest already, LOL.

CoreX9 Custom watercooling (Volkswagen Polo radiator)
i7-8700K@4.7
Titan X Pascal with shitty stock cooler
Win7/10
Video: Passive 3D full HD 3D@60Hz/channel, Denon X1200W / HC5 x 2, GeoBox 501 -> eeColor boxes -> polarizers/Omega filters, custom-made silver screen
Occupation: Entrepreneur. Painting/surfacing/construction
Interests/skills:
3D gaming,3D movies, 3D printing,Drums, Bass and guitar.
Suomi - FINLAND - perkele

Posted 09/21/2018 03:59 PM   
One more, as a reply to Good Old Gamer's AMD prognostications:

https://www.youtube.com/watch?v=SpIlySe59Rc

"I don't think that pointing to the 4K performance delta between 80 Ti models against price is helpful considering, as you state later in this video, that the vast majority of PC gaming enthusiasts, myself included (3440x1440 AW3418DW and 2560x1440 PG278Q for 3D Vision), are at 1080p (60%) or 2560x1440 (4%), versus 1% at 3840x2160, according to the Steam survey.

If you look at the performance delta at 1440p, the 980 Ti is 50% faster than the 780 Ti for the same price, and the 1080 Ti is 60% faster than the 980 Ti for the same price. Hell, even the GTX 1080 is some 30% faster than the 980 Ti!

In fact, and I've pointed this out elsewhere before viewing AdoredTV's latest post-launch RTX review summary, in regards to BugattiGreedia's renaming/rebadging shenanigans: the RTX 2080 is actually the Turing 70 card, considering it's 0-5% faster than the GTX 1080 Ti, and the RTX 2080 Ti is actually the Turing 80 card, considering it's 23-33% faster than the GTX 1080 Ti (23% at 1440p on average).

So this is really how bad we are getting screwed, the 70 card is actually $800 and the 80 card is $1200 before taxes.

BugattiGreedia knew full well that they could not come out with a straight face and inform the consuming public that they are now asking for $800 vs $450 for the 70 card and $1200 vs $600 for the 80 card (if we use FE prices then this would be $800 to be fair, i.e. GTX 1080 FE launch price).

So they simply renamed the products, creating the illusion that $800 is getting you the 80 card and $1200 is getting you the 80 Ti card.

And let's not forget, the supposed 2070 card will not only be merely as fast as the outgoing 80 card, putting it firmly in new 60-card territory (i.e. the GTX 960 was as fast as the GTX 780, and the GTX 1060 as fast as the GTX 980), but its SKU designator is even TU106, with 106 identifying it as the 60 card going all the way back to Kepler!

So it's a near certainty that this is what has actually happened and in a last minute decision they decided to simply rebadge the cards in question so as to not look so completely out of touch from the consumer base from a price / performance perspective.

Yes, because of record profits due to the crypto boom in 2017-2018 and AMD's absence they have become so brazen that they've essentially doubled the price of their products.

And check this out: they most definitely have an actual 80 Ti card waiting in the wings, some 30% faster than the 80 card (the 2080 Ti), but now they will call it the Titan XT and it will have a $1500 price tag, which is perfectly in line with the pricing of their other products, because:

RTX 2080 (RTX 2070 going by performance) is now $800, whereas the GTX 1070, its actual predecessor, is $400: a 100% increase in price.

RTX 2080 Ti (RTX 2080 going by performance) is now $1200, whereas the GTX 1080 is $450: a 167% increase in price.

By logical extension, the RTX Titan XT (RTX 2080 Ti going by performance) will be $1500, whereas the GTX 1080 Ti is $700: a 114% increase in price.
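The percentage claims above are simple ratios and easy to check. A quick sketch, using the prices quoted in this post (the Titan XT figure is the post's own speculation, not a real product):

```python
def price_increase_pct(old, new):
    """Percent increase going from the old price to the new one."""
    return (new - old) / old * 100

# Launch prices in USD, as quoted in the post above
pairs = {
    "70-class (GTX 1070 -> RTX 2080)":        (400, 800),
    "80-class (GTX 1080 -> RTX 2080 Ti)":     (450, 1200),
    "80 Ti-class (GTX 1080 Ti -> Titan XT?)": (700, 1500),
}
increases = {name: price_increase_pct(old, new)
             for name, (old, new) in pairs.items()}
# 70-class: 100.0%, 80-class: ~166.7%, 80 Ti-class: ~114.3%
```

Note that on the post's own numbers, the 80-class jump is actually well above a doubling, not "near 100%".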

I've made this observation before viewing AdoredTV's RTX Review Summary, and he's been mostly accurate with his predictions with Turing and his analysis is on point:

https://www.youtube.com/watch?v=cp9KApvrNVw"

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 09/21/2018 04:22 PM   