RTX 2080 incoming...
Thanks, D-Man11


$79 is the price.

Gigabyte Z370 Gaming 7 | 32GB RAM | i9-9900K | GigaByte Aorus Extreme Gaming 2080 Ti (single) | Game Blaster Z | Windows 10 x64 build #17763.195 | Define R6 Blackout case | Corsair H110i GTX | SanDisk 1TB (OS) | SanDisk 2TB SSD (games) | Seagate Exos 8 and 12 TB drives | Samsung UN46C7000 HD TV | Samsung UN55HU9000 UHD TV | Currently using ACER passive EDID override on 3D TVs | LG 55

Posted 09/01/2018 11:01 PM   
lou4612 said:At this point I really don't care to hear from anyone who isn't a 3D Vision user in this 3D VISION section of the forums....

Sora can fuck off along with anyone else...who just here to talk BS.

I'm for the progression of the new nvidia GPU and the continuation of 3D Vision software.

Raytracing and 4K are great as long as they go hand in hand with 3D Vision.

Don't give a single shit about 2D pancake gaming at this point nor do I care to hear from anyone who does.

3D Vision!

What was that??

3D Vision?

Yes 3D Vision!


I'm definitely a 3D Vision enthusiast, check my comment here:


https://www.overclock.net/forum/69-nvidia/1706276-official-nvidia-rtx-2080-ti-owner-s-club-45.html


Thanks for the correction. Yeah, the original figure I came across was 650 GB/s, but this article erroneously listed it as 1305 GB/s, leading me to conclude that maybe it was down to the way memory bandwidth is reported with HBM2, kind of like how DDR system memory is listed at half its value in CPU-Z, etc.


https://www.game-debate.com/gpu/inde...-ti-vs-titan-v
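The halving described above falls out of the bandwidth formula directly. A minimal sketch; the Titan V figures (an 850 MHz real memory clock on a 3072-bit HBM2 bus) are assumptions for illustration, not numbers from this thread:

```python
# Memory bandwidth = per-pin data rate x bus width / 8 bits per byte.
# HBM2 is double data rate, so the real clock is half the effective rate --
# the same reason CPU-Z shows DDR4-3200 as 1600 MHz.
def bandwidth_gbps(real_clock_mhz, bus_width_bits, ddr_factor=2):
    effective_rate_mbps = real_clock_mhz * ddr_factor        # per pin
    return effective_rate_mbps * bus_width_bits / 8 / 1000   # MB/s -> GB/s

titan_v = bandwidth_gbps(850, 3072)  # 3072-bit HBM2 at an 850 MHz real clock
print(round(titan_v, 1))  # 652.8 -- doubling it again gives the bogus ~1305 GB/s
```

Counting the double-data-rate factor twice is exactly how the article could have landed on ~1305 GB/s.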


I don't know that we can safely extrapolate the gaming performance of the 2080 Ti just by looking at Titan V performance, because they aren't even the same GPU core or die size (~1/3rd of TU102 is physically allocated to ray tracing). The two have little in common beyond node size and roughly the same memory bandwidth.

I think the TFLOP-to-Timespy ratio is a safe way to estimate the performance of the 2080 Ti, unless the 2080 Ti can do 2.2 GHz. But the 2080 Ti has lower boost clocks than the 2080, the 2080 is only doing a little over 2000 MHz in the leaked Timespy bench, and given the limitations of the node, it's highly unlikely that 2.2 GHz is possible.
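For what the extrapolation above is worth, here is a rough sketch of the TFLOP side of it. The core counts are the known/leaked shader counts and the 1900 MHz is an assumed common boost clock, not a measured value:

```python
# Rough FP32-throughput extrapolation: core count x clock, as described above.
def tflops(cuda_cores, boost_mhz):
    # 2 FLOPs per core per clock (fused multiply-add)
    return 2 * cuda_cores * boost_mhz / 1e6

gtx_1080_ti = tflops(3584, 1900)   # ~13.6 TFLOPs at an assumed typical boost
rtx_2080_ti = tflops(4352, 1900)   # ~16.5 TFLOPs at the same assumed clock

print(f"{(rtx_2080_ti / gtx_1080_ti - 1) * 100:.0f}% faster")  # 21% from shaders alone
```

At equal clocks the extra shaders alone buy about 21%, which is why the clock speed question matters so much for whether "30%" is even reachable.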

30% is probably a close figure. Corporate executives tend to exaggerate the performance of a product they want to sell, and Nvidia is guilty of this: Tom Peterson's statement of "30 to 45% faster" is probably much closer to 30%, with 45% being a statistical outlier from some obscure title, if there is any truth to the figure at all. (There probably isn't. 3.5 GB VRAM on the 970, remember that? Or did we conveniently forget? How about the 970M and 980M being "80% as fast as their desktop counterparts" when in actuality they were closer to 60% as fast? I should know, I have a 980M in my Alienware M18x R2. How about the GeForce Partner Program? How about the fact that Nvidia is making reviewers sign a rather onerous NDA that stipulates that 3rd-party AIBs give Nvidia the contact information of whoever is to review their cards, with Nvidia making a determination of qualification and, if need be, replacing the review outlet with a "certified" one? https://www.hardocp.com/article/2018..._distribution/)


I wouldn't trust anything coming out of Peterson's or Huang's mouths; I wouldn't touch their claims with a 10-foot pole.

They blatantly lied during the Gamescom reveal, stating the prices were $500, $700, and $1k respectively, when if you went to their store page at that exact moment, the prices were $700, $800, and $1200. Then they artificially limited the initial supply of the 2080 Ti to make it seem to the consuming public that demand was enormous, to stoke demand and create a hysteria. I mean, seriously, don't trust these people. If Peterson said the 2080 Ti is "30% to 45% faster than the 1080 Ti," then the only truth I would extract from that would be 99% of games and benchmarks showing a 30% increase, with some obscure game or bench showing higher. This is deceptive statistical marketing, and they are trained to do this. JUST LOOK AT THEIR STUPID STATISTICAL CHART FOR CRYING OUT LOUD. What is that? Is the 2080 1.5x faster at what? Does the 1.5x value start at the bottom of the 1.5 band on the chart, in the middle, or at the top? Etc., etc.
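Taking the figures above at face value, the announced-versus-listed gap is easy to quantify. A quick sketch; the card names are assumed for illustration, the prices are the ones quoted above:

```python
# Announced "starting at" prices vs. what the store page showed at the
# same moment (per the post above), and the implied markup on each tier.
announced = {"RTX 2070": 500, "RTX 2080": 700, "RTX 2080 Ti": 1000}
listed    = {"RTX 2070": 700, "RTX 2080": 800, "RTX 2080 Ti": 1200}

for card in announced:
    markup = (listed[card] / announced[card] - 1) * 100
    print(f"{card}: announced ${announced[card]}, listed ${listed[card]} (+{markup:.0f}%)")
```

By these numbers the real-world prices ran 14-40% above the stage-quoted ones.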

Nvidia took a big gamble with consumer trust on this release, and they would only get this brazen if they had no competition and the vast majority of the consuming public lacked the competence to understand just how badly they are getting ripped off. Based solely on the comment replies I've received thus far here, I would say they succeeded. Your comment here is probably the only one written by anyone with an acceptable level of technical knowledge. I don't know where you stand on the 2080 Ti as a value proposition, but to me, it's a complete rip-off. It's $200 more than the Titan X, and at least the Titan X was around 50% faster than the GTX 980 Ti. The 1080 Ti was 50% faster than the 980 Ti for $700! No "introductory FE special price"! In fact, their removal of the FE pricing scheme was a response to our outcry, and guess what, they dropped the 1080 Ti right around when AMD released Vega! What do they do when their competition isn't around? "Let's see how much more profit we can make!"

I honestly hope the 2080 Ti is 50% faster than the 1080 Ti, but having thought long and hard, and based on writing all of the above, dude, I'm NOT paying $1k for this if it's 30% faster. No way in hell. I will just turn the resolution down. (I'm mostly wanting to get this for 3D Vision on my PG278Q anyhow. I tried Mass Effect: Andromeda with the 3D Vision fix and HOLY CRAP, this is one of, if not the best 3D Vision title I have, I think even better than The Witcher 3.

When you fire a concussion shot you can see the round flying downrange, and because of the parallax it looks like real life. Transparent objects, smoke, the ringed planet on the main menu, all of the dialogue cut-scenes look insanely good; you can see into the textures of the characters, like the sheen on their cheeks. That initial emergency descent down to Habitat 7, wow, I literally tensed up, and when jetpack jumping in 3D I'm literally bracing my leg muscles for landing when the character lands. I tried it at 3440x1440 on the AW3418DW, and even with an 80 FPS average, everything maxed, G-Sync, it paled in comparison to 3D. I didn't even want to play it. 3D is that good if done correctly.

I'm getting 50 FPS having made considerable compromises at 2294x1296, 10% less resolution per axis than 2560x1440, via Custom Resolution in NVCP; any less than this and the image dims and blurs more than I would like. I've shelved the game hoping a 2080 Ti would be a 50% bump, but considering it's probably only 30%, I'm looking at going from 50 to 65 FPS here, meaning more demanding areas will bring it down under 60, and anything under a solid 60 FPS in 3D Vision is really choppy. So I don't know if it's even worth it.

If this sounds ridiculous, it's because it is. Whether you're at 4K or trying to run 2560x1440 3D Vision, if you're at 50 FPS in a lot of titles, you're only going to be at 65 going from a 1080 Ti to a 2080 Ti. $1300 after tax for that? You're straight smoking crack.)
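The frame-rate and resolution math above is simple to sanity-check. A quick sketch using the figures from the post (a 30% uplift on the 50 FPS baseline, and the custom-resolution pixel counts):

```python
# What a given generational uplift buys you, starting from the 50 FPS
# 3D Vision baseline described above.
def projected_fps(current_fps, uplift_pct):
    return current_fps * (1 + uplift_pct / 100)

print(round(projected_fps(50, 30)))  # 65 -- still dips below 60 in heavy scenes
print(round(projected_fps(50, 50)))  # 75 -- the hoped-for 50% bump

# The custom-resolution trade-off: 2294x1296 is ~10% fewer pixels per axis
# than native 2560x1440, i.e. ~19% fewer pixels overall.
native = 2560 * 1440
custom = 2294 * 1296
print(f"{(1 - custom / native) * 100:.0f}% fewer pixels")  # 19% fewer
```

So a 30% uplift takes a 50 FPS average to exactly 65, which is the whole argument in one line.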

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 09/02/2018 12:01 AM   
Benchmarks (very salty and grainy ones one must remember) are up here...

https://www.3dcenter.org/news/angeblicher-benchmark-leak-zeigt-die-geforce-rtx-2080-ti-um-375-vor-der-geforce-gtx-1080-ti

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

Posted 09/03/2018 12:01 PM   
RAGEdemon said:Benchmarks (very salty and grainy ones one must remember) are up here...

https://www.3dcenter.org/news/angeblicher-benchmark-leak-zeigt-die-geforce-rtx-2080-ti-um-375-vor-der-geforce-gtx-1080-ti


Those benchmarks are a week old and have been determined to be fake on day one.

Actual benchmarks here:

The leaked 2080 Ti FE Timespy bench shows it 30% faster than the 1080 Ti with the 1080 Ti at default clocks (the 2080 Ti presumably using the +90 MHz vBIOS they all ship with), and only 20% faster when the 1080 Ti is overclocked:


https://old.reddit.com/r/nv...


There might be another 100 MHz in it on top of the +90 MHz vBIOS, but expecting this thing to do 2.2 GHz is wishful thinking considering the similarity between the 12 nm and 16 nm nodes.

That this benchmark is Timespy is further damning because of the nature of the DX12 API, which nearly completely removes the display driver from the equation. That means there's very little, if any, wiggle room for driver optimization between now and launch. There's also no telling whether the 2080 Ti FE here was using a launch driver, but as I've said, with DX12 that doesn't really matter.

So maybe 25% faster?

To me that's a huge fail and this is precisely why the Gamescom reveal was 90% hyping up Ray Tracing with next-to-no performance comparison and metrics.

That there is even an NDA to begin with smacks of Nvidia wanting to hide this. I mean, let consumers make an informed decision and show them the performance of your product?!


Posted 09/03/2018 06:25 PM   
Thanks for the benchmarks. I don't think any benchmarks will be accurate until the proper driver is out.
Would be good if they were real though, especially pre-driver!

If you like it technical, watch this insightful video.
Digital Foundry interviewed DICE, and there's loads of info about ray tracing and DICE's implementation that I haven't read about before.

At the Nvidia keynote, DICE weren't even using the tensor cores to do the ray-tracing denoising. They only had Turing for two weeks before the presentation.
They had been working with Volta Titans, which don't have hardware ray-tracing acceleration built in, so performance will be a lot higher at launch.

DICE also say there will be lots of options for ray tracing, and you'll be able to run the game at a different resolution from the ray tracing's resolution, etc. Guess I was right about that, eh Sora!

Anyway it’s all in the video......


https://youtu.be/8kQ3l6wN6ns

Posted 09/03/2018 06:30 PM   
GibsonRed said:Thanks for the benchmarks. I don’t think any benchmarks will be accurate until the proper driver is out.
Would be good if they were real though, especially pre driver!

If you like it technical watch this insightful video.
Digital foundary interviewed dice and there’s loads of info about raytracing and Dices implementation that I haven’t read about before.

At the nvidia keynote Dice weren’t even using the tensor cores to do the raytracing denoising. They only had Turing for 2 weeks before the presentation.
They were working with Volta titans which don’t have hardware raytracing acceleration built in so performance will be a lot higher at launch.

Dice also say there will be lots of options for raytracing and you’ll be able to have the game running at different resolutions to the ray tracings resolution etc. Guess I was right about that eh Sora!

Anyway it’s all in the video......


https://youtu.be/8kQ3l6wN6ns



No, Timespy is an excellent benchmark leak, and was likely chosen precisely because, like Vulkan, DX12 is relatively unaffected by the display driver. That means there's next to no difference between what you're seeing now and what you will see with launch drivers, and that's assuming launch drivers weren't already in use:


Wuselon on Reddit (132 points, edited):
my 1080ti (Aorus non Xtreme) oced (2025 core / 1500 vram) does:

Test1 69.35 fps (screenshot is 18.63% faster) and

Test2 62.73 fps (screenshot is 18.89% faster)


https://www.3dmark.com/spy/2862234


..... and it does Stock:

Test1 63.29 fps (screenshot is 29.98% faster)

Test2 56.76 fps (screenshot is 31.39% faster)


https://www.3dmark.com/spy/4334643


https://www.reddit.com/r/nvidia/comments/9cck3f/ts_score_of_rtx_2080ti_videocardzcom/
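For what it's worth, those quoted percentages are internally consistent: the stock and overclocked comparisons should imply the same absolute frame rate for the leaked card, and they do. A quick check using the Test1 numbers above:

```python
# Cross-checking the quoted Reddit numbers: if the leaked 2080 Ti screenshot
# is X% faster than the stock run and Y% faster than the overclocked run,
# both comparisons should back out to the same leaked frame rate.
stock_t1, stock_gap_pct = 63.29, 29.98   # stock 1080 Ti Test1, quoted gap
oc_t1, oc_gap_pct = 69.35, 18.63         # overclocked 1080 Ti Test1, quoted gap

leak_from_stock = stock_t1 * (1 + stock_gap_pct / 100)
leak_from_oc = oc_t1 * (1 + oc_gap_pct / 100)

print(round(leak_from_stock, 1), round(leak_from_oc, 1))  # ~82.3 fps both ways
```

Both comparisons land on the same implied figure, so at least the arithmetic in the leak hangs together.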


Posted 09/03/2018 06:44 PM   
I was actually replying to rage demon but never mind.
You must have posted just before me so I didn’t notice your reply.

I know how direct X 12 works. There’s no need for the explanation.

Fact is, no one knows until the 12th, but they aren't going to bring out a card that, overclocked, is only 10% faster than the previous generation at standard clocks, like you're implying.

Nvidia might be greedy but they aren’t fucking stupid.

When I get my 2080 Ti, if the performance is only 10% faster than a 1080 Ti, I'll send it back (read: sell it on eBay for £2000).

You can’t see sense for your hatred and bitterness. Wake up and stop tainting this thread with your bullshit.
Even the bloody pot plants know how you feel about the cards by now.

If you watch the video you'll see that ray tracing has quality and resolution settings, so you won't be playing at 1080p resolution for the game itself.

I’m pumped for these cards.
This is the best thing to happen to gfx in a decade. You do understand that when everything is raytraced we’ll have photorealistic gfx?
This is the start of graphical perfection and I want in.
Go away with your trolling,......

Posted 09/03/2018 07:29 PM   
GibsonRed said:I was actually replying to rage demon but never mind.
You must have posted just before me so I didn't notice your reply.

I know how direct X 12 works. There's no need for the explanation.

Fact is no one knows until the 12th but they aren't going to bring out a card that's 10% faster overclocked than the previous generation at standard clocks, like you're implying.

Nvidia might be greedy but they aren't fucking stupid.

When I get my 2080ti, if the performance is only 10% faster than a 1080ti I'll send it back (read sell on eBay for £2000)

You can't see sense for your hatred and bitterness. Wake up and stop tainting this thread with your bullshit.
Even the bloody pot plants know how you feel about the cards by now.

If you watch the video you'll see that raytracing has quality and resolution settings so you won't be playing at 1080p resolution for the game itself.

I'm pumped for these cards.
This is the best thing to happen to gfx in a decade. You do understand that when everything is raytraced we'll have photorealistic gfx?
This is the start of graphical perfection and I want in.
Go away with your trolling,......


Trolling now, huh? LMAO.

They have already overclocked the card by 90 MHz in an attempt to justify the insane FE price gouge. What do you think you're going to get on 12nm, another 300 MHz on top of the 90 MHz? Are you drunk? You might get another 100 MHz on top of that factory 90 MHz, absolute best-case scenario.

You don't understand the trend? Let's see: the 980 Ti shipped with 1200 MHz boost clocks but could do 1500 MHz, a 25% increase. Versus Pascal, which was aggressively clocked from the factory: my 1080 Ti FE does 1911 MHz just by setting PT to 120%, but won't do more than 2012-2025 MHz reliably under a full water block with load temps of 40C (I'm at +120 core / +400 memory). Factory-clocked 1070s and 1080s do 1850 MHz; they can do around 2000-2050 MHz, no more, roughly a 10% increase.

Expecting another 200 MHz over and above the +90 MHz? That ain't gonna happen, no way, no how. They have already aggressively clocked the FE cards from the factory in an attempt to justify the price gouge. You MIGHT get another 50-100 MHz out of them, which equates to maybe 5% performance on top of that, taking the gap between an overclocked 1080 Ti and an overclocked 2080 Ti from 18% to, and I'll be generous here, MAYBE 25%.

If I'm an embittered pessimist, then you're 100% a deluded fanboy cheerleader moron, or probably an NGreedia PR shill. Enjoy your "graphical perfection" with your $1300 2080 Ti on your 1440p or 4K monitor, at 40-60 FPS at 1080p that looks worse than actual 1080p because of interpolation, in maybe 10% of titles that will actually support Ray Farce. By the time Ray Farce is doable at greater framerates and resolutions, i.e., late 2019 on 7nm Volta or soon thereafter, the 2080 Ti will look like a GTX 580 next to a 780 Ti in terms of Ray Farce performance.

Only complete morons with a lot of money pay the early-adopter fee. But in this instance, you're not only paying the early-adopter fee, you're voting in favor of FE price gouging going forward by buying a product THAT YOU HAVE ZERO F'ING CLUE HOW IT'S GOING TO PERFORM, just straight-up making a blind purchase:

https://youtu.be/ebxp0Ie_abI

So you're actually harming the consumer base. Let's take a show of hands: how many reading this have $1300 for a 2080 Ti FE pre-order, even if NGreedia hadn't artificially limited supply to create the illusion of high demand and you could buy one? Vote 1 for Yes and 0 for No in the following comments.

But now, because of rich morons who think they're going to "graphical perfection nirvana" and are pre-ordering something that is way overpriced and that they know exactly zero about, this will be the trend going forward. We can probably expect the next architecture to be: "Hey guys, meet the 2180. Don't worry about actual performance metrics, because it can do Advanced PhysX. We know that next to no developers implemented ray tracing, but it's going to be different this time around, and you have to have it. Look at the 2180 Ti, it's fully 20% faster than the 2080 Ti, and here's the pricing, starting from: 2170 FE: $900, 2180 FE: $1100, 2180 Ti: $1500!"
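The clock-headroom argument above can be put into numbers. A rough sketch, assuming performance scales roughly linearly with core clock; the 1725 MHz starting point (a nominal boost clock plus the +90 MHz factory bump) is an illustrative assumption, not a figure from the thread:

```python
# Overclocking headroom as a fraction of shipping clocks, assuming
# performance scales roughly linearly with core clock (optimistic,
# since memory-bound workloads scale less than this).
def perf_gain_pct(base_mhz, oc_mhz):
    return (oc_mhz / base_mhz - 1) * 100

print(round(perf_gain_pct(1200, 1500), 1))  # 25.0 -- Maxwell-era 980 Ti headroom
print(round(perf_gain_pct(1850, 2025), 1))  # 9.5 -- typical Pascal headroom
# Hypothetical 2080 Ti: assumed factory boost + 90 MHz vBIOS, topping out ~100 MHz higher
print(round(perf_gain_pct(1725, 1825), 1))  # 5.8 -- on top of factory clocks
```

Under those assumptions, a further ~100 MHz is only mid-single-digit percent of performance, which is the core of the argument above.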
GibsonRed said:I was actually replying to rage demon but never mind.
You must have posted just before me so I didn’t notice your reply.

I know how direct X 12 works. There’s no need for the explanation.

Fact is no one knows until the 12th but they aren’t going to bring out a card that’s 10% faster overclocked than the previous generation at standard clocks, like your implying.

Nvidia might be greedy but they aren’t fucking stupid.

When I get my 2080ti, if the performance is only 10% faster than a 1080ti I’ll send it back (read sell on eBay for £2000)

You can’t see sense for your hatred and bitterness. Wake up and stop tainting this thread with your bullshit.
Even the bloody pot plants know how you feel at the cards by now.

If you watch the video you’ll see that raytracing have quality and resolution settings so you won’t be playing at 1080p resolution for the game itself.

I’m pumped for these cards.
This is the best thing to happen to gfx in a decade. You do understand that when everything is raytraced we’ll have photorealistic gfx?
This is the start of graphical perfection and I want in.
Go away with your trolling,......


Trolling now huh?

LMAO.

They have already overclocked the card by 90 MHz in an attempt to justify the insane FE price gouge.

What do you think youre going to get with 12nm, another 300 MHz on top of 90 MHz?

Are you drunk?

You might get another 100 MHz on top of that factory 90 MHz, absolute best case scenario.

You don't understand the trend?

Let's see: 980 Ti with 1200 MHz boost clocks but could do 1500 MHz, a 25% increase.

vs

Pascal that was aggressively clocked from the factory, where my 1080 Ti FE does 1911 MHz just setting PT to 120%, but won't do more than 2012-2025 MHz reliably under full water block with load temps of 40C.

I'm at +120 core / +400 memory

Factory clocked 1070 and 1080 do 1850 MHz; they can do around 2000-2050 MHz, no more. A 15% increase.


Expecting another 200 MHz up and over the +90 MHz, that ain't gonna happen, no way, no how. They have already aggressively clocked the FE cards from the factory in an attempt to justify the price gouge. You MIGHT get another 50-100 MHz out of them, which equates to maybe 5% performance on top of that, taking the gap between an overclocked 1080 Ti and an overclocked 2080 Ti from 18% to, and I'll be generous here, MAYBE 25%.
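For what it's worth, the percentage figures being thrown around here can be sanity-checked with a few lines of arithmetic (a rough sketch using the clock numbers quoted in this thread, not official spec sheets):

```python
# Overclock headroom, using the clock figures quoted in this thread.
def headroom_pct(stock_mhz, max_oc_mhz):
    """Percent gain from stock clock to a typical max overclock."""
    return (max_oc_mhz - stock_mhz) / stock_mhz * 100

print(f"980 Ti : {headroom_pct(1200, 1500):.0f}%")  # 1200 -> 1500 MHz, ~25%
print(f"1080 Ti: {headroom_pct(1911, 2025):.0f}%")  # over out-of-box boost, ~6%
print(f"1080   : {headroom_pct(1850, 2050):.0f}%")  # 1850 -> 2050 MHz, ~11%
```

The Maxwell-to-Pascal trend this post describes is exactly that shrinking headroom number.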

If I'm an embittered pessimist then you're 100% a deluded fanboy cheerleader moron, or probably an NGreedia PR shill.

Enjoy your "graphical perfection" with your $1300 2080 Ti card on your 1440p or 4K monitor, running at 40-60 FPS at 1080p that looks worse than native 1080p because of interpolation, in maybe 10% of titles that will actually support Ray Farce.

By the time Ray Farce is doable at greater framerate and resolution, i.e. late 2019 on 7nm Volta or soon thereafter, the 2080 Ti will look like a GTX 580 in terms of Ray Farce performance compared to a 780 Ti.


Only complete total morons who have a lot of money pay the early adopter fee.

But in this instance, you're not only paying the early adopter fee, you're voting in favor of FE price gouging going forward by buying a product THAT YOU HAVE ZERO F'ING CLUE HOW IT'S GOING TO PERFORM, just straight up making a blind purchase:

https://youtu.be/ebxp0Ie_abI

So you're actually harming the consumer base.

Let's take a show of hands, how many reading this have $1300 for a 2080 Ti FE pre-order even if NGreedia didn't artificially limit supply to create the illusion of high demand and you could buy one?

Vote 1 for Yes and 0 for No in the following comments.

But now, because of rich morons who think that they're going to "graphical perfection nirvana" and pre-ordering something that is way overpriced and that they know exactly zero about, now this will be the trend going forward and we can probably expect the next architecture to be "Hey guys, meet the 2180, don't worry about actual performance metrics, because it can do Advanced PhysX, we know that next-to-no developers implemented Ray Tracing, but this is going to be different this time around, and you have to have it. Look at the 2180 Ti, it's fully 20% faster than the 2080 Ti, and here's the pricing, starting from:

2170 FE: $900
2180 FE: $1100
2180 Ti: $1500

!"

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 09/03/2018 08:26 PM   
Yeah, sure, the 2080ti's price is a joke, but if GibsonRed wants to upgrade then so be it.
Everyone is entitled to an opinion.
Nvidia might be greedy but they are not stupid. Though the price is a joke, I bet they sell like hot cakes.
Sadly.

Core X9, custom water cooling (Volkswagen Polo radiator)
i7-8700K @ 4.7
Titan X Pascal with shitty stock cooler
Win7/10
Video: Passive 3D full HD 3D @ 60Hz/channel, Denon X1200W / HC5 x 2, Geobox 501 -> eeColor boxes -> polarizers/Omega filters, custom-made silver screen
Occupation: Entrepreneur. Painting/surfacing/construction
Interests/skills:
3D gaming,3D movies, 3D printing,Drums, Bass and guitar.
Suomi - FINLAND - perkele

Posted 09/03/2018 08:37 PM   
GibsonRed said:
Nvidia might be greedy but they aren’t fucking stupid.


I really think Nvidia is stupid, at least the people who lead the company. Maybe there are employees who are not, but that is another thing. They have had the chance to make good things, and they only earn themselves a bad reputation.

Money, money, money..., that is all..., what poor people they are.

- Windows 7 64bits (SSD OCZ-Vertez2 128Gb)
- "ASUS P6X58D-E" motherboard
- "Gigabyte RTX 2080 Gaming OC"
- "Intel Xeon X5670" @4000MHz CPU (20.0[12-25]x200MHz)
- RAM 16 Gb DDR3 1600
- "Dell S2716DG" monitor (2560x1440 @144Hz)
- "Corsair Carbide 600C" case
- Labrador dog (cinnamon edition)

Posted 09/03/2018 09:34 PM   
xXxStarManxXx said:
What do you think you're going to get with 12nm, another 300 MHz on top of 90 MHz?
Are you drunk?

Where did I say it would overclock 300 MHz? You just pulled that straight out of your ass!

xXxStarManxXx said:You might get another 100 MHz on top of that factory 90 MHz, absolute best case scenario.

How do you know this? You're not just making things up again, are you?

xXxStarManxXx said:You don't understand the trend?


More like I don't understand your drivel. You're just spouting made up statistics and quotes straight from the Beano!


xXxStarManxXx said:Let's see: 980 Ti with 1200 MHz boost clocks but could do 1500 MHz, a 25% increase.


Yes that's correct. I actually got mine higher than that but never mind.


xXxStarManxXx said:Pascal that was aggressively clocked from the factory, where my 1080 Ti FE does 1911 MHz just setting PT to 120%, but won't do more than 2012-2025 MHz reliably under full water block with load temps of 40C.


The 1080ti base clock is 1481. So you managed a 29% overclock? Still don't get your point.

xXxStarManxXx said:I'm at +120 core / +400 memory


You should flash your GFX card bios to raise the power limit. That's the way to get the most out your water cooled setup. Worked a treat with my 980ti.

xXxStarManxXx said:Factory clocked 1070 and 1080 do 1850 MHz; they can do around 2000-2050 MHz, no more. A 15% increase.


That's great, but what's your point? Actually the 1080 base clock is 1607 MHz whilst the 1070 base clock is 1506 MHz. That's a lot more than a 15% overclock going off of your figures.

xXxStarManxXx said:Expecting another 200 MHz up and over +90 MHz, that ain't gonna happen, no way, no how.

300 MHz, now 200 MHz?!?!? I've never said this and neither has anyone else.
Why are you just making things up?
It's delusional.
After correcting your made-up numbers, though, I'd say it is looking quite likely we will get near 2000 MHz from the base clock of 1545 MHz with the 2080ti, going off previous generations of cards.

You have to remember that it's not just the clock speed, it's the CUDA core count too. You can't just judge all overclocked clock speeds equally, like you are doing! (not to mention GDDR6)
xXxStarManxXx said:They have already aggressively clocked the FE cards from the factory in an attempt to justify the price gouge. You MIGHT get another 50-100 MHz out of them, which equates to maybe 5% performance on top of that, taking the gap between an overclocked 1080 Ti and an overclocked 2080 Ti from 18% to, and I'll be generous here, MAYBE 25%.

This is just pure bumf. The 2080ti base clock is 1545, so a 30% overclock would be 2008 MHz. They've overclocked the FE cards more than usual as they have a two-fan setup now. You are so blinkered it's beyond belief!
Yet again, more of your made-up figures. You're full of it!
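That 30%-over-base sum checks out arithmetically; here it is as a quick sketch, again using the 1545 MHz base clock quoted in this thread (whether Turing silicon actually reaches any of these clocks is exactly what's being argued):

```python
# Projected 2080 Ti clocks at various overclock percentages,
# starting from the 1545 MHz base clock quoted in this thread.
base_mhz = 1545
for pct in (15, 25, 30):
    print(f"+{pct}% over base -> {base_mhz * (1 + pct / 100):.0f} MHz")
```

At +30% that lands on roughly 2008 MHz, which is where the "near 2000 MHz" claim comes from.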


xXxStarManxXx said:If I'm an embittered pessimist then you're 100% a deluded fanboy cheerleader moron, or probably an NGreedia PR shill.

I'm a fan boy to performance, not NVidia. I'm not deluded when it comes to these things, I'm quite the opposite actually. Just assume what you want though, per usual. Whatever makes you feel better.
I'm not an Nvidia shill, but I'll quite happily get paid for correcting morons on the forums if they do want to pay me!

xXxStarManxXx said:Enjoy your "graphical perfection" with your $1300 2080 Ti card on your 1440p or 4K monitor at 40-60 FPS 1080p that looks worse than actual 1080p because of interpolation in maybe 10% of titles that will actually support Ray Farce.

You didn't watch the video I linked to then, with DICE talking about how performance will be up massively at release and how ray-tracing settings can be changed (including resolution) independently of the game resolution? How about the fact that the alpha build shown wasn't even using the tensor cores to denoise the ray tracing, and was using the 2080ti's main GFX core instead?
What are you on about interpolation? That's for smoothing motion in movies and sports?!?! Do you mean non-native scaling? As that won't be an issue.
Don't worry about me, I'll be enjoying playing on my ultra-widescreen 34" G-Sync gaming monitor or my 120" Sony 4K projector in HDR, I haven't decided yet. Either way I'll be playing at 4K with ray tracing on (in Battlefield anyway, watch that video and you'll see)

xXxStarManxXx said:By the time Ray Farce is doable at greater framerate and resolution, i.e late 2019 on 7nm Volta or soon thereafter, 2080 Ti will look like a GTX 580 in terms of Ray Farce performance compared to 780 Ti.

Yeah hopefully. I can't wait, it's exciting times we are living in.


xXxStarManxXx said:Only complete total morons who have a lot of money pay the early adopter fee.

Sure must be a lot of them about, as they've sold out everywhere!
Oh, I forgot, you said they only made 5 to make it look like they are selling!?!?!?! Brilliant. You've got an answer for everything!
Maybe if you got off your butt and worked hard like I do, you'd be able to afford one too?


xXxStarManxXx said:But in this instance, you're not only paying the early adopter fee, you're voting in favor of FE price gouging going forward by buying a product THAT YOU HAVE ZERO F'ING CLUE HOW IT'S GOING TO PERFORM, just straight up making a blind purchase:

I haven't bought the FE card, can't you read properly?
I don't know exactly how well it will perform, but I do have a clue.
Seeing that the 1080ti is 79% faster than my 980ti at 4K, I know that the 2080ti will be at least 100% faster than my 980ti, which is a massive upgrade for me.
If it performs badly (it won't) I'll send it back or eBay it and make a profit. I can't lose!

I can't help but notice that you haven't taken into account the memory bandwidth of the 2080ti at all (616 GB/s vs the 1080ti's 484 GB/s)
I also know it'll be HDCP 2.2 compliant for my 4K streaming, as well as have ray tracing, deep-learning cores, tensor cores, a USB-C future VR connector, better AA and NVLink.
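Those bandwidth figures follow from the usual formula, bandwidth = per-pin data rate × bus width ÷ 8. A quick sketch (the 14 Gbps and 11 Gbps per-pin rates and the 352-bit bus are the published figures for these two cards):

```python
# Memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8.
def bandwidth_gb_s(gbps_per_pin, bus_width_bits):
    return gbps_per_pin * bus_width_bits / 8

bw_2080ti = bandwidth_gb_s(14, 352)  # GDDR6 on the 2080 Ti  -> 616.0 GB/s
bw_1080ti = bandwidth_gb_s(11, 352)  # GDDR5X on the 1080 Ti -> 484.0 GB/s
print(f"bandwidth uplift: {(bw_2080ti / bw_1080ti - 1) * 100:.0f}%")  # ~27%
```

So the raw bandwidth uplift is about 27%, independent of any core-clock argument.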

xXxStarManxXx said:So you're actually harming the consumer base.


What are you dribbling on about?

xXxStarManxXx said:Let's take a show of hands, how many reading this have $1300 for a 2080 Ti FE pre-order even if NGreedia didn't artificially limit supply to create the illusion of high demand and you could buy one?

Vote 1 for Yes and 0 for No in the following comments.


Now it's getting embarrassing.

xXxStarManxXx said:But now, because of rich morons who think that they're going to "graphical perfection nirvana" and pre-ordering something that is way overpriced and that they know exactly zero about, (+ more drivel...........blah blah blah)


It's the biggest consumer die ever made; it costs money. 10 years of R&D into ray tracing. GDDR6 RAM (11 GB of it).
Ray tracing is the pinnacle of graphics. You obviously have no idea how ray tracing will change games, making comments like that. When everything is raytraced, it literally will be 'graphical perfection nirvana', as it will be photorealistic.

If you can't justify buying one then don't. Do us a favour though and don't start getting salty on forums because some people can justify buying them! It shows a lack of empathy and therefore a lack of character! Give yourself a slap and grow up a bit.

Posted 09/04/2018 01:04 AM   
GibsonRed said:

When I get my 2080ti, if the performance is only 10% faster than a 1080ti I’ll send it back (read sell on eBay for £2000)
Go away with your trolling,......


Not only will you encourage Nvidia's FE price gouging by pre-ordering completely blindly, but instead of returning it to them so that they can sell it at MSRP to someone else, you're basically going to be a scalper and sell it on eBay for £2k (or so you think; do you think there will be demand for this when everyone finds out it's a whopping 18% faster than the 1080 Ti when both cards are overclocked?)

Dude.


Fuck off and die.


You're a cancer to PC gaming.


I'm not even going to read your last comment here.

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Posted 09/04/2018 02:30 AM   
Gentlemen, please, relax!

Generally speaking, I humbly believe that both of you chaps are correct; it's just a matter of individual perspective and personal value :)

We all love the info sharing and the epic debate, but let's also remember to...
[img]http://i.imgur.com/8R7aaNm.png[/img]

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

Posted 09/04/2018 03:26 AM   
xXxStarManxXx said:Not only will you encourage Nvidia's FE price gouging by pre-ordering completely blindly, but instead of returning it to them so that they can sell it at MSRP to someone else, you're basically going to be a scalper and sell it on eBay for £2k (or so you think; do you think there will be demand for this when everyone finds out it's a whopping 18% faster than the 1080 Ti when both cards are overclocked?)


I won’t be selling it, as it’s obviously going to be faster than 18%. Maybe you missed Nvidia’s slides. I’d take those over your made-up info, any day.
Yes there will be a huge demand for them. That’s why they are sold out everywhere. You can’t see the wood for the trees; your unfounded hatred blinds you to the facts.


xXxStarManxXx said:Dude.


Fuck off and die.


You're a cancer to PC gaming.


I'm not even going to read your last comment here.


Nice counter-argument. You blatantly don’t acknowledge any of your misinformation and go for ad hominem comments and insults instead.
What a pathetic little man you are.

I think it’s obvious who’s in the wrong here. People on this thread are asking you to shut up and you keep banging on.
You’re like a dog with a frisbee; let go, FFS.
Everyone knows you read my last comment. You just don’t like the fact that the truth hurts and I’ve called you out on it.
We are talking about a GFX card and you’re telling people to ‘fuck off and die’.
You must have mental health problems, making comments like that.
Can’t handle the truth that you’re a misinformed, self-entitled, spoilt brat who can’t afford the latest and greatest this time round and is throwing the toys out of his pram, trying to put everyone else down who can.
You are a very weak individual, it appears. Show some character and grow up, as you’re acting like a little kid.
Pathetic.

Posted 09/04/2018 10:42 AM   
Wait till the NDA lifts, and there will be no reason for the language that's being used. This is a friendly forum; let's keep it that way.

Gigabyte Z370 Gaming 7 32GB Ram i9-9900K GigaByte Aorus Extreme Gaming 2080TI (single) Game Blaster Z Windows 10 X64 build #17763.195 Define R6 Blackout Case Corsair H110i GTX Sandisk 1TB (OS) SanDisk 2TB SSD (Games) Seagate EXOs 8 and 12 TB drives Samsung UN46c7000 HD TV Samsung UN55HU9000 UHD TVCurrently using ACER PASSIVE EDID override on 3D TVs LG 55

Posted 09/04/2018 11:30 AM   