I own both cards and play at 1080p and 4K, and yes, with the same settings in games the Titan X is roughly 2 to 2.5 times faster.
http://gpu.userbenchmark.com/Compare/Nvidia-GTX-Titan-X-vs-Nvidia-GeForce-GTX-690/3282vsm8241
[quote=""]So let me see if I have this straight- people are losing their minds with rage because of a 5% dip in performance in some games, and a bad result in a single brand new and buggy game? That constitutes gimping nowadays?
Here's what I see:
1) NVidia releases Maxwell cards- nonstop bitching ensues because they aren't as fast as promised, and not enough better than last gen.
2) NVidia improves drivers for Maxwell fixing bugs and improving performance- nonstop bitching ensues because the new products are better than the old ones.
What's the common thread here? That's right, nonstop bitching.[/quote]
Yupp :) They are also comparing the 960 to a 780 Ti, and in Witcher 3 the 960 beats the 780 Ti. Of course it does. Witcher 3 is heavy on tessellation (the 900 series has about 3x more tessellation units than the 700 series). It is also possible the driver has been optimized for the 900 series only. (It's irrelevant that I get 99% GPU usage and 60fps if I could be getting 80fps at 99% :) ) But based on the older games I tested, I haven't noticed any difference beyond the roughly 5% that seems to be related to the driver.
Back to the topic: a Titan X is nice, but it seems to be overkill. A 980 Ti would work perfectly! You don't really need those 12 GB of VRAM in any way... and with DX12 around the corner and unified memory management, VRAM will be even less important (once devs actually start using DX12, that is :) )
This is my opinion, though...
1x Palit RTX 2080Ti Pro Gaming OC (watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
It's hard to say it's exactly X times faster than a 690, but the difference is significant. When the drivers aren't crashing GTA 5, it's buttery smooth with everything maxed except DOF and motion blur, which I always turn off for 3D. I haven't tried 3D with Witcher 3 yet, but going from a 690 to a Titan X there let me turn on DSR, and it's spectacular.
*edit* And I'm no longer hostage to SLI profiles, which is a win.
bo3b
TL;DR ?
;)
OK, let me try a shorter version for you. I'll explain and prove that it was intentional and that Nvidia should be sued for it:
(edit: I failed at making this post short. Sorry :P )
a) This is not the first time they've done it. The scenario is always the same:
- wait until some BIG game releases
- don't ship the already-prepared update for the generation Nvidia is no longer selling
- act like you're surprised and it was just a bug
- fix "the bug" after all the reviews are done, after 95% of the game's copies are sold, and after people have already based their decision about buying a new GPU on those reviews and benchmarks
- after that, the scores remain, maybe 1% of the reviews get updated, and the shady move keeps profiting Nvidia even after they've "fixed the bug".
Don't be naive talking about "Maxwell-specific" optimizations, because there are none. I gave you the link in the longer post above explaining why Nvidia's drivers are so much faster than AMD's when a game releases. That's one thing, but it's not the most important. The most important is this:
- There are benchmarks in EVERY other game besides P.C. and W3 where the GTX 960 is only slightly slower or slightly faster than the GTX 760. Now the 960 suddenly beats the GTX 770 (a 128-bit vs. a 256-bit card!) and even the GTX 780 (non-Ti):
[img]http://www.techspot.com/articles-info/1000/bench/1080p_Clear.png[/img]
[img]http://pclab.pl/zdjecia/artykuly/chaostheory/2015/05/w3/charts6/w3l_high_1920.png[/img]
Are you really naive enough to believe that's just a coincidence?
Let me underline this:
24 fps = GTX 760 (256-bit)
37 fps = GTX 960 (128-bit)
39 fps = GTX 780 (384-bit)
- Nvidia has done this before. For example, I've seen +70% differences between the 470 and 570, or the 460 and 560 (non-Ti). I've used Nvidia cards since forever, and I've seen plenty of so-called "bugs" causing sudden drops in framerate, artificially planting the thought in my head: "you need to upgrade, your old card isn't good enough anymore".
Sometimes a sequel comes out using 100% the same engine as the old game, and yet the new generation shows huge improvements in it compared to the older game. And... if those really were "new architecture optimizations", the old game would speed up just as well. But it doesn't. Isn't that telling you something?
Besides, we're talking about Nvidia, who has liked to cheat ever since the famous GeForce FX series cheated in Far Cry and other games. That's their tradition, you might say ;) .
It's not a coincidence that the "Kepler fix" has not been released to this day. Meanwhile, people are buying overpriced GTX 960s thinking it's a good card. It's not. No card with a 128-bit bus should ever be sold for 220-250€ in 2015. And no one who spent 400€ on a GTX 780 (new, bought as the newest architecture available 9 months ago) should now be left abandoned because Nvidia is too busy flogging Maxwells. And oh, don't bring up the argument that there's only a limited amount of workforce and that it's normal for Kepler optimizations to appear months after a game releases, because I'll remind you of the stupid Androidworks. Think about how many people and how much effort Nvidia spent on that...
P.S. I'm not talking about SLI.
I don't get why you are, again, defending Nvidia and calling criticism of it "bitching". You defended them over dropping 3D Vision support, if I'm not mistaken.
I still admire you (as you help make 3DV usable :) ), so don't get mad or anything, please ;) , but I find it hard to comprehend why you are... well... "not bitching". ;)
[quote=""]Thanks Kolreth and zig. It does sound promising. You don't know of any 4k gaming benchmarks that compare the titanX and 690 by any chance? Cheers[/quote]
I've not seen 690 benchmarks in years, sorry.
By the way, of course Nvidia are optimising the new cards more than the old cards. Why would you expect any different?
I'm sure the 780 Ti performed a lot better by the time the 9 series was released than when it first came out; they'd already been optimising the 7 series for a year.
Think about it. You'd be more annoyed if they brought out a new card and it was only 4-8% faster than last gen, like Intel CPUs.
Sounds like people are annoyed that their card isn't the fastest on the block anymore. Get over it!
[quote=""]By the way of course nvidia are optimising the new cards more than the old cards. Why would you expect any different?
I'm sure the 780ti performed a lot better when the 9 series were released than when it first came out. They've been optimising the 7 series for a year already.
Think about it. You'd be more annoyed if they brought out a new card and it was 4-8% faster than last gen like Intel CPUs.
Sounds like people are annoyed that their card isn't the fastest on the block anymore. Get over it! [/quote]
The 780 Ti BEHAVES EXACTLY THE SAME with the current driver as with the LAST DRIVER BEFORE the 900 series launch!! ±5%.
Man, I am not crazy, nor am I defending Nvidia!
Just to show you that: https://forums.geforce.com/default/topic/832749/geforce-drivers/official-nvidia-352-86-whql-game-ready-display-driver-feedback-thread-released-5-18-15-/post/4546764/#4546764
But nowadays the "consumer" feels he knows everything, and that just because he paid for a product he can boss the manufacturer around. Well, no, you CAN'T!! We got lucky that Nvidia even admitted some issues and is fixing them. They could have just kept quiet.
I'd also love to see an HONEST BUSINESSMAN or company! Don't make me laugh... Commerce is all ABOUT LYING in order to sell and make the absolute maximum profit! (Even the ancient Greeks knew this: their god of commerce and of thieves was one and the same ;) )
In any case, the problem is not NVIDIA in the first place!!! Normally there SHOULDN'T EVEN BE A NEED TO RELEASE DRIVERS FOR GAMES, WTF!!! You should release drivers that allow the hardware to "talk" to the OS and expose its features. The application should be written in such a way that it is optimized across different hardware.
And in between we have DirectX and OpenGL, the freaking middleware... (Unlike consoles, where you have low-level access and the optimizations are made by the developer, not by freaking Sony or MS.)
The whole concept of the current generation of drivers is wrong!!! This is the real problem! Because of this, Nvidia can take advantage and "optimize" whatever they want ;))
So, of course they are taking advantage of it! The real question is whether they did it on purpose, and above all WHO "ordered" it! (If that is the case, I bet it was a nice guy in a nice suit with zero knowledge about GPUs, drivers, or software in general)...
In any case, based on the benchmarks above, I still fail to see the huge difference everyone is complaining about....
Oh, and for the record: a 128-bit vs. 256-bit memory bus means exactly NOTHING unless you are comparing the EXACT same hardware platform... (the 760 and 960 are different hardware platforms, so...)
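To put rough numbers on that last point, here's a minimal back-of-the-envelope sketch in Python. The helper function and the spec figures are mine (reference-card numbers from memory, not anything benchmarked in this thread): peak bandwidth is just bus width times effective memory clock, so the narrower Maxwell bus is partly offset by faster GDDR5, and the rest by Maxwell's bigger L2 cache and delta colour compression.
[code]
# Rough peak-bandwidth comparison: bandwidth = (bus width in bits / 8) * effective clock in Gbps.
# Spec values below are reference-card figures as I recall them; treat them as approximate.

def peak_bandwidth_gb_s(bus_width_bits: int, effective_clock_gbps: float) -> float:
    """Peak GDDR5 bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * effective_clock_gbps

cards = {
    "GTX 760 (Kepler, 256-bit, 6.0 Gbps)":  (256, 6.0),
    "GTX 960 (Maxwell, 128-bit, 7.0 Gbps)": (128, 7.0),
}

for name, (bus, clock) in cards.items():
    print(f"{name}: ~{peak_bandwidth_gb_s(bus, clock):.0f} GB/s peak")

# ~192 GB/s vs ~112 GB/s on paper, yet the effective gap is much smaller in
# practice, which is why raw bus width alone tells you very little when
# comparing different architectures.
[/code]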
[quote=""]Thanks Kolreth and zig. It does sound promising. You don't know of any 4k gaming benchmarks that compare the titanX and 690 by any chance? Cheers[/quote]
If you're thinking about playing at 4K then I would definitely recommend the 980 Ti, which will be a hell of an improvement over the 690, even power-wise. You'll get 4 GB of extra memory, which is crucial in this case, plus a huge performance increase.
I would also recommend the 980 Kingpin, which can easily go above a Titan X, but you might not be very happy with all that overclocking, so I'll skip that.
Keep in mind that the rumored Titan X Ultra is on the way, so if you hold back you might find yourself with a bigger headache than you have right now.
Thanks for the info, chaps.
I've been gaming in 4K since August last year. My GTX 690 is still holding up well; I seem to be getting much higher scores than most benchmarks, which I find bizarre.
I'll be watercooling the graphics card anyway; I already have a custom loop for the CPU.
I held off on the 970 and 980 due to the lack of VRAM, and the Titan X is just impossible to justify. Ridiculous price!
I'm fed up with SLI microstutter and the lack of support in games.
It's either the 980 Ti or AMD 'Fury' for me.
I just don't want to upgrade, spend £600-£700 on a graphics card and waterblock, and find out I only get 10fps more.
That would be pants!
Anyone got any benchmarks for games at 4K with a 690 and a Titan X?
Thanks again, everyone.
[quote=""]That's for the info chaps.
I'm been gaming in 4k since August last year. My 690gtx is holding up well still. I seem to be getting a lot of higher score than most benchmarks which I find bizarre.
I'll be watercooling the gfx card anyway, I already have a custom loop for the CPU.
I held off with the 970 and 980 due to lack of VRAM and the Titan X is just impossible to justify. Ridiculous price!
I'm fed up of SLI microstutter and not much support in games.
It's either 980ti or AMD 'fury' for me.
I just don't want to upgrade, spend £6-£700 on a gfx card and waterblock and find out I get 10fps more.
That would be pants!
Anyone got any benchmarks for games at 4k with a 690 and titanX?
Thanks again everyone.[/quote]
What kind of 4K are you talking about? What is your resolution?
This is my rig. I've only got room for one graphics card.
[URL=http://s1109.photobucket.com/user/DudeTheDuke/media/6A22D202-E6DF-4905-AE85-A678BBB10FDB.jpg.html][IMG]http://i1109.photobucket.com/albums/h423/DudeTheDuke/6A22D202-E6DF-4905-AE85-A678BBB10FDB.jpg[/IMG][/URL]
[URL=http://s1109.photobucket.com/user/DudeTheDuke/media/4AE3A14D-D19E-4921-B7EC-810D360A5F6E.jpg.html][IMG]http://i1109.photobucket.com/albums/h423/DudeTheDuke/4AE3A14D-D19E-4921-B7EC-810D360A5F6E.jpg[/IMG][/URL]
[URL=http://s1109.photobucket.com/user/DudeTheDuke/media/B3262B92-BBC6-48A9-AE2B-7A50C564AB23.jpg.html][IMG]http://i1109.photobucket.com/albums/h423/DudeTheDuke/B3262B92-BBC6-48A9-AE2B-7A50C564AB23.jpg[/IMG][/URL]
My problem is I don't think there is enough room above the card for the water pipes. All EK blocks seem to have the inlet and outlet ports on top, which isn't good for small form factors like my Corsair Air 240 case.
The AMD Fury is like the 295X2, which has the inlet and outlet on the side of the card, so that will work.
Otherwise I'll have to cut a section out of the transparent top, which would look crap!
[quote=""]What kind of 4K you talking about. What is your resolution ?[/quote]
I have a Samsung U28D590 4K monitor at 3840x2160. I play at ultra settings with no AA.
I think you've been posting on the wrong forum. We are not interested in 4K at all; only 3D Surround goes beyond that. People here are trying to get the best possible quality out of Stereo 3D. That is why we're asking which card will work best in our rigs and for our needs.