680 sli or 780

what would be better for a 680 user

get another 680
get rid of 6x and invest in 780
Hey guys, it's not really related to 3D Vision, but I was wondering what your thoughts are on either getting another 680, or dropping the 6xx series and going for a 780 instead (with another added sometime later).

Acer H5360 / BenQ XL2420T + 3D Vision 2 Kit - EVGA GTX 980TI 6GB - i7-3930K@4.0GHz - DX79SI- 16GB RAM@2133 - Win10x64 Home - HTC VIVE

#1
Posted 11/08/2013 07:07 PM   
Also, for the SLI users out there: I have the Gigabyte 680 4GB version; would it be fine if I get a 2GB version?

Or should I just go for a 780 Ti?

Acer H5360 / BenQ XL2420T + 3D Vision 2 Kit - EVGA GTX 980TI 6GB - i7-3930K@4.0GHz - DX79SI- 16GB RAM@2133 - Win10x64 Home - HTC VIVE

#2
Posted 11/08/2013 07:13 PM   
With SLI you don't double up on usable VRAM, so if your 1st card has 4GB VRAM, so should your 2nd card.

If your 2nd card only has 2GB VRAM, your system will only use 2GB VRAM in total.

SLI is good if you have a decent card, can easily afford the 2nd card, and can put up with the added noise, heat, and power requirements.

Single bigger cards are better if you can easily afford one: as well as being able to sell your old card, you have no problems with SLI scaling or SLI profiles, there is less noise and heat, and you can be happy that your card will always be performing to its maximum.

There are a lot of games that don't scale well with SLI, so sometimes it feels like you paid out money which isn't being fully used.

I would read up on some benchmarks first, and think about noise and heat, because it's hell in the summer sitting next to a PC kicking out 60C and making twice the fan noise.
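
The VRAM-mirroring rule described above can be sketched in a couple of lines. This is purely illustrative arithmetic (the function name is made up, not any driver API): in SLI each card holds a full copy of the frame data, so the usable pool is the smaller card's VRAM, not the sum.

```python
# Illustrative sketch of the SLI VRAM rule: each card mirrors the same
# data, so usable VRAM is the minimum of the pair, not the sum.
# Values in GB; the function name is hypothetical.
def usable_sli_vram(card_a_gb, card_b_gb):
    return min(card_a_gb, card_b_gb)

print(usable_sli_vram(4, 2))  # a 4GB card paired with a 2GB card -> 2
print(usable_sli_vram(4, 4))  # matched 4GB cards -> 4
```

So pairing the OP's 4GB 680 with a 2GB card would effectively turn it into a 2GB setup.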

i7 4790k @ 4.6 - 16GB RAM - 2x SLI Titan X
27" ASUS ROG SWIFT, 28" - 65" Samsung UHD8200 4k 3DTV - Oculus Rift CV1 - 34" Acer Predator X34 Ultrawide

Old kit:
i5 2500k @ 4.4 - 8gb RAM
Acer H5360BD projector
GTX 580, SLI 670, GTX 980 EVGA SC
Acer XB280HK 4k 60hz
Oculus DK2

#3
Posted 11/08/2013 07:47 PM   
I purchased a Titan to avoid SLI problems; I would go with the 780 Ti.

Gigabyte Z370 Gaming 7 - 32GB RAM - i9-9900K - Gigabyte Aorus Extreme Gaming 2080 Ti (single) - Game Blaster Z - Windows 10 x64 build #17763.195 - Define R6 Blackout case - Corsair H110i GTX - SanDisk 1TB (OS) - SanDisk 2TB SSD (games) - Seagate EXOs 8 and 12 TB drives - Samsung UN46c7000 HD TV - Samsung UN55HU9000 UHD TV - Currently using ACER PASSIVE EDID override on 3D TVs - LG 55

#4
Posted 11/08/2013 08:19 PM   
Think about how much it all costs: you've already got one 680, so another one will be cheaper than replacing it. The 680 is quite cheap right now because of all the new cards; that doesn't mean it's worse. My 680 beats a lot of more expensive cards.

I have a 680 SLI setup overclocked really well, and I can easily go up against a 780 Ti, which is way more expensive even if I count the water cooling system on my cards.
One card is always better than two cards (for compatibility, ease of use, etc.). If you've already got one, though, it's up to your budget whether or not it's worth it to ditch it altogether.

i7-6700k @ 4.5GHz, 2x 970 GTX SLI, 16GB DDR4 @ 3000mhz, MSI Gaming M7, Samsung 950 Pro m.2 SSD 512GB, 2x 1TB RAID 1, 850w EVGA, Corsair RGB 90 keyboard

#6
Posted 11/08/2013 09:41 PM   
The answer of course is "that depends".

We don't know what the rest of your rig consists of, or whether you're trying to fix a current problem or simply have some cash burning a hole in your pocket and are expecting a future problem with a single 680 card.

+SLI: if you stay with popular AAA titles there's usually a decent SLI profile which will gain you almost 50% better performance. SLI seems to be a natural fit for 3D Vision, and I think others have mentioned having issues with single cards where those with SLI setups don't. You'd keep your current 680 card investment instead of taking a loss selling it. Having two cards running at 50-70% can mean less fan noise than a single card running at 100%, and gives you some leeway or cushion for scenes that might otherwise cause frame rate dips.

-SLI: Nvidia does not release a profile for every game. There are plenty of indie and smaller releases that don't get an official profile. Unless you're good at experimentation and create your own profiles, the extra performance won't be there for every game. Two cards take twice the +12V power, and as such are fairly demanding on power supplies and case cooling. The cards have to get the cold air from SOMEWHERE, and if you're not feeding enough cold air into your case, your cards could run hot. In some installs (depending on the cooling setup on the GTX itself), the top card will end up running 10-20F hotter than the bottom (#2) card.

The biggest issue may be the question of when/how to upgrade afterwards. 2x680 is going to outperform all the single 7xx/Titan cards and many of the low-end 8xx cards. If the next-generation Nvidia GPUs follow the performance curve of the last few generations, you may end up needing 2x880s to get 2x performance over the 2x680s, or waiting to get a single 980 two generations from now.

I went from 1x460 => 2x460 => 2x570 => 2x780, making my next upgrade 2x980 (or better).
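
The scaling point above can be put into rough numbers. This is a hypothetical back-of-envelope estimate (the function and factors are illustrative, not benchmarks): a second card only helps by whatever fraction the game's SLI profile actually scales.

```python
# Rough estimate of frame rate with a second card, given a per-title
# SLI scaling factor. The ~0.5 default reflects the "almost 50% better"
# figure mentioned above; all numbers are hypothetical, not measured.
def sli_fps(single_card_fps, scaling=0.5):
    """Estimated FPS with two cards at the given scaling factor (0..1)."""
    return single_card_fps * (1 + scaling)

print(sli_fps(60))        # typical profile, ~50% gain -> 90.0
print(sli_fps(60, 0.9))   # a well-scaling AAA title -> 114.0
print(sli_fps(60, 0.0))   # no SLI profile: second card sits idle -> 60.0
```

The last case is the "-SLI" scenario: without a profile (official or home-made), the extra money buys nothing in that game.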

i7-2600K-4.5Ghz/Corsair H100i/8GB/GTX780SC-SLI/Win7-64/1200W-PSU/Samsung 840-500GB SSD/Coolermaster-Tower/Benq 1080ST @ 100"

#7
Posted 11/09/2013 02:07 AM   
Also, since this is in the 3D Vision section: if anyone else is considering this, it's very clear that dual-680 SLI is the way to go. SLI is a natural fit for 3D Vision, and the performance of SLI 680 is far superior to even a single Titan.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#8
Posted 11/09/2013 02:29 AM   
Want to play Skyrim with 3D Vision and some cool ENB post-processing? Then it only works with SLI ;))

Like many have said, SLI is the way to go for 3D Vision. Personally, I think that any SLI setup for a mono (2D), single-monitor rig is overkill (unless you want to go Surround, or even better, 3D Surround).

So, first think about what you want to achieve and what you want to use it for. Based on that, you should get a fairly easy answer to your question ;))

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

#9
Posted 11/09/2013 02:38 AM   
If you ever want to play in 3D, go with SLI; you'll get close to a 100% performance boost. If you're playing in 2D, go look up benchmarks online and pick the best option. Keep in mind that if you get a 780, you can probably sell your 680 on eBay too.

#10
Posted 11/09/2013 06:12 AM   
First of all, thanks for your input; I really appreciate it.

Secondly, sorry for not being specific enough. The thing that led me to think about the upgrade was BF4, as I've noticed that my CPU usage (3930K@4.4, http://ark.intel.com/pl/products/63697/intel-core-i7-3930k-processor-12m-cache-up-to-3_80-ghz) is around 30-40% while my GPU (Gigabyte GTX 680 4GB, slightly OC'ed, http://www.gigabyte.us/products/product-page.aspx?pid=4373#ov) is running at 99%.

As for my other components, I'm using a DX79SI board (http://www.intel.com/content/www/us/en/motherboards/desktop-motherboards/desktop-board-dx79si.html) in a Zalman case (http://www.zalman.com/global/product/Product_Read.php?Idx=367), with a Scythe Ninja 3 Rev.B for CPU cooling, so yeah, I have a hovercraft :) I'm using a 3-monitor setup, but only one is capable of displaying 3D (a BenQ XL2420T), and I'm also using an Acer H5360.

Acer H5360 / BenQ XL2420T + 3D Vision 2 Kit - EVGA GTX 980TI 6GB - i7-3930K@4.0GHz - DX79SI- 16GB RAM@2133 - Win10x64 Home - HTC VIVE

#11
Posted 11/09/2013 09:19 AM   
So... are you running BF4 in Surround?

If you are running in Surround, you pretty much have to do SLI to get enough performance. SLI 780 would of course be better than SLI 680, but you'd get a dramatic boost with a second card either way.

Here's NVIDIA's comparative graph:

[img]http://international.download.nvidia.com/webassets/en_US/shared/images/products/shared/lineup.png[/img]

The tallest spike there is the GTX 690, which is essentially SLI 680. Naturally you could do better with SLI 770, or better, at a higher price.

For the purposes that you've described (some 3D gaming with monitor or Acer projector, BF4 on high settings, and/or Surround), there is no way that a single 780 would be the right choice. Not when you've already got a kick-ass 680 to start with.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#12
Posted 11/09/2013 12:39 PM   
Actually, I only play BF4 on my main monitor, as I favour frame rate over candy graphics (still don't want to play on low), so I'm getting like 70-120 fps with a mix of high/ultra at 1080p, but I would like it locked at 120 the whole time :)

Side note: what is the difference between somebody using G-Sync and the situation I have, locked 120fps on a 120Hz display? My understanding is that G-Sync will adjust the refresh rate to match the fps if I have some dips, but that it's no different from a situation where I have frames matching the refresh rate the whole time. Is that right?

The interesting thing is that I can't get a constant 120fps no matter what; even if I set the graphics to LOW and scale down the resolution (so BF4 looks like Tetris), I still get less than 100fps (GPU usage just goes down). I'm not sure what the deal is here.

Performance on the projector isn't an issue since it's 720p.

Acer H5360 / BenQ XL2420T + 3D Vision 2 Kit - EVGA GTX 980TI 6GB - i7-3930K@4.0GHz - DX79SI- 16GB RAM@2133 - Win10x64 Home - HTC VIVE

#13
Posted 11/09/2013 01:03 PM   
I recently pulled my 2nd 690 from my rig and dropped it into my other rig. I have better BF4 performance with just one 690 than with two, and I'm only down an average of maybe 5 to 10 frames with my 690 in single-GPU mode in BF4. In none of my combinations was I able to maintain 120 frames per second in BF4; I don't know anyone who can. While rendering and the GPU are important, there are other parts of BF4 that may limit the frame rate. Just saying.

Intel Core i9-9820x @ 3.30GHZ
32 gig Ram
2 EVGA RTX 2080 ti Gaming
3 X ASUS ROG SWIFT 27 144Hz G-SYNC Gaming 3D Monitor [PG278Q]
1 X ASUS VG278HE
Nvidia 3Dvision
Oculus Rift
HTC VIVE
Windows 10

#14
Posted 11/09/2013 01:55 PM   
tehace said:actually I only play bf4 with my main monitor as I do favour frame-rate over candy graphics (still dont want to play on low) so I'm getting like 70-120 with the mix of high/ultra at 1080p but I would like to have them locked at 120 the whole time :)

side note: what is the difference between a situation when somebody is using the G-Sync and one that I have locked 120fps on a 120Hz display? From my understandings is that the gsync will sync the refresh rate if I have some dips in fps to match fps with refresh rate but is no different from a situation where I have frames matching refresh rate the whole time, is that right?

interesting thing is that I cant get 120fps constant no matter what, even if I set the graphics on LOW and scale down the resolution (so bf4 looks like tetris) I still have less than 100fps (GPU usage is just going down). I'm not sure whats the deal is here.

Performance on the projector isnt an issue since its 720p
Good info. That fills in some gaps.

Based on your observations and msm903's, I'd say that in BF4 you are actually CPU-limited, not GPU-limited. Your experiment of turning everything to low at a lowered resolution is the usual definitive test for CPU vs. GPU.

When you see that the CPU is only running at 35-40%, that could easily be because the other cores are not active. Not sure if BF3 is comparable to BF4, but in BF3 the scaling with extra cores was OK, but not stunning. With 12 threads, I'd expect at least a third of them to be idle in game. You are pretty likely to be limited by how fast the first 3 or 4 threads can run.
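
To make the "35-40% but still CPU-bound" point concrete, here's some illustrative arithmetic (the function is hypothetical, and the busy-thread counts are assumptions, not measurements): on a 12-thread CPU, only 4-5 fully busy threads are needed to produce that overall figure.

```python
# Why low overall CPU% can still mean a CPU bottleneck: the overall
# figure averages over all hardware threads, so a game pegging only a
# few threads looks "idle" even though those threads are the limit.
def overall_cpu_percent(busy_threads, total_threads=12):
    """Overall CPU usage if busy_threads run at 100% and the rest idle."""
    return 100.0 * busy_threads / total_threads

print(overall_cpu_percent(4))  # 4 of 12 threads pegged -> ~33.3%
print(overall_cpu_percent(5))  # 5 of 12 threads pegged -> ~41.7%
```

That matches the 30-40% the OP reported, which is consistent with the game being limited by its first few threads on the 12-thread 3930K.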


To answer your question about GSync- as you expect, it only kicks in if you go lower than the 120Hz target. Anything faster runs the same as it does today.

It doesn't sound like you can achieve solidly over 120, and besides, there is nearly certain to be some firefights or situations where you dip below 120 fps, and when you do, you pay a full frame penalty without GSync. With GSync, it should be imperceptible.
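
The "full frame penalty" above can be sketched numerically. This is a simplified model with a hypothetical helper (double/triple buffering details ignored): with plain v-sync on a fixed 120Hz panel, a frame that misses the 8.33ms deadline waits for the next refresh, while with G-Sync the panel refreshes when the frame is ready.

```python
import math

REFRESH_MS = 1000.0 / 120.0  # one refresh interval at 120Hz, ~8.33ms

def displayed_frame_time(render_ms, gsync):
    """Time (ms) until a rendered frame actually appears on screen."""
    if gsync:
        # G-Sync: the panel waits for the frame, so render time = display time
        return render_ms
    # Fixed-refresh v-sync: snap up to the next refresh boundary
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

print(displayed_frame_time(9.0, gsync=True))   # 9.0ms -> ~111 fps
print(displayed_frame_time(9.0, gsync=False))  # snaps to ~16.67ms -> 60 fps
```

So a frame that misses 120Hz by less than a millisecond effectively halves the frame rate without G-Sync, which is why the dips below 120fps hurt so much more on a fixed-refresh display.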

Seems to me that you could really benefit from G-Sync; more GPU, maybe not as much. You could also add water cooling (a Kraken, for example) and try to bump your Sandy Bridge up to 5GHz; Sandy Bridge is generally seen to overclock pretty well. That would probably make a bigger difference than more GPU.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#15
Posted 11/09/2013 11:54 PM   