[quote="bo3b"]The only slight caveat is that with more cores and Hyperthreading it's hard to get much overclock. A lot of games are essentially single threaded, and you mentioned AC4, which suffers from this. This is why I went with a 4 core chip, so that I could overclock the primary core better. Not sure it was the right choice though, because overclocking Haswell is pretty limited no matter what.[/quote]
What overclock did you get? (your sig says you're running at stock)
[quote]For the 660 vs 750ti as a dedicated PhysX card- I honestly have no idea. Probably 750ti is overkill, but the only way to know would be testing, and I haven't seen any testing of that level of card for PhysX. 660 is a little more expensive than 750ti, and 750ti is more power efficient which might matter for 3 cards on your PSU. [/quote]
WhiteSkyMage, I don't know about the 660 or 750ti, but I know about the 650ti, which I'm assuming isn't too dissimilar from either of those cards (I'm pretty sure the 650ti was based on the 660 architecture, and I'm assuming on name alone that the 750ti is targeting a similar performance bracket).
I can tell you that a 650ti is not quite good enough if you really want an ideal build. In my tests, a 650 (NOT 650ti) was great in most PhysX games, but not quite good enough for Arkham Origins, which is probably the de facto benchmark at the moment, as it uses a bunch of new PhysX turbulence effects.
Since doing the testing shown on my blog, I upgraded the 650 to a 650ti, which closed the gap significantly in Arkham Origins, but still wasn't quite as good as just running the twin Titans.
I'm happy with the 650ti because it's a half-size card which greatly helps the heat/noise situation in my case. But if I were building a blue-sky future-proof beast like you are, I'd definitely get something a little stronger for PhysX, especially if Arkham Origins is any indication of where PhysX is going.
Alright well, looks like I need to go around some other forums and ask about the best choice for a dedicated PhysX card... The only drawback I have with the GTX 660 or 760 is the cost and the Kepler architecture. I really ain't gonna be happy seeing something "last gen" in the build... also I really wanna pay a small amount for the PhysX card, cuz really that's 3 cards in already.
Seeing that the GTX 750Ti gets me more performance than the GTX 650/Ti, and it's more efficient, meaning I won't need another radiator for it... I will be water-cooling those cards including the CPU, so... yeah, that's 6 rads already. As I know, the cooler the card is, the more aggressive GPU Boost 2.0 gets, so maybe I will not even need to OC it... I will see.
For that 6-core CPU, I know that most games are single threaded, but new games like BF4, which will show up in the next couple of years, will probably lean more on multi-threaded performance... so yeah, that's a bit of future-proofing...
Anyway I will see what happens and what other people say about those mid-range cards.
Based on my favorite graph:
[img]http://international.download.nvidia.com/webassets/en_US/shared/images/products/shared/lineup.png[/img]
That 750Ti is actually pretty damn slow. Slower than a GTX 570. A LOT slower than a 660. Don't go only by the naming, the marketing weasels try to cheat you at every turn. This is one reason I use the Borderlands benchmark instead: The price.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 - Latest 3Dmigoto Release - Bo3b's School for ShaderHackers
Well, judging by that graph, a 760 or 770 would be a pretty sweet choice. Not TOO expensive (at least not compared to the rest of your rig), but a handsome amount of power that avoids the diminishing returns of the highest cards, and should give a PhysX boost for years to come.
[quote="WhiteSkyMage"]As I know the cooler the card is, the more aggressive GPU Boost 2.0 gets, so maybe I will not need to OC it even...I will see[/quote]
I recommend investigating a modded VBIOS for your cards that gets rid of GPU Boost 2.0. Boost 2.0 is a decent power-saving feature, but it can get to be a real nuisance in games, creating unpredictable FPS and even stuttering, as the card's clocks are constantly jumping up and down. It's also downright infuriating watching your overclocked cards dip down to stock speeds or worse even though they're well below the temperature threshold, and your game isn't even getting 60fps!
[url="http://forum.techinferno.com/nvidia-video-cards/3454-nvidia-gtx-titan-modified-vbios-more-control-better-overclocking.html"]This guy[/url] has done a modded VBIOS for the Titan that gives you constant clocks, and since installing it onto my cards 6 months ago or so, I haven't looked back. I settled on a modest overclock that doesn't run too hot, and my cards never deviate from that except when idle. If I can help it, I will never buy another GPU Boost card again unless I know there's a modded VBIOS available.
He (or someone else) may have also made a modded VBIOS for the cards you're going to get.
Alright, looks like reviewers are not the best thing to watch all the time... they go in favour of the companies and advertising... I never thought GPU Boost was actually a negative... I will keep that in mind... I am anyway getting most components on BF, hopefully the GTX 760 will be cheaper by then. I would count that as the dedicated PhysX card... hopefully the 20nm cards will be coming in December...
[quote="WhiteSkyMage"]Alright, looks like reviewers are not the best thing to watch all the time...they go in favour of the companies and advertising...i never thought GPU boost is actually a negative...I will have that in mind... I am anyway getting most components on BF, hopefully then the GTX 760 will be cheaper. I would count that as a card for PhysX dedication...hopefully the 20nm cards will be coming in december... [/quote]
Also note that the GTX 760 is a Kepler part, GK104. Pretty much a 670 with GPU boost.
[url]http://www.anandtech.com/show/7103/nvidia-geforce-gtx-760-review[/url]
[quote="Volnaiskra"][quote="WhiteSkyMage"]...hopefully the 20nm cards will be coming in december... [/quote]Will it be that long?! I didn't realise.[/quote]
Yes, I was REALLY sad when I heard about that :( It's simply that TSMC is not ready to start full-scale 20nm production, and there is a rumor going around that there will be a delay, a big one. Because of it, they expect Maxwell cards to launch in Jan-Feb next year and, if we are lucky, the first cards around X-mas.
[quote="bo3b"]Also note that the GTX 760 is a Kepler part, GK104. Pretty much a 670 with GPU boost.
[url]http://www.anandtech.com/show/7103/nvidia-geforce-gtx-760-review[/url][/quote]
Yup, sweet! - a perfect match card! That's what I need - just between GTX 660Ti and GTX 670. Thanks a lot. That will hopefully be enough for a heavy PhysX game. :) I will find a way to get around that GPU boost and OC it at a reasonable clock.
[quote="bo3b"]For the Haswell-E CPU, the only reason to not go with the 8 core version would be cost. But it's a pretty big delta. Intel always bends you over for the absolute top chip.
I'd say it depends upon how much video editing you do. You will absolutely see a significant difference from 6 to 8 cores in video editing, compression, conversion. If you are getting paid for video work, there is no question it is worth the money. If you do it for fun, it's time, but may not matter enough to justify the cost.
Game-wise there is no drawback to 8 cores, they just mostly won't get used so it's a waste of money. 4 cores with i5 (what I run, no hyperthread), is probably not quite right for the future. We already have a couple of games, BF4, Rage, probably WatchDogs.
Seems to me the 6 core (12 thread with HyperThreading) is plenty for gaming for at least a couple of years.
The only slight caveat is that with more cores and Hyperthreading it's hard to get much overclock. A lot of games are essentially single threaded, and you mentioned AC4, which suffers from this. This is why I went with a 4 core chip, so that I could overclock the primary core better. Not sure it was the right choice though, because overclocking Haswell is pretty limited no matter what.
If I were buying new today, and video editing and re-encoding is on the agenda, I would go with your plan of 6 core Haswell-E.
....
....
....
Hope that helps.[/quote]
Oh god, this is bad - I managed to "tell the future" about the i7 5820K being 6-core with around the same specs as the 4960X, but it looks like I will not be able to go SLI and have one PhysX card with 28 PCIe lanes. Ok... I will see how that will work out...
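For what it's worth, the 28-lane budget can be sanity-checked with a few lines of Python. This is just arithmetic over hypothetical slot widths (x4/x8/x16, with the usual x8-per-card minimum for SLI); which splits a given motherboard actually offers depends on its BIOS and slot wiring, not on this sketch:

```python
# Sketch: which 3-card PCIe lane splits fit within the i7-5820K's 28 lanes?
# Assumes slots run at x4, x8, or x16, and that both SLI cards need >= x8.
from itertools import product

TOTAL_LANES = 28
WIDTHS = (4, 8, 16)

# (gpu1, gpu2, physx) combinations that fit the lane budget
fits = sorted(
    combo
    for combo in product(WIDTHS, repeat=3)
    if sum(combo) <= TOTAL_LANES and combo[0] >= 8 and combo[1] >= 8
)
for gpu1, gpu2, physx in fits:
    # e.g. SLI x8/x8 + PhysX x4 fits; x16/x16 SLI already exceeds 28 lanes
    print(f"SLI x{gpu1}/x{gpu2} + PhysX x{physx} = {gpu1 + gpu2 + physx} lanes")
```

So SLI plus a PhysX card is at least arithmetically possible on 28 lanes (e.g. x16/x8/x4 or x8/x8/x8), even though x16/x16 SLI plus a third card is not.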
Now I can tell there will be a GTX 980Ti on the 20nm architecture, whereas the GTX 980 will be only GM204 on 28nm. Nvidia is reported to be skipping the GTX 800 series to avoid confusion between mobile and desktop GPUs. Oh well, I don't mind.
Looks like playing around with anti-aliasing in 2D is fun... I will see how that goes in 3D with my new rig soon. Maxwell cards will probably be on the shelves before December :)
Do you guys have any experience with anti-aliasing...? I see that for every game there is a different AA that works best for its visuals.
TXAA is bluuuurry! And MSAA cuts FPS down in slices... FXAA works best for most games, as far as I can see...