Battlefield 3 3DVision Benchmarks
[quote name='chiz' date='07 December 2011 - 02:58 PM' timestamp='1323287920' post='1338977']
Because all that extra geometry needs extra calculations and draw calls to be issued, which places additional burden on the CPU before the GPU. If the CPU can't keep up, the GPU goes unutilized. As I mentioned earlier, and Grestorn touched on it too, the Frostbite 2 engine handles all stereo rendering with true dual-rendering that requires 2x draw calls. This puts ~2x the burden on the CPU while the overall CPU performance budget remains the same. We even see some of this going on in Skyrim if you face a certain direction outdoors. The problem is most games still do not maximize multiple CPU cores and have lightly threaded rendering engines, so the CPU quickly becomes a bottleneck.

I stumbled across the entire GDC presentation from DICE; it's worth a read and gives you an idea of how CPU-reliant BF3/Frostbite 2 is when it comes to maximizing GPU performance. Imo, BFBC2 was even worse in terms of CPU requirements and relative performance, but the good news is I think DICE has room for improvement. Slide 27 starts the discussion of BF3's rendering performance changes:

http://www.slideshare.net/DICEStudio/directx-11-rendering-in-battlefield-3

What we really need from DICE is to fully unlock DX11 performance with the kind of multi-threaded rendering performance gains we saw with Civ 5. There's an awesome read about DX11 and multi-threaded rendering and its impact on CPU/GPU performance on AnandTech: http://forums.anandtech.com/showpost.php?p=31520674&postcount=28

Basically, with stereo 3D, DICE has doubled the CPU requirements for rendering without doubling the CPU budget/performance. If they could, at the very least, double the number of rendering threads (one for each camera view), that would go a long way toward taking care of the low performance problem for some users with really powerful graphics cards. It wouldn't help me too much though, as I'm pretty much fully utilized on both GPUs.

Also, I wouldn't rule out VRAM being an issue in some cases, BF3 uses 1400MB for me with mostly Ultra settings, Shadows High, Ambient Occlusion Off and MSAA Off (Deferred AA). Really high VRAM use for a game with no AA enabled.
[/quote]
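The draw-call math chiz describes above is easy to put numbers on. A toy model (Python; the per-call cost and call count are made-up illustrative figures, not Frostbite measurements):

```python
# Toy model of why true dual-rendering doubles CPU-side submission work.
# Constants are invented for illustration, not taken from Frostbite 2.

DRAW_CALL_CPU_COST_US = 20  # assumed fixed CPU cost per draw call, in microseconds

def cpu_submit_time_us(draw_calls: int, stereo: bool) -> int:
    """CPU time spent issuing draw calls for one frame."""
    views = 2 if stereo else 1  # true dual-rendering: one full pass per eye
    return draw_calls * views * DRAW_CALL_CPU_COST_US

mono_us = cpu_submit_time_us(1500, stereo=False)
stereo_us = cpu_submit_time_us(1500, stereo=True)
print(stereo_us / mono_us)  # 2.0 -- same scene, twice the CPU submission cost
```

The point is that the cost scales with views regardless of how cheap an individual call is, which is why the CPU budget problem doesn't go away on faster GPUs.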


Those are very interesting reads (thanks) but neither is really relevant. The GDC slides talk about how instancing [i]is [/i]a CPU burden, but also how their engine allows for greater instancing without the extra CPU overhead normally associated with it. The second article only relates to GPU rendering/multithreading. The fact remains that since not a single CPU core (and it's not hyperthreaded) [b]nor[/b] the GPUs are even close to saturation, they cannot be the bottlenecks. As for VRAM, changing the texture quality from Ultra to Low and everything in between has literally zero effect on the framerate. That all points to a HUGE inefficiency somewhere, either in DICE's 3D rendering techniques or in Nvidia's drivers - at least as far as I can think of. If any of my CPU cores were near being maxed, even if one of the four were starting to hit 70%, I would completely agree, but all of the physical cores sit at about 50%, and the GPUs are only slightly higher at times when I'm getting 30fps. Can you think of another game which displays such behavior?

i7-6700k @ 4.5GHz, 2x 970 GTX SLI, 16GB DDR4 @ 3000mhz, MSI Gaming M7, Samsung 950 Pro m.2 SSD 512GB, 2x 1TB RAID 1, 850w EVGA, Corsair RGB 90 keyboard

Posted 12/08/2011 02:03 AM   
[quote name='Cheezeman' date='07 December 2011 - 09:03 PM' timestamp='1323309812' post='1339222']
Those are very interesting reads (thanks) but neither is really relevant. The GDC slides talk about how instancing [i]is [/i]a CPU burden but how their engine allows for greater instancing without extra CPU overhead associated with that extra instancing. The second article only relates to GPU rendering/multithreading.[/quote]
The GDC slide was more to illustrate how many draw calls Frostbite 2 was still making, and that the extra instancing wasn't actually a CPU performance saver but a performance enabler: it allowed them to make more draw calls (more eye candy) within the same general CPU/GPU budget (as Frostbite 1/BFBC2). With stereo 3D they're once again increasing the number of draw calls without accounting for the increased rendering load. I guess an easy test would be to compare CPU load in 2D vs. 3D. While I wouldn't expect a 2x increase in CPU usage, I would at least expect 3D utilization to rise by roughly the cost of the 2D rendering thread, but from what I've seen it's pretty much the same CPU utilization in 2D and 3D.

[quote]The fact remains that since not a single CPU core (and it's not hyperthreaded) [b]nor[/b] the GPUs are even close to becoming saturated that they are not bottlenecks. As far as VRAM, changing the texture quality from Ultra to Low and everything in between has literally zero affect on the framerate. That all points to a HUGE inefficiency somewhere, either in DICE's 3D rendering techniques or nvidia's drivers - at least as far as I can think of. If any of my CPU cores were near being maxed, even if one of the four were starting to hit 70% I would completely agree, but all of the physical cores are at about 50% and only slightly better for the GPUs at times when I'm getting 30fps. Can you think of another game which displays such behavior?
[/quote]
A single thread is by no means guaranteed to completely utilize a single CPU core, so that's hardly evidence there's no bottleneck. Now more than ever, games and engines depend on multiple helper threads to supplement the main rendering thread, which creates dependencies between threads. Then you have the rendering threads themselves, which can become so bloated that they end up choking on themselves. Faster CPUs can help to a degree, but only higher clocks on a single core help, and that's clearly a brute-force approach to the problem. Going parallel over multiple CPU cores is clearly the answer for the kind of performance increases we'll need to keep up with games and multi-GPU set-ups.
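Going parallel per view could look something like the sketch below: record each eye's command list on its own thread, the way DX11 deferred contexts allow. All names here are invented stand-ins, and real gains would depend on the engine and driver actually overlapping submission work:

```python
# Hypothetical sketch of per-eye parallel command-list building.
# build_command_list() stands in for recording draw calls on a DX11
# deferred context; the real API and data structures would differ.
from concurrent.futures import ThreadPoolExecutor

def build_command_list(eye: str, draw_calls: int) -> tuple:
    """Pretend to record one eye's draw calls; returns a summary tuple."""
    return (eye, draw_calls)

def render_stereo_frame(draw_calls: int) -> list:
    # One worker thread per camera view, so both eyes build concurrently
    with ThreadPoolExecutor(max_workers=2) as pool:
        jobs = [pool.submit(build_command_list, eye, draw_calls)
                for eye in ("left", "right")]
        return [job.result() for job in jobs]

print(render_stereo_frame(1500))  # [('left', 1500), ('right', 1500)]
```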

But to illustrate, here's a really nice comparison from Tom's Hardware that shows CPU utilization across cores on a four-core i5 @ 3-4GHz with 1-4 cores enabled. You can see that with 1 core it's always pegged at 100%, with all the threads using all available CPU time. With 2 cores, usage on one core is generally still very high with the main rendering thread, but not always 100%. With 3 and 4 cores, the main rendering thread is rarely 100% utilized (more in the 60-70% range) and the helper threads barely use any CPU time at all on the other 2-3 cores. Even with a modest GTX 460 there's evidence of CPU bottlenecking (lower GPU utilization) in games that use a single-threaded rendering engine, so imagine what happens with faster set-ups or SLI.

http://www.tomshardware.com/reviews/game-performance-bottleneck,2738-5.html
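To put numbers on why modest overall CPU readings can hide a bottleneck: one fully pegged core on a quad-core shows up as only 25% overall, and a near-pegged render thread plus light helpers lands right in the ~35-50% range being reported. A quick illustration (Python; the load figures are invented):

```python
# Why ~50% overall CPU use doesn't rule out a CPU bottleneck:
# task-manager-style "overall" utilization averages across cores,
# so a single saturated render thread is diluted by idle cores.

def overall_cpu_percent(per_core_loads: list) -> float:
    """Overall utilization as the mean of per-core loads (in percent)."""
    return sum(per_core_loads) / len(per_core_loads)

# Render thread pegged on core 0, light helper threads elsewhere
# (illustrative numbers, not measurements):
print(overall_cpu_percent([100, 20, 10, 10]))  # 35.0 -- still render-bound
```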

As for the VRAM/texture changes, you need to completely close out the client for some setting changes like texture/shader/FOV changes to take hold. Some settings like shadows adjust on the fly, and some require you to change map and reload once before taking hold, but it's just safer to close things out completely and restart the client. I've adjusted down to mostly Medium and I can get 60FPS in the part of Operation Swordbreaker that's ~40 on Ultra, so it should work, and texture quality definitely has an impact on VRAM use from what I've seen.

As for the CPU usage thingy again, like I said, most any MMO exhibits that kind of behavior: really low FPS and modest CPU usage in some areas like big cities, or when facing a certain direction with extended draw distances. Another example of a game a lot of us are playing through right now is Skyrim. Apparently it's only dual-core optimized and very lightly threaded, so it really struggles to keep even mid-range GPUs utilized outdoors or when facing certain extended draw directions.

One last thing you can try to satisfy your own curiosity. Since we know 2D performance is fine (performance and GPU utilization are as expected), you should try the registry hack method of using Nvidia's stereo drivers to see if they improve your FPS in 3D. If FB2's stereo rendering method is indeed the bottleneck, you should see immediate improvement when Nvidia's stereo service is handling the stereo rendering. Not a long-term solution ofc, but it should certainly help give you some answers.

-=HeliX=- Mod 3DV Game Fixes
My 3D Vision Games List Ratings

Intel Core i7 5930K @4.5GHz | Gigabyte X99 Gaming 5 | Win10 x64 Pro | Corsair H105
Nvidia GeForce Titan X SLI Hybrid | ROG Swift PG278Q 144Hz + 3D Vision/G-Sync | 32GB Adata DDR4 2666
Intel Samsung 950Pro SSD | Samsung EVO 4x1 RAID 0 |
Yamaha VX-677 A/V Receiver | Polk Audio RM6880 7.1 | LG Blu-Ray
Auzen X-Fi HT HD | Logitech G710/G502/G27 | Corsair Air 540 | EVGA P2-1200W

Posted 12/08/2011 04:25 AM   
It's still no good for me. I still get the red "attempt to..." message despite ensuring it's running at 720p resolution, and if I ignore the message the game is very laggy, which makes it unplayable. I am running a GTX 580 and an i7 Extreme processor, so there should be no reason for this.

Posted 12/08/2011 02:49 PM   
If people are having performance issues, maybe try using this utility. I found it on the BF3 UK forums. Apparently, bf3.exe is a 32-bit .exe and this utility makes it a 64-bit .exe.

http://www.mediafire.com/?mq0rmcib2wi4afw

I run at roughly 30fps in 3D at 1080p on a single gtx460 (OC) and a Phenom x6 1090T processor.

All "AA" (post and otherwise) off, "HBAO" on, and all settings at medium except for "effects" and "textures" which are at ultra.

Posted 12/08/2011 05:09 PM   
[quote name='GordyMeow' date='06 December 2011 - 04:11 AM' timestamp='1323166295' post='1338315']
Hey... I think I figured out how to adjust convergence...

in the console, type the command "list" and there should be a list of commands that pop up, one of which being something like:

"RenderDevice.StereoConvergenceScale"

It's dynamic as well!

They did it.

Thanks DICE / NVidia!

EDIT: I set the value to about 3.75, to give you some idea of where to start, maybe.
[/quote]

Hi

You guys beat me to it, I was going to post on these this week!

Posted 12/08/2011 09:01 PM   
[quote name='chiz' date='06 December 2011 - 11:33 AM' timestamp='1323192792' post='1338465']


Thanks again to DICE and Nvidia (and Andrew) for sticking with it and getting 3D right in BF3...finally.
[/quote]

Thanks to DICE! We gave them the community feedback and they took it to heart. Thanks for posting your guide too. I was out of the office earlier this week and was going to document it for everyone, but looks like you did that.

Posted 12/08/2011 09:02 PM   
Quality posting Chiz. :thumbup: Here's what I just tested, which I think covers the majority of the situations you discussed. I found a scene in single player in which I was getting a constant 40fps exactly (which seems to be a common framerate that I get pinned at in many different situations, although I see the whole gamut from 20 to 60fps). Just sitting there not moving, a perfect 40fps rock solid the whole time. This scene also easily fit within my VRAM (990MB used of 1280MB). So to test CPU bottlenecking I decreased my CPU from 4.6 to 3.3GHz (because I'm not going to OC it any more :unsure: ). If CPU bottlenecking were a problem we should expect to see the FPS drop by a fair margin. But it was still a rock solid 40fps. The CPU usage in task manager did go up accordingly, but the FPS was still a rock solid 40fps. I note that bf3.exe is running forty-four threads.

Then to test GPU bottlenecking I reverted to 4.6GHz on the CPU (to further remove it as a potential bottleneck) and then OC'ed my 570s from their base 723/1900 core/mem to 810/2050. Again the FPS stayed exactly the same at a solid 40fps. Also remember this scene easily fits within the VRAM. If GPU bottlenecking were a problem, the framerate should have budged at least 2-3fps with the OC, but it was still a rock solid 40fps.

Finally to test 2D vs 3D I let the scene sit there for a few minutes to get a nice clean line on the graphs and then switched to 2D and let it run for a minute as well. The framerate exactly doubled to a rock solid 80fps and as you can see in the below screen capture the resource utilization is the same between 2D and 3D (the spike in between was an alt-tab, just ignore the circled part and look at how similar the resource utilization is between 2D and 3D):
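A quick check that those numbers fit a "fixed per-frame budget" model: if each displayed frame must be rendered once per eye out of the same budget, frame rate divides exactly by the number of views. The 12.5ms per-view cost below is just what the reported 80fps 2D figure implies:

```python
# Fixed-budget model: one frame requires `views` rendered views,
# each costing the same fixed amount of the per-frame budget.

def fps(per_view_cost_ms: float, views: int) -> float:
    """Frames per second when one displayed frame needs `views` views."""
    return 1000.0 / (per_view_cost_ms * views)

print(fps(12.5, views=1))  # 80.0 -- the 2D figure
print(fps(12.5, views=2))  # 40.0 -- exactly the halving observed in 3D
```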

[IMG]http://i61.photobucket.com/albums/h79/pts121/Temp/bf3_utilization.png[/IMG]



The final conclusion I'm still drawing from this is that there is a fundamental limit to FB2's rendering capabilities (not just in 3D as I thought before) - and this, to me at least, is a problem: it is [b]impossible [/b]to run this game at its own max settings at a solid 60fps (and it sometimes drops as low as the high teens) [i]when hardware is not a limiting factor[/i] (although I will say it is impressive that rendering in 3D takes [i][u]exactly[/u] half[/i] of the total rendering budget). To me, that's like being told you can buy a Ferrari but never drive it faster than 40mph. It's almost false advertising.

Just to be sure, is there [b]anyone[/b] that [i]can[/i] play BF3 at Ultra settings and stay locked at a solid 60fps?

Also, thank you Andrew for voicing our feedback and helping to get convergence unlocked (and to DICE for unlocking it). It made a huge difference in the game. :thumbup:


Posted 12/09/2011 03:18 AM   
[quote name='GordyMeow' date='08 December 2011 - 12:09 PM' timestamp='1323364148' post='1339482']
If people are having performance issues, maybe try using this utility. I found it on the BF3 UK forums. Apparently, bf3.exe is a 32 bit .exe and this utility makes it a 64 bit.exe.

http://www.mediafire.com/?mq0rmcib2wi4afw

I run at roughly 30fps in 3D at 1080p on a single gtx460 (OC) and a Phenom x6 1090T processor.

All "AA" (post and otherwise) off, "HBAO" on, and all settings at medium except for "effects" and "textures" which are at ultra.
[/quote]

Could I trouble you for a link to the thread with more details on this?


Posted 12/09/2011 03:23 AM   
http://forums.electronicarts.co.uk/battlefield-3-pc/1458791-4gb-patch-tool-x64-system.html

Could be a placebo effect, but the thread seemed inconclusive to me.

Posted 12/09/2011 03:29 AM   
[quote name='GordyMeow' date='08 December 2011 - 10:29 PM' timestamp='1323401395' post='1339761']
http://forums.electronicarts.co.uk/battlefield-3-pc/1458791-4gb-patch-tool-x64-system.html

Could be a placebo effect, but the thread seemed inconclusive to me.
[/quote]

Thanks. Placebo effect though. ;) That patch makes an .exe LAA (Large Address Aware, i.e. able to use more than 2GB of RAM), but the game is already LAA by default.
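For the curious: the 4GB patch just flips the IMAGE_FILE_LARGE_ADDRESS_AWARE bit in the PE header, so you can verify for yourself that bf3.exe already has it set. This follows the real PE layout, but treat it as a quick diagnostic sketch rather than a hardened parser:

```python
# Check whether a Windows .exe has the Large Address Aware flag set.
# Layout: DOS header "MZ", e_lfanew at offset 0x3C points to "PE\0\0",
# and IMAGE_FILE_HEADER.Characteristics sits 22 bytes past that signature.
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020

def is_large_address_aware(path: str) -> bool:
    with open(path, "rb") as f:
        data = f.read(4096)  # headers fit comfortably in the first 4KB
    assert data[:2] == b"MZ", "not a DOS/PE executable"
    pe_offset = struct.unpack_from("<I", data, 0x3C)[0]  # e_lfanew
    assert data[pe_offset:pe_offset + 4] == b"PE\0\0", "missing PE signature"
    characteristics = struct.unpack_from("<H", data, pe_offset + 22)[0]
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)
```

Run it against bf3.exe and it should report True even before patching, which is why the tool is a placebo here.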


Posted 12/09/2011 05:09 AM   
Excellent posts chiz and cheese.

If CPU and GPU are not bottlenecks for this part of the game, perhaps main memory bandwidth is?

Posted 12/09/2011 01:17 PM   
[quote name='andrewf@nvidia' date='08 December 2011 - 04:02 PM' timestamp='1323378163' post='1339602']
Thanks to DICE! We gave them the community feedback and they took it to heart. Thanks for posting your guide too. I was out of the office earlier this week and was going to document it for everyone, but looks like you did that.
[/quote]
Np, happy to share my findings since I've benefitted from many similar tips/tweaks posted by you and others here. :)

Thanks for passing the info along to DICE so they could fix this, but as is often the case, it seems the applause and reception for the fix is far less than the ruckus stirred up when it was broken. :ermm:

Also, Andrew, can you ask DICE or your own engineers to investigate the performance issues some people with VERY high-end systems are experiencing in 3D? Basically, on high-end SLI rigs (2x GF110-based and greater), people are getting lower than expected performance in S3D: low FPS (below the 60 cap) and low GPU utilization. Switching to 2D often more than doubles their FPS at near-100% utilization.

The problems remind me a lot of the SLI performance issues with Avatar the Game. I recently revisited that game and found GPU utilization was fixed, but I'm not sure whether the fix came from Nvidia or Ubisoft. Do you remember what you guys did to fix performance in Avatar? Maybe an SLI profile update for games that used quad-buffer stereo? BF3 may need a similar fix if that's the case.

-=HeliX=- Mod 3DV Game Fixes
My 3D Vision Games List Ratings

Intel Core i7 5930K @4.5GHz | Gigabyte X99 Gaming 5 | Win10 x64 Pro | Corsair H105
Nvidia GeForce Titan X SLI Hybrid | ROG Swift PG278Q 144Hz + 3D Vision/G-Sync | 32GB Adata DDR4 2666
Intel Samsung 950Pro SSD | Samsung EVO 4x1 RAID 0 |
Yamaha VX-677 A/V Receiver | Polk Audio RM6880 7.1 | LG Blu-Ray
Auzen X-Fi HT HD | Logitech G710/G502/G27 | Corsair Air 540 | EVGA P2-1200W

Posted 12/09/2011 04:57 PM   
[quote name='Cheezeman' date='08 December 2011 - 10:18 PM' timestamp='1323400725' post='1339756']
Quality posting Chiz. :thumbup: Here's what I just tested, which I think covers the majority of the situations you discussed. I found a scene in single player in which I was getting a constant 40fps exactly (which seems to be a common framerate that I get pinned at in many different situations, although I see the whole gamut from 20 to 60fps). Just sitting there not moving, perfect 40fps, rock solid all the time. This scene also easily fit within my VRAM (990MB used of 1280MB). So to test CPU bottlenecking I decreased my CPU from 4.6 to 3.3GHz (because I'm not going to OC it any more :unsure: ). If CPU bottlenecking were a problem, we should expect to see the FPS drop by a fair margin. But it was still a rock-solid 40fps. The CPU usage in task manager did go up accordingly, but the FPS was still a rock-solid 40fps. I note that bf3.exe is running forty-four threads.

Then to test GPU bottlenecking I reverted to 4.6GHz on the CPU (to further remove it as a potential bottleneck) and then OC'ed my 570s from their base 723/1900 core/mem to 810/2050. Again the FPS stayed exactly the same at a solid 40fps. Also remember this scene easily fits within VRAM. If GPU bottlenecking were a problem, the framerate should have budged at least 2-3fps with the OC, but it was still a rock-solid 40fps.

Finally to test 2D vs 3D I let the scene sit there for a few minutes to get a nice clean line on the graphs and then switched to 2D and let it run for a minute as well. The framerate exactly doubled to a rock solid 80fps and as you can see in the below screen capture the resource utilization is the same between 2D and 3D (the spike in between was an alt-tab, just ignore the circled part and look at how similar the resource utilization is between 2D and 3D):


Final conclusion that I'm still drawing from this is that there is a fundamental limit to FB2's rendering capabilities (not just in 3D as I thought before) - and this, to me at least, is a problem -- that it is [b]impossible [/b]to run this game at its own max settings at a solid 60fps (and sometimes forced as low as the high-teens) [i]when hardware is not a limiting factor[/i] (although I will say that it is impressive that rendering in 3D takes [i][u]exactly[/u] half[/i] of the total rendering budget). To me, that's like saying you can buy a Ferrari but you can never drive it faster than 40mph. It's almost false advertising.

Just to be sure, is there [b]anyone [/b]that [i]can [/i] play BF3 at Ultra settings and be locked at a solid 60fps?

Also, thank you Andrew for voicing our feedback and helping to get convergence unlocked (and to DICE for unlocking it). It made a huge difference in the game. :thumbup:
[/quote]
Very interesting findings and testing Cheezeman, thanks for taking the time to drill down a bit further. Definitely starting to narrow down the performance issues I think.
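By the way, the exact 40-to-80 doubling in your 2D/3D test squares with simple frame-time arithmetic. A quick sanity check (the numbers are just your observed framerates, not anything from the engine):

```python
# If each eye costs the same to render, stereo 3D should exactly
# halve the delivered framerate relative to 2D.

def frame_ms(fps):
    """Frame time in milliseconds at a given framerate."""
    return 1000.0 / fps

fps_3d = 40.0                          # observed in S3D
fps_2d = 80.0                          # observed after switching to 2D

ms_per_stereo_pair = frame_ms(fps_3d)  # 25.0 ms to render both eyes
ms_per_eye = ms_per_stereo_pair / 2    # 12.5 ms per eye
ms_per_2d_frame = frame_ms(fps_2d)     # 12.5 ms per 2D frame

# Per-image render cost is identical in both modes, so whatever the
# bottleneck is, it scales with the number of views rendered.
assert ms_per_eye == ms_per_2d_frame
```

In other words, whatever is holding the engine to 12.5ms per image does so identically in 2D and 3D, which fits your observation that utilization barely changes between modes.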

I too noticed the recurrence of certain frame intervals BF3 tends to "settle" at: 30, 40, 60, etc., similar to Vsync but not quite the same intervals. I'm starting to think there's some kind of internal frame-rate limiter/smoother at play that caps frame rates dynamically based on scene load, and I think this again comes down to dependencies and time limits set for certain stages of the rendering process. So for example (the numbers are made up), if FB2 can finish rendering geometry in 1.2ms or less, the frame target is set to 60FPS; if it can't finish in 1.2ms but can in 1.5ms, the target drops to 40FPS, and so on. I'm not a developer, but I've read enough of their blogs and tech interviews to know performance for them comes down to milliseconds: how fast a pass/process can be completed and combined with other passes/processes while still maintaining playable framerates. Keep in mind, 16.67ms is the total time between frames at 60FPS, so all those milliseconds count!
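That hypothetical limiter would look something like this. To be clear, this is purely a sketch of the idea: the thresholds are invented for illustration, not anything published by DICE:

```python
# Illustrative only: pick a dynamic frame-rate cap from how quickly
# a rendering stage completes. Thresholds are made-up numbers.

def frame_target(geometry_ms):
    """Return a hypothetical FPS cap based on a stage's render time."""
    if geometry_ms <= 1.2:
        return 60
    elif geometry_ms <= 1.5:
        return 40
    return 30

print(frame_target(1.0))  # 60
print(frame_target(1.4))  # 40
print(frame_target(2.0))  # 30
```

A scheme like this would explain why framerates sit rock solid at one value and then jump between plateaus as scene load changes, rather than varying smoothly.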

Or it could be a much simpler problem: they may be basing framerates/intervals off the 120Hz refresh rate while only using double buffers. I guess quad-buffer stereo is, by definition, just two double-buffered images, one per eye. I will try it later, though I doubt it's as easy as turning on triple buffering in the command console to claw back some of those incremental FPS between 40 and 60 (I know you mentioned you used D3DOverrider too). 40FPS isn't a Vsync interval at 60Hz, but it is if you're on a 120Hz tick like you would be in 2D. Also, as a bit of an aside, I can get 60FPS, but only in certain non-demanding situations, mainly indoors.
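For reference, the Vsync ladder is just the refresh rate divided by successive whole numbers of refresh intervals, which is exactly why 40FPS shows up at 120Hz but not at 60Hz:

```python
# Under Vsync, a finished frame is held for a whole number of refresh
# intervals, so the reachable framerates are refresh_hz / n.

def vsync_steps(refresh_hz, steps=6):
    """First few framerates reachable under Vsync at a given refresh rate."""
    return [refresh_hz // n for n in range(1, steps + 1)]

print(vsync_steps(120))  # [120, 60, 40, 30, 24, 20] -- includes 40
print(vsync_steps(60))   # [60, 30, 20, 15, 12, 10]  -- no 40
```

The 120Hz ladder contains every one of the plateaus we keep seeing (20, 30, 40, 60), which supports the idea that the game is syncing against the 120Hz tick even in stereo.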

But all of this really reminds me of the same problems we saw with Avatar: The Game when it was released. The problems there were much more obvious, though, as the game didn't make good use of SLI at all: generally only 50% load on both GPUs but still low FPS that hit the same kinds of intervals depending on which direction you faced... 20, 30, 40, 60. I went back recently and tested: DX9 was much better, nearly capped at 60FPS with 100% utilization, while DX10 had 100% utilization but still poor framerates. It makes me wonder if it's a quad-buffer-stereo issue with SLI; maybe a simple profile change could fix performance here. Hopefully Andrew can chime in on what they did to fix Avatar's performance with SLI.

Posted 12/09/2011 05:24 PM   
[quote name='baragon' date='09 December 2011 - 08:17 AM' timestamp='1323436678' post='1339964']
Excellent posts chiz and cheese.

If CPU and GPU are not bottlenecks for this part of game, perhaps main memory bandwidth is?
[/quote]

Good idea, I should have tested that for thoroughness before. I just now tried reducing my DDR3 speed from 1600MHz to 1033MHz but still got the same rock-solid 40fps.


[quote name='chiz' date='09 December 2011 - 11:57 AM' timestamp='1323449837' post='1340036']
Also Andrew can you ask DICE or your own engineers to investigate the performance issues some people with VERY high-end systems are experiencing in 3D? Basically, on high-end SLI rigs (2xGF110-based rigs and greater), people are getting lower than expected performance while in S3D, so low FPS (below 60 cap) and low GPU utilization.

The problems remind me a lot of SLI performance issues with Avatar the Game. I recently revisited the game to see GPU utilization was fixed, but I'm not sure if that fix was by Nvidia or Ubisoft. Do you remember what you guys did to fix performance in Avatar? Maybe an SLI profile update for games that used quad buffer stereo? BF3 may need a similar fix if that's the case.
[/quote]

[b]This would be greatly appreciated, Andrew!![/b] I upgraded from 2x GTX 460s and bought brand-new 2x GTX 570s just to get this game to play at a solid 60fps... only to have zero difference made by $700 in NVIDIA hardware. :confused: (And then a new CPU and motherboard after that!)


[quote name='chiz' date='09 December 2011 - 12:24 PM' timestamp='1323451470' post='1340047']

I too noticed the recurrence of certain frame intervals BF3 tends to "settle" at, 30, 40, 60 etc, similar to Vsync but not quite the same intervals. I'm starting to think there's some kind of internal frame rate limiter/smoother at play here that caps frame rates dynamically based on scene load, but I think this again goes to dependencies and time limits set for certain stages of the rendering process. So for example, if FB2 can finish rendering geometry in 1.2ms or less the frame target is set for 60FPS. If FB2 can't finish in 1.2ms but can finish in 1.5ms, then the frame target is set for 40FPS and so on. I'm not a developer but I've read enough of their blogs and tech interviews to know performance for them comes down to ms and how fast a pass/process can be accomplished and combined with other passes/processes and still maintain playable framerates. Keep in mind, 16.67ms is the amount of time between frames to maintain 60FPS, so all those ms count![/quote]

The 'settling' at certain framerates sure does seem like a telltale sign, and I see it often (most often at 40fps). Although I understand the principles behind the timings of the various threads/rendering processes and how they must complete in X amount of time, I must admit I haven't delved into it too deeply. However, wouldn't that still be affected by big changes in hardware speeds?

[quote]Or it could be a much simpler problem, that they are basing framerates/intervals off 120Hz refresh rate and only using double buffers. I guess quad buffer stereo by definition is just two double-buffered stereo images. I will try this later, but I don't think it could be as easy as trying to turn on triple buffering in the command console to get some of those incremental FPS back between 40 and 60 (I know you mentioned you used d3doverrider too). 40FPS isn't an increment of Vsync at 60Hz, but it is if you are using a 120Hz tick like you would be in 2D. Also as a bit of an aside, I can get 60FPS but only in certain non-demanding situations, mainly indoors.[/quote]

I'll also admit that the buffering stuff is just way beyond me. Would it matter that I'm seeing odd-numbered FPS, though? I too can get 60fps for short periods in certain situations (mainly indoors as well), and when that's the case the resource utilization seems like it usually goes [b][u]up[/u][/b], which again would seem to point to some holdup within the FB2 engine or drivers (i.e. whatever the bottleneck is, it gets removed, allowing the engine to run more efficiently and utilize more resources). In every other game I've seen, when the framerate limit is hit, resource utilization goes down.


[quote]But all of this really reminds me of the same problems we saw with Avatar the Game when it was released. The problems there were much more obvious though as the game didn't make good use of SLI at all, generally only 50% load on both GPU but still low FPS that hit the same kinds of intervals depending on which direction you faced...20,30,40,60. I went back recently and tested, DX9 was much better with nearly 60FPS capped and 100% utilization, DX10 had 100% utilization but still poor framerates. It makes me wonder if its a quad buffer stereo issue with SLI, maybe a simple profile change could fix performance here. Hopefully Andrew can chime in on what they did to fix Avatar performance with SLI.
[/quote]

QFT

i7-6700k @ 4.5GHz, 2x 970 GTX SLI, 16GB DDR4 @ 3000mhz, MSI Gaming M7, Samsung 950 Pro m.2 SSD 512GB, 2x 1TB RAID 1, 850w EVGA, Corsair RGB 90 keyboard

Posted 12/09/2011 08:56 PM   
Hope it will be sorted for Christmas break


Thanks all for posting re: convergence, it does make a nice difference


...right, now all we need is decent running performance and we're there!

Currently running at:

In-game resolution 3850x720, every in-game setting set to lowest (off where possible, otherwise low) except shadows: medium, AF x16, 3D mode (always multiplayer)
(In the Nvidia control panel, the profile is set to performance)

My system spec:

i7 2600 @ 4.6GHz, watercooled
2x GTX 580 @ 940MHz core, watercooled
16GB RAM (set at 1600MHz)
Gigabyte Z68 UD7 motherboard (2x x16 for GPUs)
Dual PSU



I'm getting a near-solid 60fps (60 per eye in 3D); it does drop to 49 sometimes, but mainly smooth gameplay (I only play multiplayer)

Really need to raise the resolution ASAP, as I can only play a short while (I'm very careful with my eyes); 4800x900 would be preferred


Posted 12/10/2011 01:44 AM   