[quote name='LM_3D' date='21 April 2012 - 06:51 PM' timestamp='1335034308' post='1399187']
Hi there,
I got new PC components especially for Battlefield 3:
- 2600K processor, overclocked to 4.4GHz
- ASRock Z68 Extreme4 Gen3 motherboard
- G.Skill Sniper 8GB DDR3 1600MHz (2x4GB) CL9 1.5v
- ASUS 27" LED 120Hz monitor + NVIDIA 3D Vision 2
plus some other nice parts such as an NZXT 850W Bronze PSU, an SSD, and a proper gaming mouse and keyboard.
My nightmare now is the graphics card choice; I'm wondering what I should get!
After searching the web, I found that two GTX 570s in SLI would do fine for 3D and Battlefield 3, but I'm tempted to wait for the GTX 670 Ti and get two of those monsters. My question is: are there really that many problems with Battlefield 3 and SLI? I've read a lot of threads where people running GTX cards in SLI hit freezing, crashes and, above all, stuttering in Battlefield 3.
Thanks in advance
[/quote]
I have a similar rig (see my sig) with two GTX 570s in SLI and I have no problems whatsoever, except that sometimes my framerate drops from 60 to 40fps (I play in 3D Vision), but that's a game-engine cap in some parts of the big maps. You just have to create a .cfg file with your preferred 3D command settings.
Regarding which cards you should buy, I would get two GTX 570s: they overclock to 580 clocks, and in my country they cost half of a 680 (260€ vs 500€). The gain in performance doesn't justify the price difference, in my opinion.
Asus P8Z68-V Pro; i5 2500k@4.3Ghz; Inno3d iChill X3 gtx 1070 ; 8GB DDR3 1600Mhz; Asus Xonar DX; Vertex 3 120Gb SSD + 1TB HDD; Corsair Tx750w; CoolerMaster Storm Scout Case ; Benq W700 720p 3D Vision Projector & 88 inch Screen, Sony 5.1 Home Cinema. HTC Vive
[quote name='Badelhas' date='21 April 2012 - 11:28 PM' timestamp='1335050912' post='1399250']
I have a similar rig (see my sig) with two GTX 570s in SLI and I have no problems whatsoever, except that sometimes my framerate drops from 60 to 40fps (I play in 3D Vision), but that's a game-engine cap in some parts of the big maps. [color="#000080"][u][b]You just have to create a .cfg file with your preferred 3D command settings.[/b][/u][/color]
Regarding which cards you should buy, I would get two GTX 570s: they overclock to 580 clocks, and in my country they cost half of a 680 (260€ vs 500€). The gain in performance doesn't justify the price difference, in my opinion.
[/quote]
Thanks a lot man, that's enough for me; now I feel comfortable going out and buying two GTX 670s next month.
By the way, is there a thread here about creating those [color="#000080"].cfg[/color] files, how to do it and where to put them?
[quote name='LM_3D' date='22 April 2012 - 10:58 AM' timestamp='1335092288' post='1399372']
Thanks a lot man, that's enough for me; now I feel comfortable going out and buying two GTX 670s next month.
By the way, is there a thread here about creating those [color="#000080"].cfg[/color] files, how to do it and where to put them?
[/quote]
See page 28 of this thread. The user chiz2 was even kind enough to attach a .cfg file, so you just have to change the settings to your preferred ones in a text editor and drop it in the Battlefield 3 folder: https://skydrive.live.com/redir.aspx?cid=1a21782f49102306&resid=1A21782F49102306!635&parid=1A21782F49102306!106&authkey=!ALjghwiWZ8Y_8MU
Mine are:
renderdevice.stereoseparationscale 1.38
renderdevice.stereoconvergencescale 2.2
renderdevice.stereosoldierzoomconvergencescale 0
render.drawfps 1
The last command permanently shows the framerate you're getting in the top right corner of the screen.
Cheers,
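For anyone who would rather build the file by hand than download chiz2's attachment, here is a minimal sketch using only the commands quoted above. The user.cfg filename and placing it next to the game executable in the Battlefield 3 install folder are my assumptions about the usual Frostbite setup rather than something confirmed in this thread, and I'm not certain the format accepts comments, so the file is kept to plain commands:
[code]
renderdevice.stereoseparationscale 1.38
renderdevice.stereoconvergencescale 2.2
renderdevice.stereosoldierzoomconvergencescale 0
render.drawfps 1
[/code]
If the file is being picked up, the FPS counter from render.drawfps 1 should appear in the top right corner as soon as you load into a map, which is a quick way to confirm the settings are applied.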
Asus P8Z68-V Pro; i5 2500k@4.3Ghz; Inno3d iChill X3 gtx 1070 ; 8GB DDR3 1600Mhz; Asus Xonar DX; Vertex 3 120Gb SSD + 1TB HDD; Corsair Tx750w; CoolerMaster Storm Scout Case ; Benq W700 720p 3D Vision Projector & 88 inch Screen, Sony 5.1 Home Cinema. HTC Vive
Thank you very much [b]Badelhas[/b], you helped a lot. I read the whole thread and I'm starting to understand the issue; it's all a bit new to me since this is my first time combining SLI and 3D Vision. Thanks again for your help, much appreciated.
I just got my NVIDIA 3D Vision 2 kit and I must say Battlefield 3 looks stunning; the game has a lot of depth and the gun convergence is incredible.
Right now I am using these stereo settings:
Nvidia Depth 100%
stereoseparationscale 1
stereoconvergencescale 2
stereosoldierzoomconvergence 1 (yes, I know some people say it's harder to aim at distant targets with this, but I seem to have no issue shooting people accurately; then again, I never play sniper anymore.)
And now for the bad...
The framerate is really poor due to low GPU usage. I am running the game with everything on Ultra and AA off, and I mostly get 40-45fps at only 70% GPU usage, which is very irritating because I know that at 100% usage I could pull off a smooth 60fps.
I really hope DICE/NVIDIA fixes this issue, because it ALMOST makes 3D not worth it in this game, but I will still stick with 3D because of how good it looks.
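For reference, this is roughly how those settings would look as console commands in a .cfg file. The mapping of the short names above onto the full renderdevice.* variables quoted earlier in the thread (including the trailing "scale" on the soldier-zoom variable) is my assumption, so double-check the names against chiz2's attachment; the Nvidia Depth percentage is set through the 3D Vision driver itself, not through the game's .cfg:
[code]
renderdevice.stereoseparationscale 1
renderdevice.stereoconvergencescale 2
renderdevice.stereosoldierzoomconvergencescale 1
[/code]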
Cpu: Intel i7 3930K @ 4.8GHz
Cooler: Corsair H100
Mobo: ASUS Rampage IV Formula
Ram: GSkill 16GB Quad DDR3 @ 2050Mhz
GPU: 2 Nvidia GTX 970's in SLI
Monitor: Samsung u28d590d 4K 3840 x 2160
HD: 2 OCZ Vertex 4 SSDs in Raid 0
Case: Cooler Master Cosmos II
PSU: Pc Power And Cooling Silencer MK II 950W
OS: Windows 8.1 64-Bit
Hey guys, I'm new to the NVIDIA camp, so bear with me. I have a single 680 and when I run BF3 maxed out I get "double" hit markers. Everything else works fine in 3D, but my hit markers show up in two places. I'm running the latest beta drivers. Any fixes?
Edit: It seems to have worked itself out after a restart. 3D is definitely not a gimmick :D
Good day everyone.
[quote name='Pelter' date='25 April 2012 - 04:46 AM' timestamp='1335329215' post='1400656']
I just got my NVIDIA 3D Vision 2 kit and I must say Battlefield 3 looks stunning; the game has a lot of depth and the gun convergence is incredible.
Right now I am using these stereo settings:
Nvidia Depth 95%
stereoseparationscale 1
stereoconvergencescale 1.45
stereosoldierzoomconvergence 0.5 (yes, I know some people say it's harder to aim at distant targets with this, but I seem to have no issue shooting people accurately; then again, I never play sniper anymore.)
And now for the bad...
The framerate is really poor due to low GPU usage. I am running the game with everything on Ultra and AA off, and I mostly get 40-45fps at only 70% GPU usage, which is very irritating because I know that at 100% usage I could pull off a smooth 60fps.
I really hope DICE/NVIDIA fixes this issue, because it ALMOST makes 3D not worth it in this game, but I will still stick with 3D because of how good it looks.
[/quote]
We've been waiting for that fix for ages; if it hasn't happened by now, I'm quite sure it never will. Even with my rig, playing at a low 720p resolution, I still hit that 40fps cap when I'm outdoors on the big maps.
Still on BF3: is there any graphics mod that makes the game look better, like the ones for Crysis?
Asus P8Z68-V Pro; i5 2500k@4.3Ghz; Inno3d iChill X3 gtx 1070 ; 8GB DDR3 1600Mhz; Asus Xonar DX; Vertex 3 120Gb SSD + 1TB HDD; Corsair Tx750w; CoolerMaster Storm Scout Case ; Benq W700 720p 3D Vision Projector & 88 inch Screen, Sony 5.1 Home Cinema. HTC Vive
I discovered a temporary and partial fix. It's not ideal, but for me it makes the game playable in 3D: I now get an average of 55fps, hitting 60fps in most places and only dropping to 45fps when I'm looking across big open areas on the large maps.
[u][b]You just have to set Mesh quality to Low.[/b][/u] The pop-in becomes annoying, yes, but I'd rather play at a smoother framerate. My theory on why this happens, after running benchmark after benchmark and keeping an eye on the in-game performance overlay, is that the mesh setting causes a Skyrim-esque low-GPU-usage bottleneck at framerates past 80. I noticed it when playing in 2D 120Hz mode: I was unable to reach 120fps, yet my GPU usage stayed only moderate. After turning mesh to Low, GPU usage shot up to maximum and I could reach 100+ fps, with 120fps in some areas (and of course Mesh quality on Low also lightens the GPU load). In 3D you are essentially rendering 120fps, just 60 for each eye, so the bottleneck remains in 3D. It only stays hidden below 60fps, which is probably why the developers didn't catch it: most people don't run 3D or 120Hz.
It's pretty silly that such an advanced DX11 engine has a bottleneck like this, but that's just what I've observed after spending entire nights trying to figure it out.
Hopefully this can help some people.
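If you want to check this on your own system, a quick way is to toggle the FPS counter and the in-game performance overlay from the same .cfg. render.drawfps is quoted earlier in the thread; render.perfoverlayvisible is my assumption for the variable behind the overlay mentioned above, so treat it as unverified:
[code]
render.drawfps 1
render.perfoverlayvisible 1
[/code]
With the overlay up, compare GPU usage and frame times before and after dropping Mesh quality to Low; if usage climbs toward 100% and the framerate rises past the old ceiling, that matches the bottleneck described above.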
Cpu: Intel i7 3930K @ 4.8GHz
Cooler: Corsair H100
Mobo: ASUS Rampage IV Formula
Ram: GSkill 16GB Quad DDR3 @ 2050Mhz
GPU: 2 Nvidia GTX 970's in SLI
Monitor: Samsung u28d590d 4K 3840 x 2160
HD: 2 OCZ Vertex 4 SSDs in Raid 0
Case: Cooler Master Cosmos II
PSU: Pc Power And Cooling Silencer MK II 950W
OS: Windows 8.1 64-Bit
[quote name='Pelter' date='04 May 2012 - 07:10 AM' timestamp='1336111827' post='1404116']
I discovered a temporary and partial fix. It's not ideal, but for me it makes the game playable in 3D: I now get an average of 55fps, hitting 60fps in most places and only dropping to 45fps when I'm looking across big open areas on the large maps.
[u][b]You just have to set Mesh quality to Low.[/b][/u] The pop-in becomes annoying, yes, but I'd rather play at a smoother framerate. My theory on why this happens, after running benchmark after benchmark and keeping an eye on the in-game performance overlay, is that the mesh setting causes a Skyrim-esque low-GPU-usage bottleneck at framerates past 80. I noticed it when playing in 2D 120Hz mode: I was unable to reach 120fps, yet my GPU usage stayed only moderate. After turning mesh to Low, GPU usage shot up to maximum and I could reach 100+ fps, with 120fps in some areas (and of course Mesh quality on Low also lightens the GPU load). In 3D you are essentially rendering 120fps, just 60 for each eye, so the bottleneck remains in 3D. It only stays hidden below 60fps, which is probably why the developers didn't catch it: most people don't run 3D or 120Hz.
It's pretty silly that such an advanced DX11 engine has a bottleneck like this, but that's just what I've observed after spending entire nights trying to figure it out.
Hopefully this can help some people.
[/quote]
Useful information, thanks! It makes sense. I'm not sure why people think this game has an artificial framerate cap.
GTX 1070 SLI, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64https://www.3dmark.com/fs/9529310
[quote name='Pelter' date='04 May 2012 - 02:10 AM' timestamp='1336111827' post='1404116']
I discovered a temporary and partial fix. It's not ideal, but for me it makes the game playable in 3D: I now get an average of 55fps, hitting 60fps in most places and only dropping to 45fps when I'm looking across big open areas on the large maps.
[u][b]You just have to set Mesh quality to Low.[/b][/u] The pop-in becomes annoying, yes, but I'd rather play at a smoother framerate. My theory on why this happens, after running benchmark after benchmark and keeping an eye on the in-game performance overlay, is that the mesh setting causes a Skyrim-esque low-GPU-usage bottleneck at framerates past 80. I noticed it when playing in 2D 120Hz mode: I was unable to reach 120fps, yet my GPU usage stayed only moderate. After turning mesh to Low, GPU usage shot up to maximum and I could reach 100+ fps, with 120fps in some areas (and of course Mesh quality on Low also lightens the GPU load). In 3D you are essentially rendering 120fps, just 60 for each eye, so the bottleneck remains in 3D. It only stays hidden below 60fps, which is probably why the developers didn't catch it: most people don't run 3D or 120Hz.
It's pretty silly that such an advanced DX11 engine has a bottleneck like this, but that's just what I've observed after spending entire nights trying to figure it out.
Hopefully this can help some people.
[/quote]
Thanks, I will give this a try tonight. I thought I had already tried setting ALL settings to Low and still ran into the 40fps cap, but I'll try again with this; it's been so long that I may be mistaken. Thanks for the effort.
[quote name='rustyk' date='04 May 2012 - 08:21 AM' timestamp='1336134061' post='1404190']
Useful information, thanks! It makes sense. I'm not sure why people think this game has an artificial framerate cap.
[/quote]
See this post and countless others in this thread, where people with ridiculously powerful rigs can get no more than EXACTLY 40fps in outdoor areas with low resource utilization: http://forums.nvidia.com/index.php?showtopic=213426&view=findpost&p=1339756
i7-6700k @ 4.5GHz, 2x 970 GTX SLI, 16GB DDR4 @ 3000mhz, MSI Gaming M7, Samsung 950 Pro m.2 SSD 512GB, 2x 1TB RAID 1, 850w EVGA, Corsair RGB 90 keyboard
I looked at the thread, and what you've done in terms of testing is very thorough, but ask yourself this: why would they artificially cap the framerate at 40fps? I think it's more likely that you simply don't have the GPU power to maintain 60fps (or 120fps), and the driver or game (due to vsync) is dropping from 60fps down to 40fps.
You could argue it's a bottleneck in the code or whatever, and I'm not one to start a pointless argument, but my personal opinion is that you don't have the power to maintain the framerate. The fact that it drops to 40fps is not an artificial cap; it's a byproduct of the way the code, both driver and game, is written.
I'd love to get an official word from DICE or NVIDIA, but I don't think that's very forthcoming! The GPU usage readout is a useful tool and it's nice that NVIDIA provides it; I just think that sometimes people see GPU usage lower than 100% and automatically assume it's a cap or crappy code. I think the truth is a bit more complicated than that.
PS: I'm more than happy to be proved wrong; I'm just putting forward another view. Have you tried dropping the resolution to the minimum 3D-supported one? That would probably rule out a lack of GPU power.
GTX 1070 SLI, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64https://www.3dmark.com/fs/9529310
[quote name='rustyk' date='04 May 2012 - 09:04 PM' timestamp='1336165486' post='1404398']
I looked at the thread, and what you've done in terms of testing is very thorough, but ask yourself this: why would they artificially cap the framerate at 40fps? I think it's more likely that you simply don't have the GPU power to maintain 60fps (or 120fps), and the driver or game (due to vsync) is dropping from 60fps down to 40fps.
You could argue it's a bottleneck in the code or whatever, and I'm not one to start a pointless argument, but my personal opinion is that you don't have the power to maintain the framerate. The fact that it drops to 40fps is not an artificial cap; it's a byproduct of the way the code, both driver and game, is written.
I'd love to get an official word from DICE or NVIDIA, but I don't think that's very forthcoming! The GPU usage readout is a useful tool and it's nice that NVIDIA provides it; I just think that sometimes people see GPU usage lower than 100% and automatically assume it's a cap or crappy code. I think the truth is a bit more complicated than that.
PS: I'm more than happy to be proved wrong; I'm just putting forward another view. Have you tried dropping the resolution to the minimum 3D-supported one? That would probably rule out a lack of GPU power.
[/quote]
I have two GTX 570s in SLI, overclocked (10,700 points in 3DMark), and I play at a mere 720p. Don't you think I should be getting 60fps all the time? Yet I also see that 40fps drop outdoors on the big 64-player maps. How do you explain that?
Asus P8Z68-V Pro; i5 2500k@4.3Ghz; Inno3d iChill X3 gtx 1070 ; 8GB DDR3 1600Mhz; Asus Xonar DX; Vertex 3 120Gb SSD + 1TB HDD; Corsair Tx750w; CoolerMaster Storm Scout Case ; Benq W700 720p 3D Vision Projector & 88 inch Screen, Sony 5.1 Home Cinema. HTC Vive
[quote name='Badelhas' date='05 May 2012 - 02:41 AM' timestamp='1336182090' post='1404469']
I have two GTX 570s in SLI, overclocked (10,700 points in 3DMark), and I play at a mere 720p. Don't you think I should be getting 60fps all the time? Yet I also see that 40fps drop outdoors on the big 64-player maps. How do you explain that?
[/quote]
Maybe it's simply that SLI 570s aren't powerful enough? It seems possible to get decent framerates with 680 SLI: http://forums.nvidia.com/index.php?showtopic=225774&st=0&p=1388865&hl=battlefield%203&fromsearch=1&#entry1388865
Wouldn't big outdoor areas give the engine more to do? I've never heard of a game introducing an artificial framerate cap, and it's common in most games to get worse framerates outside than inside, that's all. I'd be disappointed too if I weren't getting the framerate I thought I *should* get, but no configuration can guarantee a specific level of performance.
GTX 1070 SLI, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64https://www.3dmark.com/fs/9529310
[quote name='rustyk' date='04 May 2012 - 05:04 PM' timestamp='1336165486' post='1404398']
I looked at the thread, and what you've done in terms of testing is very thorough, but ask yourself this: why would they artificially cap the framerate at 40fps? I think it's more likely that you simply don't have the GPU power to maintain 60fps (or 120fps), and the driver or game (due to vsync) is dropping from 60fps down to 40fps.
You could argue it's a bottleneck in the code or whatever, and I'm not one to start a pointless argument, but my personal opinion is that you don't have the power to maintain the framerate. The fact that it drops to 40fps is not an artificial cap; it's a byproduct of the way the code, both driver and game, is written.
I'd love to get an official word from DICE or NVIDIA, but I don't think that's very forthcoming! The GPU usage readout is a useful tool and it's nice that NVIDIA provides it; I just think that sometimes people see GPU usage lower than 100% and automatically assume it's a cap or crappy code. I think the truth is a bit more complicated than that.
PS: I'm more than happy to be proved wrong; I'm just putting forward another view. Have you tried dropping the resolution to the minimum 3D-supported one? That would probably rule out a lack of GPU power.
[/quote]
I hear what you're saying, and I agree that less than 100% utilization does not automatically mean there is a problem in the code, but even people with a 680 or a 590 (i.e. twice as powerful as my setup) are hitting this 40fps cap. Like you mention, I also tried dropping from 1920x1080 to something like 1200x800 (I can't remember exactly, but it was a pretty low resolution) and still got exactly the same 40fps in those areas. I think chiz had the closest explanation of the problem: certain parts of the engine only get a fixed slice of time to finish their work before the frame MUST move on in order to be completed on time. It's not necessarily a cap they imposed on purpose, just a cap imposed by the way the engine handles things. Anyway, I'm just about to go try out this mesh setting, so I'll report back soon.
[quote name='Cheezeman' date='05 May 2012 - 08:58 PM' timestamp='1336247928' post='1404714']
I hear what you're saying, and I agree that less than 100% utilization does not automatically mean there is a problem in the code, but even people with a 680 or a 590 (i.e. twice as powerful as my setup) are hitting this 40fps cap. Like you mention, I also tried dropping from 1920x1080 to something like 1200x800 (I can't remember exactly, but it was a pretty low resolution) and still got exactly the same 40fps in those areas. I think chiz had the closest explanation of the problem: certain parts of the engine only get a fixed slice of time to finish their work before the frame MUST move on in order to be completed on time. It's not necessarily a cap they imposed on purpose, just a cap imposed by the way the engine handles things. Anyway, I'm just about to go try out this mesh setting, so I'll report back soon.
[/quote]
Yes, I hope that works. I'm probably being a bit pedantic, to be honest. It's like Skyrim: the code just wasn't very efficient, and once they patched it everyone's framerates improved. I was just challenging the assertion that it's a framerate cap. PC gamers have a bad habit of screaming "console port" or "crap code" just because they spent X amount of money on a system and game Y doesn't run the way they expect. It's like a false sense of entitlement.
GTX 1070 SLI, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64https://www.3dmark.com/fs/9529310
Hi there,
i got a new PC component especially for Battlefield 3, i got:
- 2600 K processor, overclocked to 4.4GHz
- ASRock Z668 Extreme4 Gen3 Motherboard
- G.Skill Sniper 8GB DDR3 1600MHz (2x4GB) CL9 1.5v
- ASUS 27" LED Monitor 120Hz + nVIDIA 3D Vision 2
some other pretty stuff such as NZXT 850W Bronze PSU, SSD, professional gaming mouse & keyboard
my nightmares are now about the graphics cards, i'm wondering what should i get!
after searching the web, i found that 2 GTX 570 SLI would do fine for 3D and Battlefield 3, but i think i should wait for the GTX 670Ti and get two of these monsters, my questions is: are there really so many problems related with Battlefield 3 and SLI? because i read so many threads on internet of which some people with GTX SLI encountering problems with Battlefield 3 such as freezing, crashes and most likely stuttering
thanks in advance
[/quote]
I have a similar rig (see my sig) with 2 gtx570 in sli and I have no problems what so ever, execpt that sometimes my framerate drops from 60 to 40fps (I play in 3D Vision), but its a game engine cap in some parts of big maps. You just have to create a .cfg file with your prefered 3D command settings.
Regarding which cards you should buy, I would buy 2 gtx570, they overclock to 580 clocks and in my country they cost half of the 680 (260€ vs 500€). The gain in performance does not justify the price dif, in my opinion.
Hi there,
i got a new PC component especially for Battlefield 3, i got:
- 2600 K processor, overclocked to 4.4GHz
- ASRock Z668 Extreme4 Gen3 Motherboard
- G.Skill Sniper 8GB DDR3 1600MHz (2x4GB) CL9 1.5v
- ASUS 27" LED Monitor 120Hz + nVIDIA 3D Vision 2
some other pretty stuff such as NZXT 850W Bronze PSU, SSD, professional gaming mouse & keyboard
my nightmares are now about the graphics cards, i'm wondering what should i get!
after searching the web, i found that 2 GTX 570 SLI would do fine for 3D and Battlefield 3, but i think i should wait for the GTX 670Ti and get two of these monsters, my questions is: are there really so many problems related with Battlefield 3 and SLI? because i read so many threads on internet of which some people with GTX SLI encountering problems with Battlefield 3 such as freezing, crashes and most likely stuttering
thanks in advance
I have a similar rig (see my sig) with 2 gtx570 in sli and I have no problems what so ever, execpt that sometimes my framerate drops from 60 to 40fps (I play in 3D Vision), but its a game engine cap in some parts of big maps. You just have to create a .cfg file with your prefered 3D command settings.
Regarding which cards you should buy, I would buy 2 gtx570, they overclock to 580 clocks and in my country they cost half of the 680 (260€ vs 500€). The gain in performance does not justify the price dif, in my opinion.
Asus P8Z68-V Pro; i5 2500k@4.3Ghz; Inno3d iChill X3 gtx 1070 ; 8GB DDR3 1600Mhz; Asus Xonar DX; Vertex 3 120Gb SSD + 1TB HDD; Corsair Tx750w; CoolerMaster Storm Scout Case ; Benq W700 720p 3D Vision Projector & 88 inch Screen, Sony 5.1 Home Cinema. HTC Vive
I have a similar rig (see my sig) with 2 gtx570 in sli and I have no problems what so ever, execpt that sometimes my framerate drops from 60 to 40fps (I play in 3D Vision), but its a game engine cap in some parts of big maps. [color="#000080"][u][b]You just have to create a .cfg file with your prefered 3D command settings.[/b][/u][/color]
Regarding which cards you should buy, I would buy 2 gtx570, they overclock to 580 clocks and in my country they cost half of the 680 (260€ vs 500€). The gain in performance does not justify the price dif, in my opinion.
[/quote]
thanks alot man, this is enough for me, now i feel comfortable going out and buying 2 GTX 670 the next month.
by the way, is there any thread here talking about creating those [color="#000080"].cfg[/color] files and how to do it and where to put them?
I have a similar rig (see my sig) with 2 gtx570 in sli and I have no problems what so ever, execpt that sometimes my framerate drops from 60 to 40fps (I play in 3D Vision), but its a game engine cap in some parts of big maps. You just have to create a .cfg file with your prefered 3D command settings.
Regarding which cards you should buy, I would buy 2 gtx570, they overclock to 580 clocks and in my country they cost half of the 680 (260€ vs 500€). The gain in performance does not justify the price dif, in my opinion.
thanks alot man, this is enough for me, now i feel comfortable going out and buying 2 GTX 670 the next month.
by the way, is there any thread here talking about creating those .cfg files and how to do it and where to put them?
thanks alot man, this is enough for me, now i feel comfortable going out and buying 2 GTX 670 the next month.
by the way, is there any thread here talking about creating those [color="#000080"].cfg[/color] files and how to do it and where to put them?
[/quote]
see page 28 of this thread. The user "chiz2 as even kind enough to attach a .cfg file so you just have to change with the text editor the settings for your prefered ones and put it in the battlefield 3 case: https://skydrive.live.com/redir.aspx?cid=1a21782f49102306&resid=1A21782F49102306!635&parid=1A21782F49102306!106&authkey=!ALjghwiWZ8Y_8MU
Mine´s are:
renderdevice.stereoseparationscale 1.38
renderdevice.stereoconvergencescale 2.2
renderdevice.stereosoldierzoomconvergencescale 0
render.drawfps 1
The last command is for always showing the framerate ur having in the top right corner of the screen.
Cheers,
thanks alot man, this is enough for me, now i feel comfortable going out and buying 2 GTX 670 the next month.
by the way, is there any thread here talking about creating those .cfg files and how to do it and where to put them?
see page 28 of this thread. The user "chiz2 as even kind enough to attach a .cfg file so you just have to change with the text editor the settings for your prefered ones and put it in the battlefield 3 case: https://skydrive.live.com/redir.aspx?cid=1a21782f49102306&resid=1A21782F49102306!635&parid=1A21782F49102306!106&authkey=!ALjghwiWZ8Y_8MU
Mine´s are:
renderdevice.stereoseparationscale 1.38
renderdevice.stereoconvergencescale 2.2
renderdevice.stereosoldierzoomconvergencescale 0
render.drawfps 1
The last command is for always showing the framerate ur having in the top right corner of the screen.
Cheers,
Asus P8Z68-V Pro; i5 2500k@4.3Ghz; Inno3d iChill X3 gtx 1070 ; 8GB DDR3 1600Mhz; Asus Xonar DX; Vertex 3 120Gb SSD + 1TB HDD; Corsair Tx750w; CoolerMaster Storm Scout Case ; Benq W700 720p 3D Vision Projector & 88 inch Screen, Sony 5.1 Home Cinema. HTC Vive
Right now I am using these stereo settings:
Nvidia Depth 100%
stereoseparationscale 1
stereoconvergencescale 2
stereosoldierzoomconvergence 1 (Yes I know some people say its harder to aim far away with this but I seem to have no issue shooting people accurately, however I never go sniper anymore.)
And now for the bad...
The framerate is really bad due to low GPU usage. I am running the game all ultra settings with AA off and I mostly get 40-45fps with only 70% gpu usage, very irritating because I know that with 100% usage I could pull off a smooth 60fps.
I really desperately hope DICE/Nvidia fixes this issue because It ALMOST makes 3D not worth it in this game but I will still stick with playing 3D due to how good it looks.
Right now I am using these stereo settings:
Nvidia Depth 100%
stereoseparationscale 1
stereoconvergencescale 2
stereosoldierzoomconvergence 1 (Yes I know some people say its harder to aim far away with this but I seem to have no issue shooting people accurately, however I never go sniper anymore.)
And now for the bad...
The framerate is really bad due to low GPU usage. I am running the game all ultra settings with AA off and I mostly get 40-45fps with only 70% gpu usage, very irritating because I know that with 100% usage I could pull off a smooth 60fps.
I really desperately hope DICE/Nvidia fixes this issue because It ALMOST makes 3D not worth it in this game but I will still stick with playing 3D due to how good it looks.
Cpu: Intel i7 3930K @ 4.8GHz
Cooler: Corsair H100
Mobo: ASUS Rampage IV Formula
Ram: GSkill 16GB Quad DDR3 @ 2050Mhz
GPU: 2 Nvidia GTX 970's in SLI
Monitor: Samsung u28d590d 4K 3840 x 2160
HD: 2 OCZ Vertex 4 SSDs in Raid 0
Case: Cooler Master Cosmos II
PSU: Pc Power And Cooling Silencer MK II 950W
OS: Windows 8.1 64-Bit
Helix Mod 3D Vision Fixes
Edit: It seems it worked it self out by a restart. 3D is definitely not a gimmick :D
Good day everyone.
Edit: It seems it worked it self out by a restart. 3D is definitely not a gimmick :D
Good day everyone.
I just got my Nvidia 3D Vision 2 Kit and I must say Battlefield 3 looks stunning, the game has alot of depth and the gun convergence is incredible.
Right now I am using these stereo settings:
Nvidia Depth 95%
stereoseparationscale 1
stereoconvergencescale 1.45
stereosoldierzoomconvergence 0.5 (Yes I know some people say its harder to aim far away with this but I seem to have no issue shooting people accurately, however I never go sniper anymore.)
And now for the bad...
The framerate is really bad due to low GPU usage. I am running the game all ultra settings with AA off and I mostly get 40-45fps with only 70% gpu usage, very irritating because I know that with 100% usage I could pull off a smooth 60fps.
I really desperately hope DICE/Nvidia fixes this issue because It ALMOST makes 3D not worth it in this game but I will still stick with playing 3D due to how good it looks.
[/quote]
We´ve been wanting for that fix for ages, if it didnt happen until now I am quite sure it will never happen. With my rig and playing at 720p low rez I still have that 40 fps cap when I´m outside in big maps.
Still regarding BF3, is there any graphic MOD that makes the game look better, like in Crysis?
I just got my Nvidia 3D Vision 2 Kit and I must say Battlefield 3 looks stunning, the game has alot of depth and the gun convergence is incredible.
Right now I am using these stereo settings:
Nvidia Depth 95%
stereoseparationscale 1
stereoconvergencescale 1.45
stereosoldierzoomconvergence 0.5 (Yes I know some people say its harder to aim far away with this but I seem to have no issue shooting people accurately, however I never go sniper anymore.)
And now for the bad...
The framerate is really bad due to low GPU usage. I am running the game all ultra settings with AA off and I mostly get 40-45fps with only 70% gpu usage, very irritating because I know that with 100% usage I could pull off a smooth 60fps.
I really desperately hope DICE/Nvidia fixes this issue because It ALMOST makes 3D not worth it in this game but I will still stick with playing 3D due to how good it looks.
We´ve been wanting for that fix for ages, if it didnt happen until now I am quite sure it will never happen. With my rig and playing at 720p low rez I still have that 40 fps cap when I´m outside in big maps.
Still regarding BF3, is there any graphic MOD that makes the game look better, like in Crysis?
Asus P8Z68-V Pro; i5 2500k@4.3Ghz; Inno3d iChill X3 gtx 1070 ; 8GB DDR3 1600Mhz; Asus Xonar DX; Vertex 3 120Gb SSD + 1TB HDD; Corsair Tx750w; CoolerMaster Storm Scout Case ; Benq W700 720p 3D Vision Projector & 88 inch Screen, Sony 5.1 Home Cinema. HTC Vive
[u][b]
You just have to set the Mesh quality to low[/b][/u], the pop-in becomes annoying yes but I rather play in smoother framerate. My theory on why this happens after running benchmark after benchmark and keeping an eye on the in-game performance overlay, I have found that the mesh causes a skyrim-esque low-gpu usage bottleneck for framerates past 80. I have noticed this when playing in 2D 120HZ mode, I was unable to achieve 120fps yet my gpu usage remained only moderate. After turning mesh to low, the gpu usage shot up to max and I was able to achieve 100+ fps with 120 fps in some areas (+ Mesh quality being at low lessens the gpu load of course). In 3D, you are essentially rendering 120fps but 60 for each eye so the bottleneck remains in 3D. It seems only while running 60- fps is where the bottleneck is not seen so that's why the developers didn't catch this as most people do not run 3D/120Hz.
It's pretty silly why such an advanced DX11 engine has a bottleneck like this but this is just what I have observed after spending entire nights trying to figure it out.
Hopefully this can help some people.
You just have to set the Mesh quality to low, the pop-in becomes annoying yes but I rather play in smoother framerate. My theory on why this happens after running benchmark after benchmark and keeping an eye on the in-game performance overlay, I have found that the mesh causes a skyrim-esque low-gpu usage bottleneck for framerates past 80. I have noticed this when playing in 2D 120HZ mode, I was unable to achieve 120fps yet my gpu usage remained only moderate. After turning mesh to low, the gpu usage shot up to max and I was able to achieve 100+ fps with 120 fps in some areas (+ Mesh quality being at low lessens the gpu load of course). In 3D, you are essentially rendering 120fps but 60 for each eye so the bottleneck remains in 3D. It seems only while running 60- fps is where the bottleneck is not seen so that's why the developers didn't catch this as most people do not run 3D/120Hz.
It's pretty silly why such an advanced DX11 engine has a bottleneck like this but this is just what I have observed after spending entire nights trying to figure it out.
Hopefully this can help some people.
Cpu: Intel i7 3930K @ 4.8GHz
Cooler: Corsair H100
Mobo: ASUS Rampage IV Formula
Ram: GSkill 16GB Quad DDR3 @ 2050Mhz
GPU: 2 Nvidia GTX 970's in SLI
Monitor: Samsung u28d590d 4K 3840 x 2160
HD: 2 OCZ Vertex 4 SSDs in Raid 0
Case: Cooler Master Cosmos II
PSU: Pc Power And Cooling Silencer MK II 950W
OS: Windows 8.1 64-Bit
Helix Mod 3D Vision Fixes
I discovered a temporary and partial fix, it's not the best but to me it makes the game playable in 3D as I will get an avg of 55fps now with it hitting 60fps in most parts and only dropping to 45 fps when I am looking at big expanses on the large maps.
[u][b]
You just have to set the Mesh quality to low[/b][/u], the pop-in becomes annoying yes but I rather play in smoother framerate. My theory on why this happens after running benchmark after benchmark and keeping an eye on the in-game performance overlay, I have found that the mesh causes a skyrim-esque low-gpu usage bottleneck for framerates past 80. I have noticed this when playing in 2D 120HZ mode, I was unable to achieve 120fps yet my gpu usage remained only moderate. After turning mesh to low, the gpu usage shot up to max and I was able to achieve 100+ fps with 120 fps in some areas (+ Mesh quality being at low lessens the gpu load of course). In 3D, you are essentially rendering 120fps but 60 for each eye so the bottleneck remains in 3D. It seems only while running 60- fps is where the bottleneck is not seen so that's why the developers didn't catch this as most people do not run 3D/120Hz.
It's pretty silly why such an advanced DX11 engine has a bottleneck like this but this is just what I have observed after spending entire nights trying to figure it out.
Hopefully this can help some people.
[/quote]
Useful information thanks! It makes sense. I'm not sure why people think this game has an artifical framerate cap.
I discovered a temporary and partial fix, it's not the best but to me it makes the game playable in 3D as I will get an avg of 55fps now with it hitting 60fps in most parts and only dropping to 45 fps when I am looking at big expanses on the large maps.
You just have to set the Mesh quality to low, the pop-in becomes annoying yes but I rather play in smoother framerate. My theory on why this happens after running benchmark after benchmark and keeping an eye on the in-game performance overlay, I have found that the mesh causes a skyrim-esque low-gpu usage bottleneck for framerates past 80. I have noticed this when playing in 2D 120HZ mode, I was unable to achieve 120fps yet my gpu usage remained only moderate. After turning mesh to low, the gpu usage shot up to max and I was able to achieve 100+ fps with 120 fps in some areas (+ Mesh quality being at low lessens the gpu load of course). In 3D, you are essentially rendering 120fps but 60 for each eye so the bottleneck remains in 3D. It seems only while running 60- fps is where the bottleneck is not seen so that's why the developers didn't catch this as most people do not run 3D/120Hz.
It's pretty silly why such an advanced DX11 engine has a bottleneck like this but this is just what I have observed after spending entire nights trying to figure it out.
Hopefully this can help some people.
Useful information thanks! It makes sense. I'm not sure why people think this game has an artifical framerate cap.
GTX 1070 SLI, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64https://www.3dmark.com/fs/9529310
I discovered a temporary and partial fix, it's not the best but to me it makes the game playable in 3D as I will get an avg of 55fps now with it hitting 60fps in most parts and only dropping to 45 fps when I am looking at big expanses on the large maps.
[u][b]
You just have to set the Mesh quality to low[/b][/u], the pop-in becomes annoying yes but I rather play in smoother framerate. My theory on why this happens after running benchmark after benchmark and keeping an eye on the in-game performance overlay, I have found that the mesh causes a skyrim-esque low-gpu usage bottleneck for framerates past 80. I have noticed this when playing in 2D 120HZ mode, I was unable to achieve 120fps yet my gpu usage remained only moderate. After turning mesh to low, the gpu usage shot up to max and I was able to achieve 100+ fps with 120 fps in some areas (+ Mesh quality being at low lessens the gpu load of course). In 3D, you are essentially rendering 120fps but 60 for each eye so the bottleneck remains in 3D. It seems only while running 60- fps is where the bottleneck is not seen so that's why the developers didn't catch this as most people do not run 3D/120Hz.
It's pretty silly why such an advanced DX11 engine has a bottleneck like this but this is just what I have observed after spending entire nights trying to figure it out.
Hopefully this can help some people.
[/quote]
Thanks, I will give this a try tonight. I thought I had even tried setting ALL settings to low and still ran into the 40 fps cap, but I'll try again with this; it's been so long that I may be mistaken. Thanks for the effort.
[quote name='rustyk' date='04 May 2012 - 08:21 AM' timestamp='1336134061' post='1404190']
Useful information, thanks! It makes sense. I'm not sure why people think this game has an artificial framerate cap.
[/quote]
See this post, and countless others in this thread, where people with ridiculously powerful rigs can get no more than EXACTLY 40 fps in outdoor areas with low resource utilization...
http://forums.nvidia.com/index.php?showtopic=213426&view=findpost&p=1339756
i7-6700k @ 4.5GHz, 2x 970 GTX SLI, 16GB DDR4 @ 3000mhz, MSI Gaming M7, Samsung 950 Pro m.2 SSD 512GB, 2x 1TB RAID 1, 850w EVGA, Corsair RGB 90 keyboard
I looked at the thread, and what you've done in terms of testing is very accurate, but ask yourself this: why would they artificially cap the framerate at 40 fps? I think it's more likely that you simply don't have the GPU power to maintain 60 fps (or 120 fps) and the driver or game (due to vsync issues) is dropping down from 60 fps to 40 fps.
You could argue it's a bottleneck in the code or whatever, and I'm not one to start a pointless argument, but my own personal opinion is that you don't have the power to maintain the framerate. The fact that it drops to 40 fps is not an artificial cap; it's a byproduct of the way the code, both driver and game, is written.
I'd love to get an official word from DICE or Nvidia, but I don't think that's very forthcoming! The GPU usage readout is a useful tool and it's nice that Nvidia provided it; I just think that sometimes people look at the GPU usage and, if it's lower than 100%, automatically assume it's a cap or crappy code. I think the truth is a bit more complicated than that.
PS: I'm more than happy to be proved wrong, I'm just putting forward another view. Have you tried dropping the resolution to the minimum 3D-supported one? That would probably rule out a lack of GPU power.
GTX 1070 SLI, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64 - https://www.3dmark.com/fs/9529310
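There is concrete arithmetic behind the vsync point above: with vsync on, a frame that misses a refresh waits for the next one, so the achievable rates snap to refresh/1, refresh/2, refresh/3 and so on. On a 120 Hz 3D Vision display that ladder is 120, 60, 40, 30..., which is one plausible (though unconfirmed) reason the reports cluster at exactly 40 fps rather than, say, 47. A minimal sketch of that quantisation:
[code]
import math

def vsync_fps(frame_ms: float, refresh_hz: float = 120.0) -> float:
    """Effective frame rate with vsync: each frame is held until the next
    refresh it can hit, so the rate snaps to refresh_hz / n for integer n."""
    refresh_ms = 1000.0 / refresh_hz
    n = max(1, math.ceil(frame_ms / refresh_ms))  # refresh intervals per frame
    return refresh_hz / n

# At 120 Hz a refresh is ~8.3 ms. A frame that takes just over two refreshes
# slips to the third one: 120 / 3 = 40 fps.
print(vsync_fps(8.0))    # 120.0
print(vsync_fps(16.0))   # 60.0
print(vsync_fps(17.5))   # 40.0
[/code]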
I looked at the thread, and what you've done in terms of testing is very accurate, but ask yourself this: why would they artificially cap the framerate at 40 fps? I think it's more likely that you simply don't have the GPU power to maintain 60 fps (or 120 fps) and the driver or game (due to vsync issues) is dropping down from 60 fps to 40 fps.
You could argue it's a bottleneck in the code or whatever, and I'm not one to start a pointless argument, but my own personal opinion is that you don't have the power to maintain the framerate. The fact that it drops to 40 fps is not an artificial cap; it's a byproduct of the way the code, both driver and game, is written.
I'd love to get an official word from DICE or Nvidia, but I don't think that's very forthcoming! The GPU usage readout is a useful tool and it's nice that Nvidia provided it; I just think that sometimes people look at the GPU usage and, if it's lower than 100%, automatically assume it's a cap or crappy code. I think the truth is a bit more complicated than that.
PS: I'm more than happy to be proved wrong, I'm just putting forward another view. Have you tried dropping the resolution to the minimum 3D-supported one? That would probably rule out a lack of GPU power.
[/quote]
I have 2x GTX 570 in SLI, overclocked (10,700 points in 3DMark), and I play at a mere 720p. Don't you think I should be getting 60 fps all the time? I also get that 40 fps drop outside, on the big 64-player maps. How do you explain that?
Asus P8Z68-V Pro; i5 2500k@4.3Ghz; Inno3d iChill X3 gtx 1070 ; 8GB DDR3 1600Mhz; Asus Xonar DX; Vertex 3 120Gb SSD + 1TB HDD; Corsair Tx750w; CoolerMaster Storm Scout Case ; Benq W700 720p 3D Vision Projector & 88 inch Screen, Sony 5.1 Home Cinema. HTC Vive
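For some context on the 720p question: in raw pixel terms, rendering both eyes at 1280x720 and 60 fps each is actually slightly less work than ordinary 1920x1080 at 60 fps, which is why the "exactly 40 fps outdoors" reports look odd if the GPU were the only limit. This is a back-of-the-envelope comparison only; it ignores SLI scaling, geometry, CPU load and everything else that really decides the frame rate.
[code]
# Rough pixel throughput comparison; fill rate is only one of many costs.
stereo_720p = 2 * 1280 * 720 * 60    # both eyes at 60 fps each
mono_1080p = 1920 * 1080 * 60        # ordinary 1080p at 60 fps

print(f"720p stereo : {stereo_720p / 1e6:.0f} Mpixels/s")  # ~111
print(f"1080p mono  : {mono_1080p / 1e6:.0f} Mpixels/s")   # ~124
[/code]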
I have 2x GTX 570 in SLI, overclocked (10,700 points in 3DMark), and I play at a mere 720p. Don't you think I should be getting 60 fps all the time? I also get that 40 fps drop outside, on the big 64-player maps. How do you explain that?
[/quote]
Maybe it's simply that SLI 570s aren't powerful enough? It seems possible to get decent framerates with 680 SLI: http://forums.nvidia.com/index.php?showtopic=225774&st=0&p=1388865&hl=battlefield%203&fromsearch=1&#entry1388865
Wouldn't big outdoor areas give the engine more to do? I've never heard of a game introducing an artificial framerate cap, and it's common in most games to get worse framerates outside than inside, that's all. I'd be disappointed too if I weren't getting the framerate I thought I *should* get, but no configuration can guarantee a specific level of performance.
GTX 1070 SLI, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64 - https://www.3dmark.com/fs/9529310
I looked at the thread, and what you've done in terms of testing is very accurate, but ask yourself this: why would they artificially cap the framerate at 40 fps? I think it's more likely that you simply don't have the GPU power to maintain 60 fps (or 120 fps) and the driver or game (due to vsync issues) is dropping down from 60 fps to 40 fps.
You could argue it's a bottleneck in the code or whatever, and I'm not one to start a pointless argument, but my own personal opinion is that you don't have the power to maintain the framerate. The fact that it drops to 40 fps is not an artificial cap; it's a byproduct of the way the code, both driver and game, is written.
I'd love to get an official word from DICE or Nvidia, but I don't think that's very forthcoming! The GPU usage readout is a useful tool and it's nice that Nvidia provided it; I just think that sometimes people look at the GPU usage and, if it's lower than 100%, automatically assume it's a cap or crappy code. I think the truth is a bit more complicated than that.
PS: I'm more than happy to be proved wrong, I'm just putting forward another view. Have you tried dropping the resolution to the minimum 3D-supported one? That would probably rule out a lack of GPU power.
[/quote]
I hear what you're saying, and I agree that less than 100% utilization does not automatically mean there is a problem in the code, but even people with a 680 or a 590 (i.e. twice as powerful as my setup) are also hitting this 40 fps cap. Like you mention, I also tried dropping from 1920x1080 to something like 1200x800 (I can't remember exactly, but it was a pretty low resolution) and still got the exact same 40 fps in these areas. I think Chiz had the closest explanation to the problem: certain parts of the engine only get X amount of time to finish their work before the engine MUST move on in order to complete the frame on time. It's not necessarily a cap they imposed on purpose, just a cap imposed by the way the engine handles things. Anyway, I'm about to go try out this mesh setting now, so I will report back soon.
i7-6700k @ 4.5GHz, 2x 970 GTX SLI, 16GB DDR4 @ 3000mhz, MSI Gaming M7, Samsung 950 Pro m.2 SSD 512GB, 2x 1TB RAID 1, 850w EVGA, Corsair RGB 90 keyboard
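The resolution test described in the post above is worth spelling out, since it separates the two explanations in this thread: if the GPU were the limiter, halving the pixel count should raise the frame rate noticeably, while an engine-side per-frame budget leaves it almost unchanged. A rough helper for reading the result (the 10% threshold is an arbitrary assumption, not anything from the game or driver):
[code]
def looks_gpu_bound(fps_full_res: float, fps_low_res: float,
                    threshold: float = 0.10) -> bool:
    """Crude heuristic: if lowering the resolution raises fps by more than
    `threshold` (10% by default), the GPU was probably the limiter;
    if fps barely moves, the cap lies somewhere else."""
    return (fps_low_res - fps_full_res) / fps_full_res > threshold

# The case reported here: 1920x1080 and a much lower resolution both pinned
# at 40 fps, which points away from a simple lack of GPU power.
print(looks_gpu_bound(40.0, 40.0))   # False
[/code]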
I hear what you're saying, and I agree that less than 100% utilization does not automatically mean there is a problem in the code, but even people with a 680 or a 590 (i.e. twice as powerful as my setup) are also hitting this 40 fps cap. Like you mention, I also tried dropping from 1920x1080 to something like 1200x800 (I can't remember exactly, but it was a pretty low resolution) and still got the exact same 40 fps in these areas. I think Chiz had the closest explanation to the problem: certain parts of the engine only get X amount of time to finish their work before the engine MUST move on in order to complete the frame on time. It's not necessarily a cap they imposed on purpose, just a cap imposed by the way the engine handles things. Anyway, I'm about to go try out this mesh setting now, so I will report back soon.
[/quote]
Yes, I hope that works. I'm probably being a bit pedantic, to be honest. It's like with Skyrim: it just wasn't very efficient code, and once they patched it everyone's framerates improved. I was just challenging the assertion that it was a framerate cap. PC gamers have a bad habit of screaming 'console port' or 'crap code' just because they spent X amount of money on a system and game Y doesn't run the way they expect. It's like a false sense of entitlement.
GTX 1070 SLI, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64 - https://www.3dmark.com/fs/9529310