GTX 760 4GB SLI powerful enough for 3D Vision Surround?
(1) No. I ran in 2D to remove the extra variable of 3D. Though now you mention it, I should have tested in 3D as well. I'll do it and update this thread when I do.
I'd never really thought about how AFR affects PhysX. Now you mention it, it's an interesting question (how can you calculate dynamic physics 2 frames in advance if you don't know what happened 1 frame earlier? That's like asking the GPU to predict the future). Judging by my MLL graphs, perhaps the answer is "terribly".
If I had to guess, I'd think that 3D Vision would improve the SLI+PhysX relationship, since PhysX would no longer need to try to 'predict the future' and work 2 frames ahead, but would always be working just one frame ahead.
(2) Yeah, that bloody graph is so unclear. I first thought the red was the minimum and the black was a maximum. I posted about it on the MLL forum to find out how it works, and someone told me that the red is just a background.
I'm not sure how your interpretation of it would work though. Since the black line goes over the top of the orange/red part, how would you know what the average framerate is (since part of the orange/red line is obscured by the lower portion of the black line)?
[quote]
GTX 285 as PhysX card, 4x PCI slot. Lowish end, but I didn't really expect PhysX to be large data intensive, or need more than the fairly competent 285.
[/quote]
PhysX can be a real killer, but I would also think that a 285 handling it on its own would do just fine in most circumstances. When I had a 680 and 580 combination, it worked fantastically - and that scenario is not too dissimilar from yours (a high-end card paired with a high-end card from the previous generation for PhysX... though I wasn't doing SLI, I guess).
[quote]
https://forums.geforce.com/default/topic/571270/3d-vision/-iquest-gtx-780-or-sli-gtx-760-for-3d-vision-1080p-/post/3885684/#3885684
[/quote]
Those graphs are so different from mine! Much fuzzier in general, but also much tighter towards the end. Also a slightly different shape. Perhaps it's because we were using different graphics settings. I was totally maxed out, including tessellation, motion blur, and SSAA.
Not knowing how else to account for our different results with PhysX cards, I'm going to assume that the 4x speed was a big culprit here. From what I've read, the difference from 16x to 8x is slight, but the difference from 8x to 4x is major.
SLI doesn't even officially support 4x speeds, so your motherboard wasn't really intended for 3 graphics cards (I felt I needed to buy a new high-end mobo that supports 3-way SLI for my PhysX card). Your results probably aren't indicative of what a PhysX card could achieve in a best-case scenario. Maybe that's the problem here. The fact that your 285 actually failed in your first test perhaps supports this.
Also, heat may have been an issue. I specifically bought a low-profile GTX 650 card for PhysX, so both of my Titans' cooling is unaffected. Even if I had the money to spend on, say, a 780 instead of my 650, I don't think I would do it, because the tight squeeze wouldn't be worth it.
Though even so, it could be argued that your 285 test showed slightly better performance: aside from that one massive frame drop, its dips tended to be slightly less severe than the dips in your 580 SLI test.
By the way, how many runs did you do? I found that the first 2 runs gave consistently poorer results than subsequent runs. That might have influenced something too.
No, I'm nearly certain that the orange line in the graphs is the average. I couldn't get a straight answer out of anyone who knew what they were talking about either, so I used deduction here.
If it were just a background, then why does it vary at all? Why the big gap at the end of your experiment, where it looks wildly different from the first part? I think we can conclude from this that it is not a background.
Seems like an average. When I look at a well-behaved case, where things are turned down, the orange line is pretty much horizontal, with no big spikes, and most importantly, it matches the average reported on the left.
Also, if you were writing a benchmark, would you forget or ignore the average? If the orange line is not it, then what is the average?
Presuming that line of thought is right, then the upper spikes are max spikes, which you can also read off the left-hand summary. And the lower spikes are min spikes, which also match the left-hand summary. It looks to me like the min spikes only show up over the orange average if they are of sufficient duration to be shown. A one- or two-frame drop is not enough to draw the black line down.
Same for max: it only shows a spike line up if it meets the criteria for drawing a spike up. For a test 3 minutes long, it's gotta be something sustained for it to show up or down. Each single line spike is roughly 10 frames (about 150 ms per spike).
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
[quote="bo3b"]Also, if you were writing a benchmark, would you forget or ignore average? If the orange line is not it, then what is the average?[/quote]The black line is the average. Or, to be more accurate, the black line is almost the *exact* framerate. (Unlike a graph that only shows one data point per second, it shows lots of data points and just maps them out as is, without needing to display an average per se)
Think of it like a sound wave or lie detector:
[img]http://cs.smith.edu/dftwiki/images/0/04/SoundWave.jpg[/img]
As it goes from left to right, it jumps up and down, creating a thick, jagged black zigzag. The more variation in data points (e.g. in the latter half of the benchmark where there are lots of explosions, AI calculations and PhysX), the wider it's going to look. If you coloured the white area underneath that black line orange, you'd basically have your MLL graph.
The red/orange/green area underneath the black line is a guide. That's why it goes from red to orange to green: it's kind of like a ruler that you measure the black line up against. If the black line goes so high that it reveals some green, it means it's a good framerate in that portion. If all you see is red, it means you're getting a poor framerate.
Basically, IMO the red/green/orange was a poor design choice that was supposed to be useful, but added a layer of confusion instead.
......
OK, I've taken one of the original .svg files into Illustrator. I've zoomed way in, and found the separate layers:
[url]http://pbrd.co/18WQT92[/url]
There are two graph layers in that SVG. One is the red/orange/green fill, and the other is the black outline, which is superimposed over the top. I've moved the black layer to the right, so you can compare it next to the red/orange/green layer.
As you can see, they are essentially exactly the same. They show exactly the same data, just in different colours. So there can't be an average, minimum, or maximum, because there's just one data stream: a single jagged line that goes across.
And as you can see, it shows a crapload of data. You can zoom in and in and in, and it still shows you intricate data down to the pixel. For all intents and purposes, it's showing you all of the data in as much detail as possible....averages and approximations don't come into play.
My guess is it isn't measuring FPS at all (though it pretends to, for our benefit), but rather the number of milliseconds it takes to spit out a frame. This may also explain why your graphs are fuzzier than mine. Mine have more frames in them, and therefore more data points. So my line ends up looking more dense and 'solid', whereas yours is more sparse and 'fuzzy'.
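If anyone wants to double-check the "two identical layers" thing without Illustrator, here's a rough Python sketch of the idea. It assumes the curves are stored as <path> elements with a "d" attribute and that the file is called "run1.svg" - both of those are my guesses, so adjust for your own files (the benchmark could just as easily use <polyline> elements).
[code]
# Rough sketch: check whether the coloured fill curve and the black outline
# curve in a Metro LL benchmark SVG carry the same point data.
# Assumptions (mine, not the benchmark's): curves are <path> elements with a
# "d" attribute, and the file is named "run1.svg".
import xml.etree.ElementTree as ET

root = ET.parse("run1.svg").getroot()

# Collect every path's geometry, ignoring XML namespaces.
paths = [el.attrib.get("d", "") for el in root.iter() if el.tag.endswith("path")]

print(f"Found {len(paths)} path elements")
for i, a in enumerate(paths):
    for j in range(i + 1, len(paths)):
        if a and a == paths[j]:
            print(f"paths {i} and {j} have identical geometry")
[/code]
If the two big curves come out identical, that settles it: there's only one data stream, just drawn twice in different colours.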
What if it is a combination of those ideas? That there is in fact no average being shown, and that the color is meaningful, not just decoration?
Seems like it makes more sense if we look at the colored area as Min, and the black area as Max.
A wide variance would be stutter, so Helifax's pic at the end would be stuttering from fast to slow.
Mine would be a lot more variability, giving it that fuzzy look, but not exactly stutter. Maybe this would be microstutter.
Yours without the PhysX card I would also expect to be pretty much stutter. Top right of your pic:
[img]http://postachio-images.s3-website-us-east-1.amazonaws.com/700adbbdfac613bd3a6c994890af82a8/eb95c1d7c0ac450dfe4c2303b4c94f96/5ce8453a8f86576a7f5a2fcde7653b31.png[/img]
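To put a rough number on the stutter vs. microstutter distinction, you could look at how much each frame time differs from the one before it. This is just a sketch of that idea in Python, assuming you have the per-frame times in milliseconds; the 25% threshold is an arbitrary figure of mine, not something the benchmark defines.
[code]
# Sketch: quantify frame-to-frame variation ("microstutter") from a list of
# frame times in milliseconds. The 25% threshold is arbitrary - purely for
# illustration, not anything the benchmark itself uses.
def stutter_stats(frame_times_ms):
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    avg_ft = sum(frame_times_ms) / len(frame_times_ms)
    jumps = sum(1 for d in deltas if d > 0.25 * avg_ft)
    return avg_ft, jumps / len(deltas)

# Example: frames alternating 10 ms / 23 ms. The average FPS looks fine,
# but every frame-to-frame transition is a big jump - the classic AFR
# microstutter pattern.
avg, jump_ratio = stutter_stats([10, 23] * 50)
print(f"avg frame time {avg:.1f} ms, {jump_ratio:.0%} of transitions are big jumps")
[/code]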
[quote="bo3b"]Interesting idea, but if that is true, how do you explain helifax's results:
[/quote]Simple. He got lots of stutter in the 2nd half, when all the bullets started firing (his graph looks pretty similar to my top-right one actually - in fact the latter halves are almost identical but for a slightly higher fps on mine and slightly more dips on mine)
Because the black line went up and down so violently and the graph is so small, the black line got so squished that it effectively appeared to create a large black solid mass.
This solid mass obscured much of the red/orange layer below, making the red/orange layer **appear** to be significantly lower than it is.
But if helifax opens up his .svg file and compares the black layer with the red/orange layer, he'll find that they're actually identical, just like mine were here: http://pbrd.co/18WQT92 The only difference is that one is filled in with a orange/red/green gradient, while the other one has zero fill but a black stroke (outline).*
Remember: I moved the black one to the right so you can compare it to the coloured one. What you see here is a very tight zoom over a very small portion of the graph (which you can see by the fact that the colour is almost all green - that only occurs right at the top of the graph)
Even at that tiny zoomed in level the black and coloured lines are identical. And no, I didn't specially choose a non-stuttering part of the graph - I chose a part with a wide black area.
Look, you're sort of right. If we look at the top of the coloured area, it effectively shows us the min, just as the bottom of the white area shows us the max. But these are both just negative spaces. The only data actually plotted on the graph is that jagged black line. And because it goes up and down like crazy, it's tempting to think that there's more going on than just a single line. This is actually easiest to see on yours, because even unzoomed, yours already looks kind of like a zigzag (because I believe it has fewer data points to plot).
* In actual fact, the coloured line has a *blue* stroke in these .svg files. But this blue stroke is not visible, because it is completely obscured by the black line which is overlaid on top of it (and because the black one is identical to the blue one, it covers it up perfectly).
I haven't tried it lately with newer drivers, but when I did that test... I don't remember having FPS dips in the last part. Of course, this doesn't mean stuttering wasn't there (as shown by the graph).
Last I checked, the Metro LL profile uses AFR2 mode, and we all know that AFR mode is prone to micro stuttering.
What I do remember is that I got worse performance in the benchmark than in the actual game... I tested it on purpose on the same D6 last level as in the benchmark...
I still think that all the performance updates that were pushed are found only in the main game and not the benchmark... but I could be wrong :))
1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc
OK, I understand what you are saying. It's a way for them to emphasize the mins, and try to show good/bad with color. (Since top of color is the same as bottom of actual line.)
I attach one of my fuzzy svgs for you to inspect at detail if you like.
[url]http://bo3b.net/MetroLL/SLI%20optimizer%20-%20Run%201.svg[/url]
BTW, in the benchmark folder, there is also a CSV file that seems to be the master data for the graph. It looks like your assumption is right, that they spit out frames as fast as possible, and report the ms delta between frames, then convert that to FPS. In the example I posted, there are 7752 frames reported.
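For anyone who wants to crunch that CSV themselves, something along these lines should do it. I'm assuming one frame-time value in milliseconds per row, which is only a guess from the description above (and "run1.csv" is a placeholder name) - check a few rows of your own file and adjust the parsing.
[code]
# Sketch: read the benchmark's per-frame data and derive the summary numbers.
# Assumption: one millisecond value per row; "run1.csv" is a placeholder name.
import csv

frame_ms = []
with open("run1.csv", newline="") as f:
    for row in csv.reader(f):
        # Skip headers or blank rows; keep anything that parses as a number.
        if row and row[0].strip().replace(".", "", 1).isdigit():
            frame_ms.append(float(row[0]))

total_seconds = sum(frame_ms) / 1000.0
fps = [1000.0 / ms for ms in frame_ms if ms > 0]
print(f"frames:  {len(frame_ms)}")
print(f"avg fps: {len(frame_ms) / total_seconds:.1f}")  # frames over total time
print(f"min fps: {min(fps):.1f}, max fps: {max(fps):.1f}")
[/code]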
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
Good spot with the CSV files. I've looked at mine. It's another win for the PhysX card: with only Titans, I got around 8,600 frames on each run. With Titans + 650, I got around 10,050!
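Quick back-of-envelope on what that frame-count difference means, assuming the run really is about the 3 minutes mentioned earlier (the 180 s figure is my approximation, not an exact benchmark duration):
[code]
# Back-of-envelope: more frames over the same run length = higher average FPS.
# 180 s is an assumption based on the "3 minutes long" comment above.
RUN_SECONDS = 180
for label, frames in [("Titans only", 8600), ("Titans + 650", 10050)]:
    print(f"{label}: ~{frames / RUN_SECONDS:.0f} fps average")
[/code]
That works out to roughly 48 vs 56 fps average, if the two runs really were the same length.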