Well, all I can say is that Asylum runs flawlessly on my current system. No stuttering, 60 fps (120 in stereo) with no dips during fights, as shown in the graph. Minimum frame rate is 120 for me in this game, running at 1600x900 with everything on. This is DX9, the default for this game.
I left PhysX enabled at max, because my goal was to stress the GPU enough to make it matter.
The big difference in my test is that it's idling at a single view: Batman looking up the elevator. If I can get a single spot to perform well, it's more likely I can get it to work in general.
The comments about it not being optimized, and just a port, and on and on are just the usual internet rubbish. The game runs completely fine.
The reason I chose Arkham is that your tests showed you also have the game, so you can directly test it and compare with my results and, more importantly, with your old results. Based on your system specs, you should be able to get the same results as my system.
I actually think it's a pretty good benchmark for our conversation here, because at anything short of high resolutions it easily runs fast enough to be smooth. I only get dips in frame rate at resolutions that are outrageous for my system.
It depends upon what you are trying to measure, though. I was trying to demonstrate that there is a 2x performance loss for going into 3D, as long as you aren't bottlenecked elsewhere. Metro, for example, is therefore not as good a test because it's CPU heavy. Arkham is graphically heavy, but not CPU heavy.
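To make the arithmetic concrete, here's a rough sketch of the expectation I'm testing against (illustrative numbers only, not figures from my logs): if the GPU is the only limit, stereo doubles the rendering work, so the 3D frame rate should come out at half the 2D one, a 50% cost.
[code]
# Rough sketch of the expected stereo cost when the GPU is the only bottleneck.
# Illustrative numbers only, not measured results.

def expected_stereo_fps(fps_2d):
    # 3D Vision renders a second view per frame, so GPU work roughly doubles.
    return fps_2d / 2.0

def stereo_cost_percent(fps_2d, fps_3d):
    # Frame-rate loss when switching from 2D to 3D, as a percentage.
    return (fps_2d - fps_3d) / fps_2d * 100.0

fps_2d = 120.0                        # hypothetical GPU-bound 2D result
fps_3d = expected_stereo_fps(fps_2d)  # -> 60.0
print(stereo_cost_percent(fps_2d, fps_3d))  # -> 50.0
[/code]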
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 - Latest 3Dmigoto Release - Bo3b's School for ShaderHackers
[quote="bo3b"]
It depends upon what you are trying to measure, though. I was trying to demonstrate that there is a 2x performance loss for going into 3D, as long as you aren't bottlenecked elsewhere. Metro, for example, is therefore not as good a test because it's CPU heavy. Arkham is graphically heavy, but not CPU heavy.[/quote]
I think your testing is sound even if I disagree with some of your conclusions.
If we want to test the impact of different CPUs and their speeds, we'd use very low or medium graphics settings to ensure that the GPU is not limiting the frame rate. Conversely, if we want to test the impact of a graphics setting on the GPU, we'd want to load the GPU up as much as we can and then measure the effect(s) of toggling just that one setting.
Your >=50% impact of 3D Vision is in line with everything I've read about Nvidia's 3D Vision actually rendering two distinct viewpoints/frames (vs. less GPU-intensive solutions).
My disagreement stems from my thinking that a ~35% cost would be a better and more desirable result than >=50%.
No real disagreement there; I'd like 35% instead of 50% too, as long as we didn't get to 35% by capping the 2D top end. Because of the 2x hit for 3D, anything less than 50% is going to be demonstrating a different bottleneck.
This is why I'm not sure we can use Metro as a test: even running at low resolution I had cases where the CPU was the limit. I'll see if I can find a solid test case there.
The whole trend toward multiple cores put a cap on our CPU performance. Moore's law went out the window once they decided to do multiple cores instead of actual single-thread performance. Metro LL is better, but Metro 2033 uses a single core to the max, and so I've seen it limit performance.
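To put rough numbers on that (a sketch with made-up figures, not my actual measurements): a purely GPU-bound 150fps 2D run should land near 75fps in 3D. If the drop is only ~35%, to roughly 97fps, that tells me the 2D number was being held back, or capped, by something other than the GPU.
[code]
# Sketch: judge whether a measured 2D -> 3D drop looks purely GPU-bound.
# The threshold and the sample numbers are illustrative assumptions.

def stereo_cost(fps_2d, fps_3d):
    return (fps_2d - fps_3d) / fps_2d

def likely_limit(fps_2d, fps_3d, tolerance=0.05):
    if stereo_cost(fps_2d, fps_3d) >= 0.5 - tolerance:
        return "2D run looks GPU-bound: stereo doubles the work (~50% cost)"
    return "2D run was probably capped or limited elsewhere (CPU, vsync, engine)"

print(likely_limit(150, 75))  # ~50% cost
print(likely_limit(150, 97))  # ~35% cost
[/code]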
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 - Latest 3Dmigoto Release - Bo3b's School for ShaderHackers
Hmm... maybe a GPU-bound demo like [url]http://www.geforce.com/games-applications/pc-applications/rocket-sled[/url] might be useful, or [url]http://www.geforce.com/games-applications/pc-applications/Stone-Giant-Demo[/url], or plain old [url]http://www.geforce.com/games-applications/pc-applications/Unigine-Heaven-2[/url].
The interesting features are: automatic/scripted so it's repeatable, not CPU dependent so that only the GPU load/FPS cost is measured, and free to install and use so multiple people and data points can be collected.
Here's my Metro2033 results:
2D no vsync:
[img]http://farm4.staticflickr.com/3718/11547989913_d0214434b9_z.jpg[/img]
3DVision no vsync:
[img]http://farm4.staticflickr.com/3799/11547875844_69da47e7b6_z.jpg[/img]
Actually, except for the unexplained speedup/spike at the end of the run, the 3D Vision impact is much worse than 35%, and the lowest frame rate is still unacceptable in both tests, even for this old game on modern GPU hardware.
Thanks for taking a look.
Strange anomaly at the end. Probably worth a retest to see.
Outside of that, the early part of the test shows roughly 50% scaling, from 150fps in 2D down to 75fps in 3D.
The Min they report is unfortunately bogus, because they take one-off spurious problems and keep them as the Min. We are after a sort of average of the minimum, not necessarily the worst-case scenario that can happen during loading.
I'm also not sure how to measure those downward spikes in 3D, where it's a legit min/stall of some form (15s in on the 3D graph). The low spikes at 35s on the 2D graph are probably not a concern; this benchmark has some stutters that aren't seen in game.
Seems like the actual running Min for 2D is 70fps, and for 3D it's maybe 40fps, depending on what we count.
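If anyone wants to compute that kind of "average of the min" themselves, here's a rough sketch of how I'd pull it from a frame-time log (the file name and format are placeholders; it just assumes one frame time in milliseconds per line, not any particular tool's output):
[code]
# Sketch: an average frame rate plus a "1% low" style minimum that isn't
# dominated by a single one-off loading hitch.
# Assumes a plain text log with one frame time in milliseconds per line.

def load_frame_times_ms(path):
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def avg_fps(times_ms):
    return 1000.0 * len(times_ms) / sum(times_ms)

def one_percent_low_fps(times_ms):
    # Average fps over the slowest 1% of frames.
    slowest = sorted(times_ms, reverse=True)[:max(1, len(times_ms) // 100)]
    return 1000.0 * len(slowest) / sum(slowest)

times = load_frame_times_ms("frametimes.txt")  # hypothetical log file
print(avg_fps(times), one_percent_low_fps(times))
[/code]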
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 - Latest 3Dmigoto Release - Bo3b's School for ShaderHackers
[quote="mike_ar69"][quote="TsaebehT"]Who else remembers typing this after every new GPU purchase? :)
timedemo 1
map demo1.dm2[/quote]
Me! :-)[/quote]
Hear, hear :)
I still maintain that gsync will probably improve animation in a max-framerate scenario (though I also maintain that it's largely a moot point, since a steady 120fps at max settings is unfortunately a very rare thing in most new games, even on high-end hardware such as mine).
I still haven't found any conclusive evidence either way. Tech Report (arguably the birthplace of the internet's recent fascination with frame timings) recently wrote a good article about gsync, with slow-mo video comparisons. Unfortunately, they didn't actually measure a max-framerate scenario. Still, they veered close to it in the quote below, where the author greatly praises the difference gsync makes in an [i]almost[/i]-max-framerate scenario (which suggests to me that gsync probably also improves the experience in a purely-max-framerate scenario).
The guy who wrote these words has intimate experience with the benefits of 120 or 144Hz displays, so his over-the-top enthusiasm can't be chalked up to a lack of familiarity with high refresh rate monitors.
http://techreport.com/review/25788/a-first-look-at-nvidia-g-sync-display-tech/4
[quote]....I played [Borderlands 2 on various cards, including] a GeForce GTX 780 Ti, where frame rates went as high as 120-144 FPS. Holy crap, it was awesome in every case, and more so with the faster graphics cards.
I tend to have a wee bit of an addictive personality, and playing BL2 with G-Sync's creamy smoothness fed that tendency in wonderful and dangerous ways........The speed and fluidity of that game on this G-Sync display is like a combo meth/crack IV drip.[/quote]
Regardless, the bottom line is that - as long as it works well in 3D - gsync will surely be a good improvement. Few of us ever see rock solid max framerates anyway, unless we turn down settings or play old games. So, most of the time, if not all of the time, gsync will help.
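For what it's worth, here's the simple model in my head of why gsync could still matter even near the top (a simplified sketch that ignores buffering details; the numbers are illustrative): with a fixed refresh and vsync, a finished frame has to wait for the next refresh boundary, so a frame that just misses the deadline is held for a whole extra interval, while gsync displays it as soon as it's ready.
[code]
# Simplified sketch: displayed frame interval under fixed-refresh vsync vs g-sync.
# Ignores buffering details; purely to illustrate the quantization effect.
import math

def vsync_display_ms(frame_ms, refresh_hz):
    # With vsync, a finished frame waits for the next refresh boundary.
    interval = 1000.0 / refresh_hz
    return math.ceil(frame_ms / interval) * interval

def gsync_display_ms(frame_ms, max_refresh_hz):
    # With g-sync, the panel refreshes when the frame is ready,
    # down to the panel's shortest refresh interval.
    return max(frame_ms, 1000.0 / max_refresh_hz)

for frame_ms in (7.5, 9.0, 12.0):
    print(frame_ms, vsync_display_ms(frame_ms, 120), gsync_display_ms(frame_ms, 120))
[/code]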
Techreport does sound good.
/\/\
Not sure why you're expecting G-Sync to work with 3D Vision using shutter glasses, though. The specs for 3D Vision list 100 and 120Hz as the supported rates; that's not variable at all. And you certainly couldn't drop to 60Hz on the glasses (to accommodate drops to 30fps) without it being torture for the user, which probably explains why Nvidia won't bother to redesign the glasses to sync at variable refresh rates.
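To put rough numbers on that (just the arithmetic, assuming the glasses alternate eyes once per displayed refresh): each eye only sees every other refresh, so per-eye flicker is half the panel rate. 120Hz is 60Hz per eye and 100Hz is 50Hz per eye, which is about as low as the glasses go; letting the panel drop toward 60Hz would mean 30Hz flicker per eye.
[code]
# Sketch: per-eye flicker rate with active shutter glasses.
# Assumes the glasses alternate eyes once per displayed refresh.

def per_eye_hz(panel_refresh_hz):
    return panel_refresh_hz / 2.0

for hz in (120, 100, 60):
    print(hz, "Hz panel ->", per_eye_hz(hz), "Hz per eye")
# 60 Hz panel -> 30 Hz per eye: visible flicker, basically unusable
[/code]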
Been googling some CES stuff lately (in anticipation), and it seems there are some cheap 4K HDTVs coming that have fantastic passive 3D. A high-resolution passive monitor that did G-Sync would be very awesome.
As much as I like 3D Vision, I can't lie and say my eyes don't thank me when I go through periods with no 3D gaming. So I'd be all over this solution.
Well, ATI have just revealed something they've had for 3 generations of cards that is basically G-Sync for free. Another high-end fail for us Nvidia buyers, then.....
[url]http://techreport.com/news/25867/amd-could-counter-nvidia-g-sync-with-simpler-free-sync-tech[/url]
[quote="ZedEx48K"]Well ATI have just let known about something they've had for 3 generations of cards, that is basically Gsync for free, another high-end fail for us Nvidia buyers then.....
[/quote]
lol, free? Please explain how all AMD users will get dynamically synced refresh rates on their existing fixed-refresh monitors for free. This article was already posted here and seems like a big joke:
https://forums.geforce.com/default/topic/670119/the-geforce-lounge/amd-could-counter-nvidias-g-sync-with-simpler-free-sync-tech/
Well, it looks like AMD, and not Nvidia, has at least shown something minimally related to 3D.
http://www.mtbs3d.com/index.php?option=com_content&view=article&id=13658:amd-talks-vr-at-ces-2014&catid=35&Itemid=73
Basically, they are trying to make their own holodeck with Oculus, and also working on 3d sound.
All hail 3d modders DHR, MasterOtaku, Losti, Necropants, Helifax, bo3b, mike_ar69, Flugan, DarkStarSword, 4everAwake, 3d4dd and so many more helping to keep the 3d dream alive, find their 3d fixes at http://helixmod.blogspot.com/ Also check my site for spanish VR and mobile gaming news: www.gamermovil.com
Oh god.. yeah.
AsRock X58 Extreme6 mobo
Intel Core-i7 950 @ 4ghz
12gb Corsair Dominator DDR3 1600
ASUS DirectCU II GTX 780 3gb
Corsair TX 950w PSU
NZXT Phantom Red/Black Case
3d Vision 1 w/ Samsung 2233rz Monitor
3d Vision 2 w/ ASUS VG278HE Monitor
timedemo 1
demomap demo1.dm2
400fps with what I think was a GeForce4 Ti 4600 :p
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.