While marketing might make G-Sync seem like the greatest thing since pre-sliced bread, the reality might be something different (at least for us 'tower' PC users).
If G-Sync were a 'cure-all', Nvidia would be "shooting itself in the foot": where would the incentive to upgrade to more powerful (and expensive) hardware be if their invention smoothed out the stuttering and removed screen tearing from gameplay entirely?
Granted, on the one hand PC gaming CAN be very expensive, and more kids are likely introduced to gaming via a console (living room or handheld/portable) than via a PC outfitted with modern, up-to-date, high-end hardware. If changing/redesigning the hardware a PC uses makes an investment in PC gaming hardware last longer, or at least appear to be less of a "black hole" in the consumer's wallet, it may be attractive to more users. On the other hand, if a gaming PC "build" were viable for the same 8-10 years that consoles seem to last, they'd sell a lot fewer desktop video cards.
The problem Nvidia faces is that G-Sync requires not only an Nvidia GPU but a special Nvidia G-Sync enabled monitor. So while some consumers may finally decide they want a new 'tower', they would also have to purchase a new Nvidia GPU AND a new monitor to go with it to get any benefit. The current trend is that consumers skip the tower altogether and purchase either a laptop or, more likely than not, a tablet or handheld device (most of which don't have Nvidia GPU hardware inside).
Which ultimately MIGHT make Kepler-based laptops more attractive! Laptops are generally, if not almost always, underpowered CPU/GPU-wise and struggle to provide smooth frame rates. The REAL impact would be if Nvidia could convince OEMs to build the G-Sync hardware into Nvidia-powered laptops! Suddenly they'd have something that Intel and AMD don't have - a smooth portable gameplay experience! I can see a real market for that - I'd buy one!
An OEM could couple a very inexpensive AMD mobile CPU with an Nvidia mobile Kepler GPU and a G-Sync enabled IPS panel and market it as a "gaming ultralight" laptop. G-Sync on a laptop could completely change portable/mobile gaming as we know it today.
So I think the real market for G-Sync is not hardcore desktop gamers (be it 2D OR 3D) but the laptop market. We 3D users would benefit as well if a G-Sync enabled laptop could provide a gameplay experience similar to what we get on our high-powered (and costly) desktop systems.
IMHO/YMMV
[quote="Paul33993"]
That may be true with a small percentage of games, but it's much more common for a game to require a doubling of needed power to do 3D @60fps. It'd be great if every game only had a 20 - 40 percent penalty, but that's sure not my experience. It's a rarity.[/quote]
I have thoroughly investigated the performance impact of S3D and have posted results from many games in the old iZ3D forums as well as the MTBS forums. Please find my chart below:
[img]http://www.shahzad.aquiss.com/s3d.png[/img]
http://forum.iz3d.com/viewtopic.php?t=1815
My conclusion was:
"The most important conclusion is that the drop due to S3D on Both ATi and nVidia cards is about the same: 35%"
If you are seeing a bigger performance impact than ~35%, it may be another problem, such as a lack of RAM/video RAM, which S3D needs more of. In that case your FPS will drop dramatically when the GPU hits the memory wall, usually manifesting as stutter and microstutter - which is again interesting given your interest in G-Sync, which is meant to reduce stutter/microstutter. I wonder if this is in fact your problem?
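To put that ~35% number into perspective, here's a quick back-of-the-envelope helper (plain Python; the 35% is just the figure from my old tests above, not a law of nature):
[code]
# Rough arithmetic around a ~35% stereo cost (illustrative only).

def s3d_fps(fps_2d, drop=0.35):
    """FPS you could expect in S3D, given a 2D frame rate and a fractional drop."""
    return fps_2d * (1.0 - drop)

def required_2d_fps(target_3d_fps, drop=0.35):
    """2D frame rate needed to still hit a target frame rate in S3D."""
    return target_3d_fps / (1.0 - drop)

print(s3d_fps(120))          # ~78 fps in 3D from 120 fps in 2D
print(required_2d_fps(60))   # ~92 fps in 2D needed to hold 60 fps in 3D
[/code]
In other words, if the ~35% figure holds, a locked 60fps in S3D needs roughly 92fps of 2D headroom - a long way short of needing double the power.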
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
That's a very small and selective group. And many of them are old. Sorry. I just don't buy it. It's even an old GPU. Maybe that's the problem. Most of those games have lousy framerates even in 2D. Maybe there's excess power to spare due to bottlenecks elsewhere. TR has the best framerate and it's almost halved in 3D. So, I don't know, just not buying it. I could produce a small chart where almost every game had halved performance. That wouldn't prove anything either.
I honestly don't look at FPS much, because I don't wanna know the difference. But in some of the games I spend most of my time in, race sims, all of the ones I play literally double my framerate if I turn it off. And it's not memory, because I have 2GB and it doesn't matter what the settings are. If I turn everything way down, I could get 300 in 2D and 150 in 3D. And if I keep it cranked up (within reason to keep close to 60), it could be 50 - 60 in 3D and 100 - 120 in 2D. (Unless we're talking iRacing, then I can crank everything up and still get 150 - 200 in 3D and 300+ in 2D. That game has low requirements.)
And lol if that's my problem and reason for interest in G-Sync. I've said before I don't run v-sync in 2D (it's mandatory in 3D). Stutter only occurs with v-sync on. I'm interested in G-Sync because in theory, it's the way LCD should have worked from day one. In practice, I'm skeptical. Because I don't understand how you can get perfect motion resolution (like in the text on the swinging pendulum) when the refresh rate is, say, 37Hz. It doesn't make sense to me. LCD has historically sucked at motion resolution. That's why all these HDTVs use the ridiculous interpolation - to get the motion resolution above a pathetic 300 lines.
[quote="Paul33993"]That's a very small and selective group. And many of them are old. Sorry. I just don't buy it. It's even an old GPU. Maybe that's the problem. Most of those games have lousy framerates even in 2D. Maybe there's excess power to spare due to bottlenecks elsewhere. TR has the best framerate and it's almost halved in 3D. So, I don't know, just not buying it. I could produce a small chart where almost every game had halved performance. That wouldn't prove anything either.
I honestly don't look at FPS much, because I don't wanna know the difference. But in some of the games I spend my most time, race sims, all of the ones I play literally double my framerate if I turn it off. And it's not memory, because I have 2GB and it doesn't matter what the settings are. If I turn everything way down, I could get 300 in 2D and 150 in 3D. And if I keep it cranked up (within reason to keep close to 60), it could be 50 - 60 in 3D and 100 - 120 in 2D. (Unless we're talking iracing, then I can crank everything up and still get 150 - 200 in 3D and 300+ in 2D. That game has low requirements.)
And lol if that's my problem and reason for interest in G-Sync. I've said before I don't run v-sync in 2D (it's mandatory in 3D). Stutter only occurs with v-sync on. I'm interested in g-sync because in theory, it's the way LCD should have worked from day one. In practice, I'm skeptical. Because I don't understand how you can get perfect motion resolution (like in the text on swinging pendulum) when the refresh rate is, say, 37hz. It doesn't make sense to me. LCD has historically sucked in motion resolution. That's why all these HDTVs use the ridiculous interpolation. To get the motion resolution above a pathetic 300 lines of resolution.[/quote]
Can you give me a list of games you have problems with? I will try to test those specifically, as from some quick benchmarks, modern games on my rig show only a ~35% drop. Something very strange is going on with your system if you are getting a bigger performance hit.
Also, what has motion resolution got to do with vsync? They are quite different things.
Would you please explain your interpretation of motion resolution, and how it conflicts with vsync/gsync?
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
[quote="RAGEdemon"]bo3b, I see your misconception too.
3D Vision doesn't render 120fps. It renders 60 for the first eye. The other 60 for the other eye are the exact same frames but from a different perspective. This costs the GPU a fraction of the power above generating only 60fps.[/quote]I understand your point, and I also read your prior thread about 120fps gaming. I'm sorry, but this is at least partially incorrect.
NVidia 3D Vision specifically does render two full frames for every 60Hz refresh. It's 120fps worth of work for the GPU, not a fraction of the power as you suggest.
See this whitepaper for how it works in NVidia's automatic mode:
[url]http://www.nvidia.com/content/GTC-2010/pdfs/2010_GTC2010.pdf[/url]
Now there are two different ways of generating 3D: by geometry, or by depth buffer. Depth buffer is typically used by Tridef, and is used by games like Crysis 2 and 3. This technique does in fact take only a small amount of extra GPU power, because the two perspectives are rendered from a single depth buffer as a sort of post-process technique. By most comments here, depth-buffer 3D is a pale second to true geometry rendering.
In geometry rendering, another entire frame is rendered from the other eye's perspective. This allows you to sort of look around things, and fixes the biggest problem with the depth-buffer approach - which is that it tends to make things look like flat planes or billboards.
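To make the cost difference concrete, here's a toy sketch of the two approaches (the function names are made-up placeholders, not any real driver API):
[code]
# Toy comparison: geometry (dual-render) stereo vs. depth-buffer stereo.
# render_scene() and reproject_with_depth() are stand-ins for real work.

def render_scene(camera_desc):
    """Placeholder for a full geometry pass from one camera position."""
    return {"color": f"frame from {camera_desc}", "depth": "depth buffer"}

def reproject_with_depth(frame, eye_offset):
    """Placeholder: shift pixels sideways using the depth buffer (2D+Depth style)."""
    return f"{frame['color']} reprojected by {eye_offset:+.3f}"

SEPARATION = 0.065  # eye separation in scene units (illustrative)

# Geometry stereo: two full passes per displayed frame pair.
left  = render_scene(f"camera shifted {-SEPARATION / 2:+.3f}")
right = render_scene(f"camera shifted {+SEPARATION / 2:+.3f}")

# Depth-buffer stereo: one full pass, second view synthesized as post-process.
mono = render_scene("centre camera")
left2, right2 = (reproject_with_depth(mono, -SEPARATION / 2),
                 reproject_with_depth(mono, +SEPARATION / 2))
[/code]
The dual-render path pays for two full geometry passes per displayed pair; the depth-buffer path pays for one pass plus a comparatively cheap reprojection, which is why it costs so little and why it tends to produce the flat 'billboard' look.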
[s]For your examples of SLI scaling, I'm not sure what is different in my setup either. Typically I get close to 100% improvement in SLI. It's a rare game that gets only a 50% improvement from SLI.[/s] Sorry, you are just talking about the S3D impact, not SLI.
One big difference is our resolutions. I run at 1280x720@120 for a projector, and your results are 1680x1050@120.
Also, your tests were done with a perfectly good CPU for the time, but I would suggest that those earlier test cases were CPU limited, not GPU limited. With SLI, I found that my older CPU was really holding me back.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
In my experience depth-buffered stereo takes close to no FPS hit in "stereo", while the ~30% frame rate hit in "brute-force"/"dual rendering" stereo matches my results as well. This goes for the iZ3D drivers in particular, while I can sometimes get a bigger FPS hit with Tridef. My rig is quite old though.
I have never touched 3D Vision because I will never support a company that behaves like Nvidia does toward its customers, period!
If you get a bigger FPS hit with 3D Vision enabled, it's because it is poorly optimized or because other software (Helix mod - I don't know if it has an impact during actual gaming?) interferes.
@Likay, I understand hating the corporate weasels, but the only person you hurt is yourself. You are missing out on the most impressive and fun gaming I've ever experienced. Expensive, yes, no doubt. But for me at least, worth it.
How about some actual data from today? In the above list, you have Arkham Asylum which I have on hand. I disabled SLI so it's a single GTX 580 GPU. We know that Arkham also has a 60fps cap.
NVidia 3D Vision OFF (yellow highlight area):
[img]http://bo3b.net/bataa/3D%20off.JPG[/img]
NVidia 3D Vision ON (yellow highlight area):
[img]http://bo3b.net/bataa/3D%20on.JPG[/img]
The same graph is for both, idling with 3D ON, then idling with 3D OFF. I set the cursor over the specific areas so you can read the numbers in the flat spots.
Compare frame rates: 57 fps vs. 32 fps
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
[quote="Likay"]In my experience depth-buffered stereo takes close to no FPS hit in "stereo", while the ~30% frame rate hit in "brute-force"/"dual rendering" stereo matches my results as well. This goes for the iZ3D drivers in particular, while I can sometimes get a bigger FPS hit with Tridef. My rig is quite old though.
I have never touched 3D Vision because I will never support a company that behaves like Nvidia does toward its customers, period!
If you get a bigger FPS hit with 3D Vision enabled, it's because it is poorly optimized or because other software (Helix mod - I don't know if it has an impact during actual gaming?) interferes.[/quote]
Are you saying that depth-buffered stereo can match the quality of dual-view rendered stereo, and that Nvidia could do this if they wanted to?
Depth-buffered stereo will never match the quality of dual-view stereo. It's quite OK when playing a game where everything is at a far distance. The technique shows its weakness when you come up close to objects, and the distortions become really disturbing. I'm personally not fond of "depth-only" faraway sceneries, so I skip this option altogether. I can just as well play without glasses.
Nvidia can of course implement this at any time, but IMO it should be the last thing on their to-do list (if they have one).
PS and edit: There seems to be a general impression that Tridef Ignition only supports depth-buffered stereo, which of course is not true. That mode can be used for unconfigured games, or games that need a tremendous effort to fix with profiles.
bo3b,
What results do you get with 3d disabled in the nvidia control panel?
IIRC, there used to be a known issue in the old days where having the 3D vision driver disabled in the control panel produced better performance than having it enabled but toggled off.
What games do you have installed? It would be interesting to do a comparison of different game (engines) to see what effect enabling the 3D vision driver has.
From your preliminary results, I find it curious that the 3D vision driver is indeed less efficient than the iZ3D driver I had tested all those years ago.
My results from the other thread were as follows:
[quote="RAGEdemon"] GPU usages are measured by MSI Afterburner's On Screen Display server...
Painkiller:
120FPS
2D, GPU1 = 60%, GPU2 = 40%. Total GPU = 100
S3D, GPU1 = 33%, GPU2 = 32%. Total GPU = 65
Dirt3:
120FPS
2D, GPU1 = 65%, GPU2 = 65%. Total GPU = 130
S3D, GPU1 = 55%, GPU2 = 55%. Total GPU = 110
Trackmania Forever:
120FPS
2D, GPU1 = 45%, GPU2 = 40%. Total GPU = 85
S3D, GPU1 = 31%, GPU2 = 19%. Total GPU = 50
Crysis 3:
120FPS
2D, GPU1 = 90%, GPU2 = 70%. Total GPU = 160
S3D, GPU1 = 37%, GPU2 = 50%. Total GPU = 87
OK, now, from the above games, we can clearly see that the GPU usage is significantly lower in S3D compared to 2D. If the game were genuinely rendering 120 new frames per second, you would expect the GPU usage to be at least the same, and most probably ~30% higher due to the demand that S3D puts on the GPU.
If I remember correctly, nVidia uses a Z buffer to shift perspective of a given scene. What the 3D vision driver does is it shifts the perspective left, and outputs to the left eye, then it shifts the perspective of the same scene right, and outputs to the right eye. This is of course less GPU intensive than calculating the right eye perspective from a completely new scene.
Yes, it's 120Hz, and 120FPS, but every frame is being cloned, effectively outputting only 60 real new FPS. [/quote]
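To make the arithmetic in that comparison explicit, here is the Crysis 3 case worked through (a rough sketch; the usage figures are just the Afterburner readings quoted above):
[code]
# The numbers behind the GPU-usage argument, using the Crysis 3 readings above.

fps_out         = 120    # frames sent to the display per second
usage_2d_total  = 160    # % of one GPU: 90 + 70, 2D at 120 fps
usage_s3d_total = 87     # % of one GPU: 37 + 50, S3D at 120 Hz output

cost_per_frame_2d      = usage_2d_total / fps_out  # ~1.33 "GPU %" per unique frame
expected_if_120_unique = usage_2d_total            # at least this, if S3D renders 120 new frames
expected_if_60_unique  = usage_2d_total / 2        # ~80, if every frame is reused for both eyes

print(cost_per_frame_2d, expected_if_120_unique, expected_if_60_unique, usage_s3d_total)
[/code]
The observed 87% sits close to the "60 unique frames" estimate, which is the point being argued here. Of course this assumes Afterburner's usage percentages scale linearly with the work done, which is itself debatable - and the Arkham Asylum numbers earlier in the thread point the other way.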
Regarding depth-buffered stereo, of course no one can argue that it compares to real stereo.
But it was also created for the purpose of compatibility, not just performance.
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
[quote="RAGEdemon"]
If I remember correctly, nVidia uses a Z buffer to shift perspective of a given scene. What the 3D vision driver does is it shifts the perspective left, and outputs to the left eye, then it shifts the perspective of the same scene right, and outputs to the right eye. This is of course less GPU intensive than calculating the right eye perspective from a completely new scene.
Yes, it's 120Hz, and 120FPS, but every frame is being cloned, effectively outputting only 60 real new FPS. [/quote]
What you're describing here looks like 2D+Depth, or what Crysis and Tridef "Power3D" do to conserve resources while rendering 3D.
Using Tridef as an example: "Power3D" is a ~25% (or worse) FPS hit and regular Tridef is a ~50% (or worse) FPS hit. (edit: or at least it *seems* like it)
A MUCH better measure of performance is the benchmark built into a few modern titles, OR a graphics-specific benchmark. (anyone up for some 3D Vision on/off number crunching in 3DMark 11, Fire Strike, Valley or the like?)
(edit)
I ran some benchmarks on a number of games that I have installed. However, I was unable to unlink vsync while in 3D mode in all but one of them: Metro 2033.
Test         Avg FPS / score difference, 2D vs 3D
Metro 2033   37%
Heaven       33%
While I'm a bit surprised at the numbers, they seem to validate the ~35% 3D cost mentioned earlier. Granted, it may depend on which "numbers" one looks at. In the Heaven benchmark the max FPS values were 338.1 and 244.3, which equates to an even lower difference (28%) than the 'scores' show (2D = 5143, 3D = 3443, i.e. ~33%).
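For anyone who wants to check the arithmetic, the Heaven differences fall straight out of the raw numbers:
[code]
def drop_pct(v2d, v3d):
    # percentage lost going from 2D to 3D
    return (v2d - v3d) / v2d * 100

print(round(drop_pct(338.1, 244.3), 1))  # max FPS: 27.7 -> the "28%" above
print(round(drop_pct(5143, 3443), 1))    # score:   33.1 -> the ~33%
[/code]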
I think Ragedemon is trying to explain that the Nvidia driver renders a "capture" of the scene by rendering it twice from different camera views, but the left and right eye views are shot at the [b]same moment[/b]. When the driver renders stereo, it captures the necessary parameters and uses those to render both the left and right eye views, the only difference being that the camera is shifted a bit sideways. This gives a true stereo shot. I'm quite certain this method is used in the legacy stereo drivers, because otherwise panning would be unbearable and totally break immersion, especially at lower frame rates. I can't tell if 3D Vision uses the same principle. It would be quite easy to check though: load a game which is very heavy on the graphics (i.e. a slideshow) and pan around a bit. If the immersion is totally broken, the driver does not capture the parameters for each frame (the driver works sequentially). If each image in the "slideshow" shows perfectly rendered stereo, then it renders both views from the same parameters.
I'm lousy at describing this, but I'm pretty sure that this is what he means. There is no question about whether or not Nvidia uses the brute-force technique to render both frames (it does).
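To put the 'same capture, camera nudged a bit sideways' idea in concrete terms, here's a tiny sketch (just the geometry of it, nothing from any actual driver):
[code]
# Toy illustration of building a left/right eye pair from ONE captured camera.
import numpy as np

def translation_view(eye):
    """Bare-bones view matrix that only translates the camera to 'eye'."""
    m = np.eye(4)
    m[:3, 3] = -np.asarray(eye, dtype=float)
    return m

def stereo_pair(center_eye, separation=0.065):
    """Left/right view matrices derived from a single captured camera position."""
    offset = np.array([separation / 2.0, 0.0, 0.0])
    return translation_view(center_eye - offset), translation_view(center_eye + offset)

center = np.array([0.0, 1.7, 5.0])       # the single "captured" camera position
left_view, right_view = stereo_pair(center)
# Both views come from the same moment / same scene state, so panning stays
# coherent even at slideshow frame rates - which is exactly the test described above.
[/code]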
Update with 3D disabled in the control panel. Interestingly enough, even with it disabled, the game would still go into 3D. I had to set the 'no pass' result on the medical test to get it to stop.
Unfortunately the 60 fps game cap makes this less than clear. It idles at 62, which is not 60 but is too close to accept as legit - especially since the GPU usage went from 99% to 93%.
[img]http://bo3b.net/bataa/3d%20disabled.JPG[/img]
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
[quote="bo3b"][quote="RAGEdemon"][i]Once again, if you have a powerful enough GPU that is able to attain FPS at the refresh rate set, you will see no difference with GSync.[/i] [/quote]You seem to be confusing us with other people. No one here is arguing this point.[/quote]I am. And I have no doubt that time will prove me right. I think people are right to be skeptical about just how much of an improvement it will make. But I haven't read any compelling arguments on this thread to convince me that it won't make any difference at all.
And I've yet to read a cogent rebuttal of the point I make in my diagram. It seems clear to me that there'll be a reduction in input lag and an improvement in frame pacing. How much remains to be seen. But to claim that there'll be none at all seems irrational to me.
If G-Sync truly won't make any difference in a 60fps & 60Hz environment, then all those glowing reviews from Carmack et al, from Anandtech and other review sites - from basically everyone who's ever seen it in action - are completely bogus: something akin to mass hallucination. Those people all said they saw unprecedented smoothness. And as I understand it, they were watching a 60fps scenario.
With all due respect to everyone here, I think it makes much more sense to take the word of people who have tried it than the word of people who haven't and are trying to shoot it down with hypothetical arguments just for the sake of cynicism.
Likewise, anything I say is also pure speculation, and should be taken with a grain of salt. But at least my starting point is assuming that all the eyewitnesses are telling the truth, rather than assuming that the eyewitnesses are all wrong and I know better than they do. That seems by far the more reasonable starting point to me.
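Since the disagreement keeps coming back to what happens when frame times hover right around 16.7ms, here's a crude timing sketch of the frame-pacing argument (the frame times are hypothetical, and it ignores swap-chain blocking, panel response and everything else a real setup adds):
[code]
# Crude sketch: the same render times, presented on a fixed 60 Hz v-sync grid
# vs. "shown whenever ready" (G-Sync-style variable refresh).
import math

REFRESH = 1000.0 / 60.0                       # 16.67 ms per refresh at 60 Hz
frame_times = [16.0, 17.5, 16.0, 17.5, 16.0]  # hypothetical GPU render times (ms)

t = 0.0
last_slot = 0
vsync_shown, gsync_shown = [], []
for ft in frame_times:
    t += ft                                            # moment the frame is finished
    slot = max(math.ceil(t / REFRESH), last_slot + 1)  # next free refresh boundary
    last_slot = slot
    vsync_shown.append(slot * REFRESH)                 # snapped to the 60 Hz grid
    gsync_shown.append(t)                              # shown as soon as it is done

print(["%.1f" % v for v in vsync_shown])  # ['16.7', '50.0', '66.7', '83.3', '100.0']
print(["%.1f" % g for g in gsync_shown])  # ['16.0', '33.5', '49.5', '67.0', '83.0']
# On the fixed grid there's a 33 ms hiccup and ~16 ms of extra wait on most frames;
# with variable refresh the on-screen cadence simply tracks the render times.
[/code]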