3D vision & GTX 470
[quote name='francomg' post='1040252' date='Apr 15 2010, 11:15 AM']For those of you who already have a GTX 480/470, single or SLI, is it true that you can turn on 8x or 4x AA without losing that much frame rate???

I mean, let's pick two of the most GPU-demanding games for 3D Vision, Metro 2033 and Just Cause 2. I have GTX 280 SLI and I play those games at an average of 30fps, 1680 x 1050, all settings maximized, but AA is off. AF I leave at 16x because it doesn't affect the frame rate. I always turn Vsync on, always, because I hate tearing...
If I turn on 2x or 4x AA in those games they just become unplayable, all the way.

How does the new Fermi handle anti-aliasing?? I read that if you play a game with AA off and your FPS count is 40, for example, turning AA up to 8x costs you less than 10fps at most. Is that true?? With GTX 280, SLI or not, AA kills the frame rate in some games, not all, but the graphically demanding ones.

I was a little bit curious about that. I'm just waiting to get two GTX 480s to replace my 280s. I guess I'll gain at least 70% better performance.

Anyway, do you guys actually notice a difference with AA at 8x or 4x compared to AA off in 3D?? I don't see it. In 2D the difference was obvious, but in 3D I've never noticed aliasing at all.
Even when I was playing Crysis in 2D last year, I never noticed any difference between 4x, 8x, or 16xQ AA; it was all the same to me, despite the huge drop in performance. In 2D I could tell a huge difference between AA off and 2x or 4x, but beyond that I couldn't tell any difference.[/quote]
Yes, overall GF100 handles 8xAA better than GT200 thanks to the ROP enhancements Nvidia made this time around, so there is less of a performance drop when using 8xMSAA, or the additional CSAA modes on top of MSAA, than on previous parts.

The big key, though, is 3D Vision in SLI: with the 60FPS-per-eye cap you have a lot of performance headroom to crank up eye candy with virtually no performance hit. This only really applies to games that are already capping out at 60FPS per eye; some of the more demanding games you've mentioned will not sustain 60FPS at 1080p, so enabling 8xAA over 4xAA will still cost performance, just not nearly as much as with GT200 in SLI. But back to my point: with Vsync in SLI you may see both of your GPUs sitting somewhere around 60-70% utilization, which means another 30-40% of headroom on each GPU before you start seeing FPS drops, and that headroom can be spent on AA. With a single GTX 480 you may see utilization very close to 100%, so not only are you getting less than 60FPS (closer to 40-50, maybe), but enabling higher AA will also drop your performance further.
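To make the headroom reasoning above concrete, here is a rough back-of-the-envelope sketch (illustrative only; the function names and AA cost numbers are made up for the example, and real GPU scaling is not this linear):

```python
# Sketch of the vsync-headroom reasoning: with 3D Vision on a 120Hz display,
# Vsync caps output at 60FPS per eye, so a GPU reading below 100% utilization
# has spare capacity that can be spent on heavier AA without dropping frames.

def headroom_pct(gpu_utilization_pct: float) -> float:
    """Spare GPU capacity while Vsync-capped, as a percentage."""
    return max(0.0, 100.0 - gpu_utilization_pct)

def likely_holds_60fps(gpu_utilization_pct: float, extra_aa_cost_pct: float) -> bool:
    """Crude check: does the estimated extra AA cost fit inside the headroom?"""
    return extra_aa_cost_pct <= headroom_pct(gpu_utilization_pct)

# SLI case from the post: ~65% utilization leaves ~35% headroom, so an AA
# step estimated to cost ~30% more GPU time should still hold 60FPS per eye.
print(likely_holds_60fps(65.0, 30.0))  # True
# Single-GPU case: near-100% utilization leaves nothing to spend on AA.
print(likely_holds_60fps(95.0, 30.0))  # False
```

In practice you would read the utilization number from a monitoring tool (EVGA Precision in this thread) rather than guess it, and AA cost varies per game, so treat this as the shape of the reasoning, not a predictor.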

I use EVGA Precision to monitor various GPU vitals while gaming to get an idea of how the GPUs are performing and where additional eye candy can be enabled in certain games. Having gone through 2x GTX 280 SLI > 1x GTX 480 > 2x GTX 480 SLI, I can confidently say now that two GF100s are ideal for 1080p 3D Vision. I think 2x 470 would also give a similarly excellent experience, somewhere in between GTX 280 SLI and GTX 480 SLI of course. Even so, there are some games that won't cap at 60FPS per eye, like Metro 2033 or BFBC2 with HBAO and AA enabled. Just Cause 2 isn't quite there either, but it's in the 45-50FPS range (demo version). For most others it's perfect. Dragon Age: Origins with 8xAA + 4x TrSSAA, both eyes capped at 60FPS and the camera as fluid as in an FPS, is simply amazing. Same for AC2....running roof to roof across Venice and dropping into the most crowded squares, capped at 60FPS, is completely different from gaming on GTX 280 SLI and getting ~30-35FPS.

-=HeliX=- Mod 3DV Game Fixes
My 3D Vision Games List Ratings

Intel Core i7 5930K @4.5GHz | Gigabyte X99 Gaming 5 | Win10 x64 Pro | Corsair H105
Nvidia GeForce Titan X SLI Hybrid | ROG Swift PG278Q 144Hz + 3D Vision/G-Sync | 32GB Adata DDR4 2666
Intel Samsung 950Pro SSD | Samsung EVO 4x1 RAID 0 |
Yamaha VX-677 A/V Receiver | Polk Audio RM6880 7.1 | LG Blu-Ray
Auzen X-Fi HT HD | Logitech G710/G502/G27 | Corsair Air 540 | EVGA P2-1200W

#31
Posted 04/15/2010 04:51 PM   
Thanks for the input, mate!!!

But for you owners of the GTX 480/470: do you actually notice better eye candy with AA turned on and 3D Vision enabled??? I really don't see any aliasing with 3D enabled. In 2D I loved AA and always used it to eliminate aliasing, but beyond 4x I couldn't tell the difference at all.
I never understood the 16xQ AA that Crysis had, never....

I thought EVGA Precision was bad for 3D Vision, causing flickering and other bugs. I read a lot of comments here about that issue with EVGA Precision.
I always had it installed, but after I bought 3D Vision and read the comments I uninstalled it. I actually miss it because I don't trust auto fan control very much. With Precision you can set the fan to 100% and have no worries while gaming...

Windows 7 Home Premium 64 Bits - Core i7 2600K @ 4.5ghz - Asus Maximus IV Extreme Z68 - Geforce EVGA GTX 690 - 8GB Corsair Vengeance DDR3 1600 9-9-9-24 (2T) - Thermaltake Armor+ - SSD Intel 510 Series Sata3 256GB - HD WD Caviar Black Sata3 64mb 2TB - HD WD Caviar Black 1TB Sata3 64mb - Bose Sound System - LG H20L GGW Blu Ray/DVD/CD RW - LG GH20 DVD RAM - PSU Thermaltake Toughpower 1000W - Samsung S27A950D 3D Vision Ready + 3D HDTV SAMSUNG PL63C7000 3DTVPLAY + ROLLERMOD CHECKERBOARD

#33
Posted 04/15/2010 08:22 PM   