Warcraft, Left 4 Dead, and Musings: Throw more hardware at the problem?
So, I've been kind of disappointed in the performance of this 3D Stereo technology in World of Warcraft. For many of us, WoW is the premier game and consumes a lot of our gaming time. (/played, anyone?) However, it also seems to be the game that just can't ever get good framerates.
My current setup is:
Intel Core 2 Duo @ 3.6 GHz
8 GB of RAM @ 800 MHz
Vista 64
Gigabyte (gaming) motherboard
XFX overclocked GeForce GTX 260 video card
15 Mb/s Internet connection
I don't point this out to brag; I know many here have even faster computers. However, I do want to quickly illustrate that I'm running a decent machine, certainly a computer that should blow away a game that came out over four years ago. (In fact, TWO COMPUTERS AGO I was playing Warcraft just fine, back in the days of a GeForce 5400 or some such nonsense by today's standards.)
So why am I getting 15 frames per second at times while in 3D mode? What the heck? That performance seems horrible.
I've played with every graphics setting in the game, and while small differences can be observed, I think the bottom line is that WoW suffers from some bad coding (or something). I just don't think this game is utilizing modern hardware the way it should. Hopefully a patch (to either WoW or the nVidia drivers) will alleviate this someday.
On a slightly related note, I downloaded Left 4 Dead tonight after reading about it here (and hearing all of the rave reviews). Thinking that my computer must just be dog-ass slow, I was prepared to turn down settings left and right in an effort to squeak the game into some semblance of a playable state. Much to my surprise (Warcraft has me feeling like I'm running a 386 most days), this game defaulted to the highest resolution, graphical settings, everything. And guess what: it played smooth as butter. Not a single glitch, other than needing to turn off anti-aliasing because of the strange 3D bug we get when AA is turned on. (Not that my computer wasn't handling anti-aliasing; it performed just fine.)
So what the heck? A brand new game can run with FULL SETTINGS, including AA, and perform beautifully. Smooth as silk. And a game that came out four years ago struggles to keep up? I know WoW has pretty graphics, but not that pretty. My computer should blow WoW away, 3D mode or no.
I was actually considering upgrading my power supply, motherboard, and adding a second GTX 260 in SLI mode just to get my frames up in WoW. But now, after playing Left 4 Dead and having an absolute blast tonight on a computer that is entitled to play new games with full settings, I have to say:
... it would be a lot cheaper for Blizzard to simply program WoW better than for us to have to keep throwing more and more hardware at it. I mean seriously, what the heck???
I wouldn't buy more hardware just for WoW since you already have massive overkill for that game. Honestly if you're not getting good framerates with what you have then it's not a hardware issue. I think it's a driver problem. My guess is that there will be some kind of fix for that when they update the drivers next time.
[quote name='Bosko' post='511285' date='Feb 27 2009, 09:58 AM']I wouldn't buy more hardware just for WoW since you already have massive overkill for that game. Honestly if you're not getting good framerates with what you have then it's not a hardware issue. I think it's a driver problem. My guess is that there will be some kind of fix for that when they update the drivers next time.[/quote]
Like Bosko says, adding more hardware will not help that much, although some have reported much better performance using Core i7 processors. As many have pointed out, they actually have better results with the older 8000 series cards than with the 200 line. I have not switched back to my 8800GTS card to try it, because I'm not experiencing the problems they had with the 280's.
I also use the EVGA GTX 260 card and it works perfectly except for the low frame rates. I'm pretty sure switching to a Core i7 CPU would help the performance some, but it would not justify the extra cost for the small amount of performance gain. I would like to, just to see what happens, and may in the future, but can't atm. Wait until we hear from Andrew again (if he ever shows back up) and see if they can tweak the drivers for better performance. I think you will always have to expect about a 50% loss in frame rates even in the best conditions, but many are getting much worse than that. I think Andrew said he was getting about 50 fps in 3D and 100 in 2D. That's easily acceptable performance, but we just can't all seem to get that. Time will tell.
I think it was Andrew that said some mods will slow it down, and he was right. Took me a while to find, but where I was getting 14-19 fps I am now getting 30 in the main cities. The most I've gotten so far is 40 fps. Where I get 120 in 2D, with 3D on I get 40. But hit Ctrl+R then Alt+Z and you will see the fps jump without the HUD; for me they go from 30 to 50. I still get flickering with my 280, but only in the MAIN cities and with weather. My 8800GTX gets the SAME fps, but a tad better with no flicker at all. I'm running a Q6600 OC'd and 4 GB of 1066 RAM. But no amount of hardware is going to change WoW. It's the drivers (3D).
[quote name='rickhtoo' post='511293' date='Feb 27 2009, 03:27 PM']I think you will always have to expect about a 50% loss in frame rates even in the best conditions, but many are getting much worse than that. I think Andrew said he was getting about 50 fps in 3D and 100 in 2D. That's easily acceptable performance, but we just can't all seem to get that. Time will tell.[/quote]
Just as an aside, there really isn't any reason to expect frame rates to drop by 50% when using stereo 3D. Imagine, for example, you were getting 120fps in 2d so your graphics card was rendering the scene 120 times each second. Now compare that to running at 120fps in 3d. Your graphics card is doing exactly the same amount of work, rendering the scene 120 times per second (so your in-game frame rate counter should still read 120). The only difference is that 60 of the frames were rendered with the camera in one position (for your left eye), and 60 frames were rendered with the camera offset slightly to the side (for your right eye). If you think about it, this is just the same as if you were playing in 2d and stood on one spot for half a second then took a tiny sidestep right and stood there for another half a second. If the 2d driver can handle that at 120fps then there's no reason why the 3d one shouldn't be able to too.
In earlier versions of the stereo 3d driver the frame rate hit in 3d tended to be less than 25%, so it's not clear to me why it seems to have got worse generally. I'd be interested to hear an explanation from Andrew.
Cheers,
DD
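Just to make DD's arithmetic concrete, here's a minimal Python sketch of the alternate-frame model he describes. The frame rate and camera offset are made-up illustrative numbers, not anything from nVidia's actual driver:
[code]
# Toy model of page-flipped stereo: the GPU renders the same number of
# frames per second either way; in 3D, alternate frames simply use a
# camera shifted slightly to one side.

FRAMES_PER_SECOND = 120   # assumed GPU throughput for this scene
EYE_OFFSET = 0.03         # hypothetical sideways camera shift per eye

def camera_offsets(stereo, n_frames):
    """x-offset of the camera for each rendered frame."""
    if not stereo:
        return [0.0] * n_frames                        # 2D: one fixed camera
    return [-EYE_OFFSET if i % 2 == 0 else EYE_OFFSET  # 3D: alternate eyes
            for i in range(n_frames)]

for stereo in (False, True):
    offsets = camera_offsets(stereo, FRAMES_PER_SECOND)
    per_eye = FRAMES_PER_SECOND // 2 if stereo else FRAMES_PER_SECOND
    print(f"stereo={stereo}: {len(offsets)} frames rendered per second, "
          f"{per_eye} updates per eye")
[/code]
Either way the card renders 120 frames a second; only the camera position changes between frames, which is exactly the sidestep analogy above.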
[quote name='Zeeblade' post='511313' date='Feb 27 2009, 12:10 PM']I think it was Andrew that said some mods will slow it down, and he was right. Took me a while to find, but where I was getting 14-19 fps I am now getting 30 in the main cities. The most I've gotten so far is 40 fps. Where I get 120 in 2D, with 3D on I get 40. But hit Ctrl+R then Alt+Z and you will see the fps jump without the HUD; for me they go from 30 to 50. I still get flickering with my 280, but only in the MAIN cities and with weather. My 8800GTX gets the SAME fps, but a tad better with no flicker at all. I'm running a Q6600 OC'd and 4 GB of 1066 RAM. But no amount of hardware is going to change WoW. It's the drivers (3D).[/quote]
Yes, that is true, some mods affect it more than others. I'm pretty sure that XPerl is the one that causes the biggest drop on mine. However, I don't have the raid options enabled, and my guess is that with those it would take a major hit. Or something like DBM would cause a huge drop. And of course the cities will cause a major decrease in fps; that's normal even in 2D. The more players around you, the bigger the drop. I would be totally happy if I could get 50 fps in 3D normally, even down to 25 in cities. I'm not sure I made the right choice when I upgraded to the GTX 260; I would have been better off sticking with the 8800 and switching to the Core i7 processor. When upgrading a video card it's not worth it unless you go for the high end of the line, as the low end is usually about the same as the upper end of the previous generation. I should have known better; it's not the first time I've made that mistake. Here's hoping they can tweak a bit more out of the drivers, but I'm not exactly holding my breath...
[quote name='SpyderCanopus' post='511342' date='Feb 27 2009, 06:10 PM']That's not true. It's rendering from a different angle so it takes a hit.[/quote]
If you're playing in 2d and your character is moving then each frame will also be rendered from a different position to the previous one. But framerates in 2d don't drop by 50% every time you move. What's the difference?
Cheers,
DD
[quote name='DickDastardly' post='511323' date='Feb 27 2009, 12:36 PM']Just as an aside, there really isn't any reason to expect frame rates to drop by 50% when using stereo 3D. Imagine, for example, you were getting 120fps in 2d so your graphics card was rendering the scene 120 times each second. Now compare that to running at 120fps in 3d. Your graphics card is doing exactly the same amount of work, rendering the scene 120 times per second (so your in-game frame rate counter should still read 120). The only difference is that 60 of the frames were rendered with the camera in one position (for your left eye), and 60 frames were rendered with the camera offset slightly to the side (for your right eye). If you think about it, this is just the same as if you were playing in 2d and stood on one spot for half a second then took a tiny sidestep right and stood there for another half a second. If the 2d driver can handle that at 120fps then there's no reason why the 3d one shouldn't be able to too.
In earlier versions of the stereo 3d driver the frame rate hit in 3d tended to be less than 25%, so it's not clear to me why it seems to have got worse generally. I'd be interested to hear an explanation from Andrew.
Cheers,
DD[/quote]
Are you saying that the earlier version of the 3D Vision driver, using the same technology with the 120 Hz monitor and dual DVI, was better than it is now? Or using the earlier technology? 50% seems right to me, as now you have one GPU doing the work of two video cards. I may be wrong though; I hope so. I would love to see much better frame rates than we have now.
[quote name='rickhtoo' post='511386' date='Feb 27 2009, 08:24 PM']Are you saying that the earlier version of the 3D Vision driver, using the same technology with the 120 Hz monitor and dual DVI, was better than it is now? Or using the earlier technology? 50% seems right to me, as now you have one GPU doing the work of two video cards. I may be wrong though; I hope so. I would love to see much better frame rates than we have now...[/quote]
I was talking about earlier versions of nVidia's 3d drivers (when they were called "Consumer 3D Stereo" rather than "GeForce 3D Vision"). The 3d framerate hit was then often less than 25%. The dual DVI cable is a red herring: from the point of view of the graphics card there isn't really any difference between rendering 120fps via dual DVI to the new 120Hz monitors or rendering 120fps via a single VGA cable to a 120Hz CRT monitor. It still has to render the same number of frames in the same period.
My main point, though, is that enabling 3d shouldn't make much difference to framerates - all that's changing is the position of the camera in each frame and that would be changing anyway if your character was moving.
Cheers,
DD
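DD's point that the cable is a red herring can be sanity-checked with a crude workload model. The resolution and refresh numbers below are assumptions for illustration; note that the output link never enters the calculation, which is the point:
[code]
# Crude proxy for GPU render workload: pixels shaded per second.
# The cable (dual-link DVI vs. VGA) only carries finished frames,
# so it doesn't appear anywhere in the calculation.

def pixels_per_second(fps, width, height):
    return fps * width * height

new_lcd_dvi = pixels_per_second(120, 1680, 1050)  # 120 Hz LCD, dual-link DVI
old_crt_vga = pixels_per_second(120, 1680, 1050)  # 120 Hz CRT, VGA

print(f"{new_lcd_dvi:,} vs {old_crt_vga:,} pixels/s")  # 211,680,000 either way
[/code]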
There is definitely a hit to the FPS, but it's not 50% from what I've seen. I still average over 60fps in Left4Dead, and I believe I averaged around 80 before using the 3D.
My framerates in WoW seem to vary by location. In Dalaran I've seen it as low as 9 fps. In Desolace I'll get a steady 45+ fps. In Grizzly Hills it's about 25 fps. This is in 3D with a couple of basic add-ons running. It turns out that the areas that take a bigger frame hit are also the areas more prone to flickering.
[quote name='Bosko' post='511421' date='Feb 27 2009, 04:34 PM']There is definitely a hit to the FPS, but it's not 50% from what I've seen. I still average over 60fps in Left4Dead, and I believe I averaged around 80 before using the 3D.[/quote]
That I can believe, but when you start talking about WoW it's a different story. If there were no such thing as addons, we would all have better results. Normally there would not be a lot of difference in fps between 2D and 3D, but when you factor in the addons, it really causes issues. With most games they don't have to worry about that factor, but WoW is giving them fits. They simply can't deal with the variety of addons out there, as everyone has different tastes in them. Hopefully they can get it better, but I doubt we will ever see the kind of performance we would really like to. At least not in WoW.
[quote]it would be a lot cheaper for Blizzard to simply program WoW better than for us to have to keep throwing more and more hardware at it. I mean seriously, what the heck???[/quote]
Here is the MOST POWERFUL graphics technology in the WORLD! HOLY JESUS!
[img]http://www.keithpalmer.ca/images/BrainAd2.jpg[/img]
LOL grasshopper! Have you ever heard of RUBE GOLDBERG? Why should they make something that works great for you when they can keep selling you snake oil and complications?
I remember back on an old Amiga in 1985, some computer magazine issued a challenge saying no way, Jose, could the Amiga do fast 3D-type animation, so this kid programmed some 3D stuff in assembly or pure machine code and made the computer magazine people eat their shoe.
One of my old programmer friends really hates what is happening in the software world; he always sends me statistics on these new high-level languages, stuff like Python or Perl, which can be 20 times slower than lower-level code.
No doubt if all developers were like NEO from the MATRIX and saw machine code in their dreams, things would be faster. Just turn off your screen, close your eyes, and imagine getting eaten by a grue - what graphics card can compare to that?
[quote name='DickDastardly' post='511323' date='Feb 27 2009, 12:36 PM']Just as an aside, there really isn't any reason to expect frame rates to drop by 50% when using stereo 3D. Imagine, for example, you were getting 120fps in 2d so your graphics card was rendering the scene 120 times each second. Now compare that to running at 120fps in 3d. Your graphics card is doing exactly the same amount of work, rendering the scene 120 times per second (so your in-game frame rate counter should still read 120). The only difference is that 60 of the frames were rendered with the camera in one position (for your left eye), and 60 frames were rendered with the camera offset slightly to the side (for your right eye). If you think about it, this is just the same as if you were playing in 2d and stood on one spot for half a second then took a tiny sidestep right and stood there for another half a second. If the 2d driver can handle that at 120fps then there's no reason why the 3d one shouldn't be able to too.
In earlier versions of the stereo 3d driver the frame rate hit in 3d tended to be less than 25%, so it's not clear to me why it seems to have got worse generally. I'd be interested to hear an explanation from Andrew.
Cheers,
DD[/quote]
Play in 3d and each 'frame' now consists of a stereo pair. It's going to take a bunch more GPU horsepower to render this, of course. Like, double. There may be some slight gain from not having the CPU do all the setup work twice (or maybe it does?), but a 50% performance drop should by no means be unexpected. Look at the pixel count per frame (normal versus stereo-pair frame) for 1680x1050: 1,764,000 pixels in 2d, 3,528,000 for a 3d pair frame. 2560x1600 is only 4,096,000 pixels, so stereo at 1680x1050 is pretty close. Much more demanding than 1920x1200 (2,304,000 pixels), even.
If you really want to inflate the FPS count, just double what the screen is displaying to get the precise number actually being rendered. The number is misleading, though. If someone says they are getting 60 FPS in 3d but came to that number by counting each eye's frames and listing the total, the experience may be choppy since each eye is only getting 30 FPS. See?
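The pixel arithmetic above checks out; here's a quick Python script reproducing the counts from the post (resolutions taken straight from it):
[code]
# Pixels shaded per displayed frame, 2D vs. a full stereo pair.
cases = {
    "1680x1050, 2D":          1680 * 1050,
    "1680x1050, stereo pair": 1680 * 1050 * 2,
    "1920x1200, 2D":          1920 * 1200,
    "2560x1600, 2D":          2560 * 1600,
}
for name, px in cases.items():
    print(f"{name:24s} {px:>9,d} pixels")
# 1680x1050, 2D            1,764,000 pixels
# 1680x1050, stereo pair   3,528,000 pixels
# 1920x1200, 2D            2,304,000 pixels
# 2560x1600, 2D            4,096,000 pixels
[/code]
So if the driver really does render both eyes for every displayed frame, stereo at 1680x1050 costs nearly as much per frame as 2D at 2560x1600, and a drop toward 50% is what you'd expect.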