A simple trick to remove (or at least dramatically decrease) stuttering in WD is to decrease Shadows quality. I've decreased shadows from Ultra to Medium and the game now runs nicely.
Note that I'm running textures on High (SLI GTX 680, 2GB VRAM) and everything else at max. I guess this could also help with Ultra textures when you have 3GB of VRAM or more.
I always reduce shadows and find performance increases in pretty much every game. (Also 2GB VRAM.)
In WD and AC4, I think medium looks better than ultra! I find ultra too sharp and bold. The lower-rez shadows on medium soften the effect, which I think makes it more realistic.
To each their own though, right?
Lord, grant me the serenity to accept the things I cannot change, the courage to change the things I can, and the wisdom to know the difference.
-------------------
Vitals: Windows 7 64bit, i5 2500 @ 4.4ghz, SLI GTX670, 8GB, Viewsonic VX2268WM
Yeah, shadows are one thing that's always a memory/performance hog. BTW, I use an SSD in my gaming comp, and I guess that's a big help since it can load textures much faster than an old regular HDD.
So with my ancient GTX 570, 8GB of system RAM and my SSD, the game works much better than expected on a mix of med/high settings, and I game at 1080p with better FPS than people get on the weak PS4 with its sub-HD resolution and 30 FPS... LOL!
I was just thinking about this recently. I wish the entire industry would get some sort of movement going to increase the resolution of textures to a photo-realistic level, with Nvidia and AMD supporting it in hardware.
Shadows: although I think it's a weird engine quirk, when shadows are increased or turned on in Arma 2 or DayZ, the framerate can actually go up. It's about the same on or off, with a few exceptions.
EDIT: Hey, somebody start a photo-realistic texture petition asking for industry-wide support on the Nvidia drivers forum?
Didn't the Unreal engine have a feature where different-rez textures would be loaded depending on how far away you were from a wall? I remember being wowed by zooming right up to a wall and still having crisp textures.
Of course, the points where new textures were loaded and swapped in were somewhat visible, but this was over a decade ago.
Also, didn't id demo their MegaTexture tech a while back? I think it's used in RAGE - they used a different texture everywhere without repeating, but the actual textures were unfortunately low-rez, which defeated the purpose.
Out of curiosity, I'm going to load up Watch Dogs on my 4GB cards to see how much it actually fills up.
In Skyrim, 1.25GB cards with ultra textures were unplayable. VRAM filled up to 2.4GB when I got the 4GB cards, and it was butter smooth.
That's the thing with low VRAM. It's not the same kind of performance loss as with a weak GPU or CPU. With low VRAM you get erratic, long FPS dips while new textures load, whereas with a weak GPU/CPU you get "smooth" stutter, i.e. consistently low FPS.
I'll post when I have some results.
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
Detail textures, I remember those. Cool at the time, but they weren't really anything special: just a small tiled texture that wasn't filtered (or at least not as filtered/stretched as the one it was layered on top of), giving the appearance of more detail. In practice it was more of a noise/grain overlay.
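Roughly speaking, all a detail texture did was tile a small greyscale pattern across the surface and modulate the base texel with it. A minimal sketch of the idea (purely illustrative - the names and numbers are made up, this isn't any engine's actual code):
[code]
def sample_tiled(detail, u, v, tiles=16):
    # Repeat the small detail texture `tiles` times across the surface (UVs in 0..1).
    h, w = len(detail), len(detail[0])
    x = int(u * tiles * w) % w
    y = int(v * tiles * h) % h
    return detail[y][x]  # greyscale value, 0..255

def shade(base_rgb, detail_value):
    # Modulate the base texel around mid-grey (128): the overlay only adds grain,
    # darkening or brightening slightly rather than adding any real surface detail.
    scale = detail_value / 128.0
    return tuple(min(255, int(c * scale)) for c in base_rgb)
[/code]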
[quote]Didn't the Unreal engine have a feature where different-rez textures would be loaded depending on how far away you were from a wall? I remember being wowed by zooming right up to a wall and still having crisp textures.[/quote]Is that Unreal 4? I think that tech has existed in a lot of games, which is why you sometimes hear about texture 'pop-in' (textures abruptly snapping to higher-res versions as you reach a certain distance, or lower-res ones as you back away). But perhaps Unreal 4 does it in a more subtle or higher-res way?
[quote="RAGEdemon"]Also, didn't id demo their MegaTexture tech a while back? I think it's used in RAGE - they used a different texture everywhere without repeating, but the actual textures were unfortunately low-rez, which defeated the purpose.[/quote]
Yeah, that MegaTexture business just seemed like a bad joke. Carmack made such a big deal about his revolutionary new texture engine for years leading up to the release of Rage, but the final release was a letdown: textures that were rather low-res, some of the most relentless pop-in problems many had ever seen, and a monumental install size to boot.
Maybe there was something good about that tech from a back-end point of view, but from where I was sitting it just looked like a giant gaffe. After that, it was hard for me to get too excited about Carmack joining the Oculus team.
I'm very interested to see your VRAM test results!
Unreal ... pretty sure they were the first to use [url=http://wiki.beyondunreal.com/Legacy:Detail_Texture]detail textures[/url] over a decade ago. They weren't actually higher-res versions of the texture, just overlays to give the appearance of more detail. What you're thinking of is [url=http://wiki.beyondunreal.com/Legacy:MipMap]mip-mapping[/url], where higher- or lower-res versions of the texture are used depending on distance.
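For anyone curious, the distance-based switching works roughly like this: each mip level is half the resolution of the one before it, and the renderer picks a level from the distance. A toy sketch (the thresholds and names are made up purely for illustration):
[code]
import math

def pick_mip_level(distance_m, base_distance_m=2.0, max_level=4):
    # At or below base_distance_m you see mip 0 (the full-res texture);
    # every doubling of distance beyond that drops one mip level (half the res).
    if distance_m <= base_distance_m:
        return 0
    level = int(math.log2(distance_m / base_distance_m))
    return min(level, max_level)

# e.g. a 2048x2048 texture with a 4-level mip chain:
for d in (1, 3, 6, 12, 24, 100):
    lvl = pick_mip_level(d)
    print(f"{d:>3} m away -> mip {lvl} ({2048 >> lvl}x{2048 >> lvl})")
[/code]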
Ok, I have played around with it a little bit.
MSI Afterburner / RivaTuner Statistics Server (64-bit) don't allow the game to load, so they have to be exited first. Once the game is loaded, you can alt-tab out, start them up, and then alt-tab back in.
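If anyone wants to double-check my numbers without Afterburner, a rough alternative is to poll nvidia-smi from a script while the game runs. Untested sketch - it assumes nvidia-smi (which ships with the NVIDIA driver) is on the PATH:
[code]
import subprocess
import time

# Print the used VRAM of each GPU (in MB) once a second; Ctrl+C to stop.
while True:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    per_gpu_mb = [int(line) for line in out.split()]
    print(time.strftime("%H:%M:%S"), "VRAM used (MB):", per_gpu_mb)
    time.sleep(1)
[/code]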
Gameplay showed 3GB of usage throughout my test on Ultra settings with TXAA x4 - it didn't matter which section I was in, the small starting rooms or the largish stadium. It stayed between 2990MB and 3010MB of VRAM.
I was getting 45fps average with both GPUs mostly at ~65%. Obviously the SLI profile needs optimisation.
It was butter smooth as far as heavy stuttering was concerned, aside from the low FPS. I should mention that the game is installed on an SSD, which may have helped. I usually aim for a solid 60fps in S3D, and don't plan on playing the game properly until there's a good 3D fix. I will probably have to take the settings down quite a bit, but it seems textures won't be one of them.
As someone said earlier, the textures seem "bland" in that they don't stick out. You can have an extremely high-rez texture of a blue wall that still looks bland even though it is "high rez"; on the other hand, you could have a medium-sized texture of tree bark that looks spectacular in comparison. It is unfortunate that they have treated the HR textures as a box to tick instead of really taking advantage of them. IMO you won't notice much of a difference between Standard and Ultra, simply because they don't really take advantage of the high rez, from what I can make out in the first area.
Also unrelated, but do they really want me to feel sorry for and identify with some con artist who goes around hacking into people's bank accounts and stealing their money? The end result is he lost his family. Perhaps he should have thought of the effects on his loved ones before he started down the path of crime. He blames everyone else except himself... What did he expect would be the outcome of going around stealing from people? To me, the protagonist seems like a short sighted narcissist. I hope he has more depth in the rest of the game.
Back on topic, is there anything specific you guys want me to test?
Thanks for the report. That seems to match what I have read: the GTX 770, and in your case the 670, can't use any more than 3GB of VRAM.
So it seems like it's just as good for me to go with a GTX 780 3GB then, if I understand it correctly and if that really is the case :)
They say the game can use up to 4GB or more of VRAM, but that the 256-bit bus can't handle more than 3GB.
And yeah, I think the SSD helps out a lot; I have one myself, so that's good for fast loading.
Thanks again for the report, really appreciated mate :)
Hi InSync,
I think you have the wrong end of the proverbial stick, old chum, as bo3b said earlier. The 670 or any other card can utilise more than 3GB if required, but there seems to be a hard-coded limit in the Watch Dogs engine which stops it from using any more. The Watch Dogs options screen even specifically says a 3GB card will be required for the Ultra textures.
This makes sense, as it's a design target for the engine chosen by the devs.
If you can show me an example of someone getting more than 3GB in the game with any card, I'll use the same settings and show you a graph and screenshot of even a 670 using >3GB.
Please forget about the bus width; it really doesn't have any relevance here. It's a small piece of the jigsaw of overall performance. A more important piece is memory bandwidth. Rest assured that the full 4GB of memory is addressable regardless of the bus width. I'll see if I can get a modern game running with SGSSAA to show you a screenshot :)
If you want to go with another card, I would only look at the overall performance and the memory on that card.
EDIT:
Please find my screenshot of Watch Dogs below, utilising 4GB of memory on a 670 ;-)
GPU and memory usage on top left.
You can also see the fps top right.
Achieved using 8x SGSSAA with 3D Vision enabled (but not working).
[img]http://i.imgur.com/hi1hXGe.jpg[/img]
Hi InSync, like RAGE says, the bus width isn't relevant to VRAM capacity; it's just relevant to performance.
VRAM bandwidth in GB/s is a function of the memory clock (MHz) multiplied by the bus width (bits).
Bus width on its own is irrelevant, just like when car manufacturers used to quote 'coefficient of drag' on its own. It has to be multiplied by something else (the frontal area of the car) to actually get a meaningful number.
Did a quick Google as well and found this, which is quite a good discussion on the topic:
http://hardforum.com/showthread.php?t=1673311
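To put some rough numbers on the formula above - a back-of-the-envelope sketch (the clocks below are the reference effective GDDR5 data rates as I remember them, so double-check them against the spec sheets):
[code]
def bandwidth_gb_s(effective_mem_clock_mhz, bus_width_bits):
    # transfers/s * bits per transfer, converted to bytes and then to GB/s
    return effective_mem_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

# Reference-spec examples (from memory -- verify before quoting):
print(round(bandwidth_gb_s(6008, 256), 1))  # GTX 670: ~192.3 GB/s
print(round(bandwidth_gb_s(7010, 256), 1))  # GTX 770: ~224.3 GB/s
[/code]
So two cards with the same 256-bit bus can still have quite different bandwidth if their memory clocks differ.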
Hey guys, and thanks a million for the explanation. I try to understand the best I can, but it ain't that easy - or maybe it's me overcomplicating things... LOL!
I don't know why people say that the bus on a GTX 770 is too small to push out good FPS - so that's nonsense then?
Will check out that link rustyk; whether it will make me any smarter is another thing :)
Or is it maybe that people mean that for, say, 4K gaming the bus is too small to push out all those pixels?
But when a game like Watch Dogs uses that much memory for e.g. textures, the small bus doesn't matter and the card can use the 4GB without trouble?
Thanks a lot guys, and really nice of you RAGEdemon to try to explain it in such a great way :)
[quote="RAGEdemon"]
This makes sense, as it's a design target for the engine chosen by the devs.
[/quote]
Perhaps it's to do with the unified RAM (VRAM+RAM) of the consoles. Maybe they have to put a hard limit on VRAM usage, otherwise it could encroach on the RAM needed for everything else. If so, that's a shame, as it suggests we might have these sorts of 3GB hard limits for the entirety of this console cycle.
But I'm a bit confused about this hard limit, since it was 3GB in your first test, and 4GB in your second one. I'm guessing your first test was in 1080p, which I guess is what the Xbox One or PS4 are running. So that would seem to suggest that the 3GB limit Ubisoft placed on the game was centred around a console context?
[quote="RAGEdemon"]Also unrelated, but do they really want me to feel sorry for and identify with some con artist who goes around hacking into people's bank accounts and stealing their money? The end result is he lost his family. Perhaps he should have thought of the effects on his loved ones before he started down the path of crime. He blames everyone else except himself... What did he expect would be the outcome of going around stealing from people? To me, the protagonist seems like a short sighted narcissist. I hope he has more depth in the rest of the game.[/quote]
I can barely remember the last time a Ubisoft protagonist was likeable, or even not morally repulsive. I think it's partly due to the nature of the open-world sandbox genre, where the style of play practically forces you to behave like a sociopath. If they made the protagonist a nice guy, there'd probably be too much cognitive dissonance between how the character is portrayed in cut-scenes and how he behaves in-game.
Driver San Francisco got around this problem by making it impossible to run over a pedestrian, so that your crazy driving is wacky, but never cruel. The Batman games got around it by removing civilians entirely, making all combat non-lethal, and just for good measure, repeatedly highlighting Batman's struggle with his own pathological leanings in the story.
But GTA, Assassin's Creed, Far Cry 2 & 3, and now Watch Dogs just seem to address this problem by always making their main characters dour, amoral arseholes, so that no action you can take in the game ever seems beneath them.
Cool, I'll be waiting to hear your results later on - I'm wondering about those 4GB cards, so please report back :)