Hello guys!
As we were talking about the GTX 770 4GB version in the Watch Dogs thread, I came across this on the net!
Can it really be that the 256-bit bus is too narrow to make proper use of 4GB?
If so, those of us who were thinking about going for the 770 4GB are in for a rude awakening?
Just wondering, as I really have to upgrade my old GTX 570, and I've heard that a 3GB card may be too little for Watch Dogs on Ultra settings!
E.g. one guy mentioned here:
http://www.guru3d.com/news_story/msi_to_release_4gb_version_of_geforce_gtx_770_gaming_graphics_card.html
"From every thing I have seen over the years the 256bit memory bus is not wide enough to make use of any thing much over 2GB thats why even at 2560x1440 and 5670x1080 you see little to no difference in 2gb cards vs 4gb cards.
I would bet the same would show if they made a Titan with 3GB vs 6GB, the 384bit bus can saturate 3GB just fine and I think 6GB not really needed at all."
Just from my own experience: if the Ultra textures in Watch Dogs are the only reason to upgrade to a 4GB VRAM card, don't bother. They look quite bland and dull, even at 6050x1080 (the resolution I'm playing at right now).
They are conflating two different ideas. There is no inherent relationship between how much VRAM you have and what the bus width is. You could have 6GB of VRAM on a 56-bit bus if you wanted.
The only time this would matter is if you are streaming data into the card. Say you had a 2GB card and were running Watch Dogs, which seems to be constantly streaming data in. Even with a 780 Ti and its 384-bit bus, you still get stutters, because streaming the data in is just not fast enough; it needs to already be loaded in VRAM.
You can think of it like paging off a hard drive. It doesn't matter how fast the drive is, it's still too slow. If you get more RAM, you page less.
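To put rough numbers on that, here's a quick back-of-the-envelope sketch (Python, using published peak figures and a hypothetical 500 MB of textures needed for a scene change) comparing data coming out of VRAM with data streamed across PCIe 3.0 x16:
[code]
# Rough comparison of on-card memory bandwidth vs. the PCIe link a game has
# to stream textures across. Figures are approximate published peaks, not
# measurements, and the 500 MB scene size is purely illustrative.

GTX770_VRAM_BANDWIDTH_GBPS = 224.3   # 256-bit GDDR5 @ ~7 GHz effective
PCIE3_X16_BANDWIDTH_GBPS   = 15.75   # theoretical peak, one direction

def transfer_time_ms(megabytes, bandwidth_gbps):
    """Time in milliseconds to move `megabytes` at the given bandwidth."""
    return megabytes / 1000 / bandwidth_gbps * 1000

scene_mb = 500  # hypothetical texture set for a new section of the city
print(f"From VRAM:      {transfer_time_ms(scene_mb, GTX770_VRAM_BANDWIDTH_GBPS):5.1f} ms")
print(f"Over PCIe 3.0:  {transfer_time_ms(scene_mb, PCIE3_X16_BANDWIDTH_GBPS):5.1f} ms")
# ~2 ms vs ~32 ms -- at 60 fps you only get ~16.7 ms per frame, which is
# why streaming that data mid-drive shows up as a stutter.
[/code]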
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 - Latest 3Dmigoto Release - Bo3b's School for ShaderHackers
Ahaa, okay bo3b, I get it, and that makes perfect sense. So a 4GB 770 with a 256-bit bus should be great for Watch Dogs, and probably other upcoming games that need a lot of VRAM, then :)
But I saw over on the EVGA forum, I think it was, that one guy said he had stutter even on his 4GB 770, so I don't know.
Well, let's hope it's as you said, because I really have to upgrade my ancient card by yesterday :)
[quote="teardropmina"]just from my own experience, if the ultra texture of Watch Dogs is the only reason for a 4G vram card update, don't bother. it looks quite bland and dull, even @6050x1080 (the resolution I'm playing at right now). [/quote]
I think the difference between High and Ultra textures is really big, from what I have seen :)
Thanks anyway for chiming in :)
[quote="InSync"][quote="teardropmina"]just from my own experience, if the ultra texture of Watch Dogs is the only reason for a 4G vram card update, don't bother. it looks quite bland and dull, even @6050x1080 (the resolution I'm playing at right now). [/quote]
I think the difference is really big with high vs ultra textures for what I have seen :)
Thanks anyway for chiming in :)[/quote]
I didn't mean that Ultra isn't better than High; I was saying that Watch Dogs on Ultra looks bland compared to, say, Tomb Raider 2013 and even Splinter Cell: Blacklist.
[quote="teardropmina"][quote="InSync"][quote="teardropmina"]just from my own experience, if the ultra texture of Watch Dogs is the only reason for a 4G vram card update, don't bother. it looks quite bland and dull, even @6050x1080 (the resolution I'm playing at right now). [/quote]
I think the difference is really big with high vs ultra textures for what I have seen :)
Thanks anyway for chiming in :)[/quote]
I didn't mean that the ultra isn't better than high; I was saying that the ultra Watch Dogs looks bland compared to, say, Tomb Raider 2013 and even Splinter Cell: Blacklist.
[/quote]
Ahh, alright, that's another deal :)
I myself am a high-res texture freak, the more the better :D
And I think that Watch Dogs with Ultra textures looks pretty awesome. Maybe it's just me, and it surely seems downgraded from the E3 2012 reveal as we know, but as I said it looks good anyway, at least from what I have seen :)
Since it is a matter of discussion, this is how Watch Dogs looks at max settings with Ultra textures.
Click for full size.
[url=http://i.picpar.com/7BM.png][img]http://i.picpar.com/7BM.png[/img][/url]
[url=http://i6.minus.com/iyjik18JvjTGx.jpg][img]http://i6.minus.com/iyjik18JvjTGx.jpg[/img][/url]
[url=http://abload.de/img/watch_dogs2014-05-271mwuht.jpg][img]http://abload.de/img/watch_dogs2014-05-271mwuht.jpg[/img][/url]
And with heavy DoF if you like that
[url=http://i2.minus.com/iwQJsDwHCegcn.jpg][img]http://i2.minus.com/iwQJsDwHCegcn.jpg[/img][/url]
[url=http://www.neogaf.com/forum/showthread.php?t=743719]All taken from the 2014 High-Res PC Screenshot thread on Neogaf[/url]
I thought that increased memory on GPUs was more for higher resolutions than for in-game settings. Isn't it the number of CUDA cores that makes the difference for in-game settings, along with the compute architecture of the GPU generation being used?
I remember that when I upgraded my 7900 GTX to an 8800 GTX, the bus width jumped from 256-bit to 384-bit.
You'd think that if they could make use of the bandwidth, subsequent GPU releases would be at least 384-bit, if not wider.
Looking at the GeForce pages, the GTX 770 shows a memory bandwidth of 224.3 GB/sec with a 256-bit memory interface width. The GTX 780 shows a memory bandwidth of 288.4 GB/sec with a 384-bit memory interface width. The big difference is that the 770 uses the GK104 whereas the 780 uses the GK110.
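Those two bandwidth figures fall straight out of bus width times effective memory clock; here's a quick sketch of the arithmetic (Python, using the published effective GDDR5 clocks of roughly 7.0 GHz for the 770 and 6.0 GHz for the 780):
[code]
# Peak memory bandwidth is just (bus width in bytes) x (effective data rate).
# The clocks below are the published effective GDDR5 rates for each card.

def bandwidth_gbps(bus_width_bits, effective_clock_ghz):
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_clock_ghz

print(f"GTX 770: {bandwidth_gbps(256, 7.010):.1f} GB/s")  # -> 224.3 GB/s
print(f"GTX 780: {bandwidth_gbps(384, 6.008):.1f} GB/s")  # -> 288.4 GB/s
# The 780's extra bandwidth comes from the wider bus even though its memory
# is clocked lower -- and neither figure says anything about how much VRAM
# the card carries.
[/code]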
http://www.nvidia.com/gtx-700-graphics-cards/gtx-770/
http://www.nvidia.com/gtx-700-graphics-cards/gtx-780/
Here's a whitepaper on the GK110:
http://www.nvidia.com/content/PDF/kepler/NVIDIA-Kepler-GK110-Architecture-Whitepaper.pdf
Here's a great whitepaper detailing the architectural evolution of Nvidia's earlier GPUs:
http://www.nvidia.com/content/PDF/fermi_white_papers/NVIDIA_Fermi_Compute_Architecture_Whitepaper.pdf
Yeah, so the deal is: can the GTX 770 4GB version actually make use of its filled VRAM at high fps? That's the question, since everywhere they say the bus is too small. Well, I don't know; I guess we have to wait for someone over here to buy the 4GB version :)
Nice pics btw Alo81. I often check out that thread over on NeoGAF :)
UPDATE:
Well, it seems it's a damn stutter fest no matter what, even on 4GB and 6GB cards :(
Look at what people are saying in the comments here!
http://n4g.com/news/1518696/watch-dogs-can-use-upto-3-gb-of-ram-on-ps4-xbo-pc-issues-due-to-lack-of-vram
"Yea its def not VRAM issue, I have a GTX 770 4GB in one build an a R9 280x in another (disabled crossfire) and still have plenty of issues on both. Without question a Uisoft issue and not a tech issue "
@D-Man11: All of the textures have to be on the card as well, and this can typically be a bigger hit than the screen resolution. All the different pieces play into it though, including super-sampling AA, which allocates a larger screen buffer than usual. If you double or triple that for triple-buffering, you eat up another chunk.
The original coding style was to stream textures in from system RAM, because there was no way you'd have enough VRAM for all that stuff, and the cards were really designed to be just a frame buffer. As things evolved, it became clear that PCIe bandwidth by itself is not enough for an active scene.
So, for example, if you have to page in textures for every frame, it will be unacceptably slow, as the bottleneck switches to PCIe. In Watch Dogs' case they don't do exactly that, but they often need to as you drive rapidly into a new section. That's where the stutters while driving come from.
Next-generation consoles have recognized this problem, and that's why they have unified memory systems now: there is no bus in the way, RAM is RAM, and textures don't have to stream across a bus.
(Some educated guessing here, not sure I'm completely right.)
BTW, this is apparently a problem that also hits Skyrim when using high-res texture packs. You need more VRAM to run it well. The same happened in GTA4 when I played around with mods; the high-res packs can be gigs in size.
@TearDropMina: I completely agree it's not worth the big VRAM for a single game, not even Watch Dogs, even if you love it. But it's true for Skyrim, GTA4, probably GTA5 if it appears, and future games built on the Disrupt engine. I've stuck with 1.5GB for a long time, because nothing really pushed it until recently. I'm also thinking that for slight future-proofing, 3GB is probably not the right jump. Either 4GB on lesser cards or 6GB on monster cards is probably worth the extra money.
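To put a rough number on the buffer-allocation point above, here's a quick sketch (Python; the resolution, super-sampling factor, and buffer counts are illustrative assumptions, not what Watch Dogs actually allocates):
[code]
# Very rough VRAM budget for the render targets alone -- textures, geometry,
# shadow maps and so on come on top of this.

def color_buffer_mb(width, height, bytes_per_pixel=4):
    """Size of one 32-bit color buffer in MB."""
    return width * height * bytes_per_pixel / 1024**2

w, h = 2560, 1440
base = color_buffer_mb(w, h)                # one plain color buffer
ssaa_4x = color_buffer_mb(w * 2, h * 2)     # 4x super-sampling renders at 2x width and height
triple = 3 * ssaa_4x                        # triple-buffering keeps three of them
depth = color_buffer_mb(w * 2, h * 2)       # depth/stencil at the super-sampled size

print(f"Plain 1440p color buffer:        {base:6.1f} MB")     # ~14 MB
print(f"4x SSAA color buffer:            {ssaa_4x:6.1f} MB")  # ~56 MB
print(f"Triple-buffered 4x SSAA + depth: {triple + depth:6.1f} MB")  # ~225 MB
[/code]
So before a single texture is loaded, a super-sampled, triple-buffered setup has already eaten a couple of hundred MB of a 2GB card.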
[quote="InSync"]Look here what people are saying in the comments!
http://n4g.com/news/1518696/watch-dogs-can-use-upto-3-gb-of-ram-on-ps4-xbo-pc-issues-due-to-lack-of-vram
"Yea its def not VRAM issue, I have a GTX 770 4GB in one build an a R9 280x in another (disabled crossfire) and still have plenty of issues on both. Without question a Uisoft issue and not a tech issue "
[/quote]Comments from some clown boy who didn't put any effort into figuring it out? It still annoys me that people don't even try to understand the problem and just jump straight to calling it poor optimization.
It's not. I had low expectations, because first-day games are typically crashy, bug-filled, slow, and bad, like BF4. But Ubisoft hit this one out of the park. It's really good.
I got it running smoothly, without stutters, on a weak setup: a single GTX 480 with 1.5GB of VRAM. It runs at 40 fps with mostly High settings. With 480 SLI, I get 52 fps. Most importantly, that is my [i]minimum[/i] frame rate, not average, so it runs very smoothly.
Now, that's not the same as turning all the dials to 11 and then seeing that it doesn't run smoothly, even on epic machines like 4GB 780 Ti SLI. That's more of a Crysis effect, and means the engine has room to grow.
If clown boy cannot be bothered to turn down any settings at all, then yeah, he's going to have stutters. Either turn down the high-res textures, lower the resolution, or lower the AA. I'm not sure if GeForce Experience sets the parameters properly though. :->
"Yea its def not VRAM issue, I have a GTX 770 4GB in one build an a R9 280x in another (disabled crossfire) and still have plenty of issues on both. Without question a Uisoft issue and not a tech issue "
Comments from some clown boy who didn't put any effort into figuring it out? It still annoys me that people don't even try to understand the problem and just jump straight to calling it poor optimization.
It's not. I had low expectations, because first-day games typically are crash, bug filled, slow, and bad, like BF4. But, Ubisoft hit this one out of the park. It's really good.
I got it running smoothly, without stutters, on a weak setup of single GTX 480, 1.5G of VRAM. It runs at 40 fps, with mostly High settings. With 480 SLI, I get 52 fps. Most importantly, that is my minimum frame rate, not average, so it runs very smoothly.
Now that's not the same as turning all the dials to 11 and then seeing that it doesn't run smoothly, even on epic machines like 4G 780ti SLI. That's more of a Crysis effect, and means the engine has room to grow.
If clown boy cannot be bothered to turn down any settings at all, then yeah, he's going to have stutters. Either turn down high-res textures, lower the resolution, or lower the AA. I'm not sure if Geforce Experience sets the parameters properly though. :->
770, the 2GB version, 2560 res, i5 4670, blah blah blah. I have the latest drivers. Installed the game, set it to ULTRA, and it runs great. It sometimes takes a SECOND, like it's loading the graphics, but only when starting it up.
Oh, I don't use AA. And I don't care for blur... yet it's on. Everyone's PC is different. I NEVER listen to what they say. Reviews are nice, you get an idea, but...
I still remember how disappointed I was when I tried running ultra high-res texture packs in Skyrim on my GTX 680 (2GB VRAM). Those texture mods usually came in several sizes. I could run the "high" versions with no problems. But when I tried to run the "ultra" versions, the performance hit was just too great as my VRAM filled up, even though I had one of the best cards at the time. And despite the name "ultra", those textures weren't all that high-res... they would still go blurry if you got too close.
The thing about textures is that they're almost NEVER high-res enough, at least in first-person games (in something like Watch Dogs it's probably OK as is). Take any game - Battlefield 4, Crysis 3, anything - walk up to a wall, and by the time you're up against it, it'll likely have turned into a big blobby mess.
It's always been like that, so we're all used to it. Yet it baffles me that everyone is so excited about 4K when the textures in the average game are so low-res. What's the point of extra resolution if it's driving the same old blurry low-res textures? That's like buying a $5000 hi-fi to listen to 64kbps MP3s. It doesn't make any sense. If anything, it'll just amplify the faults.
Imagine a game where you could look at any object in the game up close and it would always have photorealistic crispness. Imagine walking up to an open book on a table in an Elder Scrolls game, and you could actually read the book there and then, without picking it up!
One day we'll have textures that good, and I think that the added immersion they'll bring will be awesome. But the VRAM of our current cards is probably not even close to what it'd need to be for that (the storage requirements would be steep too).
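For a sense of why that kind of texel density blows out both VRAM and storage, here's a rough sketch (Python; assumes 4 bytes per texel uncompressed RGBA vs. 1 byte per texel BC3/DXT5 compression, with ~33% added for the mip chain):
[code]
# Rough memory cost of a single square texture at increasing resolutions.

MIP_OVERHEAD = 4 / 3  # a full mipmap chain adds roughly one third

def texture_mb(size, bytes_per_texel):
    """Approximate size in MB of a size x size texture with mips."""
    return size * size * bytes_per_texel * MIP_OVERHEAD / 1024**2

for size in (1024, 2048, 4096, 8192):
    raw = texture_mb(size, 4)   # uncompressed RGBA
    bc3 = texture_mb(size, 1)   # BC3/DXT5 block-compressed
    print(f"{size}x{size}: {raw:7.1f} MB raw, {bc3:6.1f} MB compressed")
# 4096x4096 already costs ~85 MB raw / ~21 MB compressed, and a detailed
# scene uses hundreds of unique textures -- so "read the book on the table"
# sharpness everywhere lands well past a 2-4 GB card.
[/code]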
+1 Volnaiskra!
I hope that day comes sooner rather than later, as you said; textures still look like crap in many ways!
@bo3b!
Yeah, I hear ya, and I agree. So now, where the hell is that 780 Ti 6GB card? :)
And dialed down, the game runs well even on my ancient card with 1280MB, so the fps isn't that bad. They did a pretty good job, if you ask me too :)