What helps 3D Vision more: extra VRAM or more GPU power?
So what is going on exactly when you turn on 3D Vision? Does your graphics card have to render twice as much stuff, or does it render the same amount and just split the image into halves?
I guess what I'm asking is: does using 3D Vision use up more video memory or not? More practically, would it make sense to get a higher-VRAM card or simply a faster card, or do both play a factor?
I know that higher resolutions like 2560x1440, high-resolution texture packs, and installing a lot of mods for something like Skyrim all benefit from having more VRAM. But what is 3D Vision's impact on VRAM?
I can only answer your first question authoritatively. It does not separate the image into 'halves.' The card renders the same scene from two slightly different angles, one per eye - so it is doing TWICE the rendering work of non-3D. That doesn't necessarily translate into exactly halving your frame rate, but it's close.
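To make the "two angles" point concrete, here is a minimal sketch of how a stereo renderer derives two view matrices from one camera and submits the scene once per eye. This is illustrative only, not the actual 3D Vision driver code; the function names and the 0.065 m eye separation are assumptions.
[code]
# Minimal illustration of stereo rendering: same scene, two cameras offset
# sideways by half the eye separation each. Not 3D Vision's real internals.
import numpy as np

def look_at(eye, target, up):
    """Right-handed view matrix for a camera at `eye` looking at `target`."""
    f = target - eye
    f = f / np.linalg.norm(f)                 # forward
    r = np.cross(f, up)
    r = r / np.linalg.norm(r)                 # right
    u = np.cross(r, f)                        # true up
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = r, u, -f
    view[:3, 3] = -view[:3, :3] @ eye
    return view

def stereo_views(eye, target, up, separation=0.065):
    """Return (left, right) view matrices for parallel stereo cameras."""
    f = target - eye
    f = f / np.linalg.norm(f)
    r = np.cross(f, up)
    r = r / np.linalg.norm(r)
    offset = r * (separation / 2.0)
    left = look_at(eye - offset, target - offset, up)
    right = look_at(eye + offset, target + offset, up)
    return left, right

eye = np.array([0.0, 1.7, 5.0])
target = np.array([0.0, 1.7, 0.0])
up = np.array([0.0, 1.0, 0.0])
left, right = stereo_views(eye, target, up)
# The renderer submits every draw call once with `left` and once with `right`,
# so per-frame GPU work roughly doubles. Geometry and textures are shared
# between the two passes, which is why VRAM grows far less than 2x.
[/code]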
|CPU: i7-2700k @ 4.5Ghz
|Cooler: Zalman 9900 Max
|MB: MSI Military Class II Z68 GD-80
|RAM: Corsair Vengeance 16GB DDR3
|SSDs: Seagate 600 240GB; Crucial M4 128GB
|HDDs: Seagate Barracuda 1TB; Seagate Barracuda 500GB
|PS: OCZ ZX Series 1250watt
|Case: Antec 1200 V3
|Monitors: Asus 3D VG278HE; Asus 3D VG236H; Samsung 3D 51" Plasma;
|GPU: MSI GTX 1080 "Duke"
|OS: Windows 10 Pro X64
[quote="jlmitnick"]So what is going on exactly when you turn on 3D Vision? Does your graphics card have to render twice as much stuff, or does it render the same amount and just split the image into halves?
I guess what I'm asking is: does using 3D Vision use up more video memory or not? More practically, would it make sense to get a higher-VRAM card or simply a faster card, or do both play a factor?
I know that higher resolutions like 2560x1440, high-resolution texture packs, and installing a lot of mods for something like Skyrim all benefit from having more VRAM. But what is 3D Vision's impact on VRAM?[/quote]
If you love smooth frame rates at that res with high-res texture packs in 3D, I would advise a GTX 690, or two 670s or 680s in SLI. Games like Skyrim which are CPU bound will also benefit from a top-end overclocked CPU (Sandy or Ivy Bridge at 3.5GHz+).
3D, as stated, doesn't affect VRAM, but texture mods will. Whatever GTX you get, 2GB of VRAM should be the minimum if you want to use texture mods at high resolution.
3D Vision does not increase VRAM usage??
Are you all sure about that?
My understanding of VRAM usage is that it's the amount of memory allocated in VRAM; the amount of data actually being processed at any moment is normally less.
I have maxed out the 1.5GB of VRAM on my GTX 580 in some games and still had smooth gameplay.
With my old 8800 GTS 320MB, the fps would occasionally fall very low for 5-10 seconds in games and then return to normal. It was annoying, but games were still playable.
I just tested Witcher 2 by loading up an old save game.
I did not move the mouse. Exact same location, same save game.
With 3D Vision disabled: VRAM usage = 458MB
With 3D Vision enabled and turned on: VRAM usage = 616MB
That's a 34.5% increase in VRAM usage.
NVIDIA drivers 310.90.
For testing without 3D, make sure you disable the 3D Vision driver and restart your PC.
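For reference, the 34.5% figure is just the relative increase between the two readings above:
[code]
# (3D - 2D) / 2D, using the readings above
vram_2d, vram_3d = 458, 616                     # MB, same save, same spot
print(f"{(vram_3d - vram_2d) / vram_2d:.1%}")   # -> 34.5%
[/code]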
MSI Afterburner is showing 3GB memory usage (a flat line while the game runs) when I only have 2GB. Sorry, otherwise I would have posted my results. I will try again tomorrow. Partol, can you check your VRAM usage on the desktop with 3D off and then 3D on? Would be interesting to know.
The processor is EXTREMELY important as well. You want to be able to run games at 120Hz, which roughly doubles the GPU and CPU load compared to 60Hz.
If you want to test whether your CPU meets requirements, open a few games at the lowest resolution possible, turn vsync off, then launch FRAPS. You want around 120 fps. At one point I had a crazy video card setup and was still bottlenecked by the CPU.
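If you log frametimes while doing that test, a small script can tell you whether you ever drop under the 120 fps target. This is only a sketch: it assumes a two-column CSV of frame index and cumulative time in milliseconds (roughly what a FRAPS frametimes export looks like), so adjust the parsing to whatever your logger writes.
[code]
# Sketch only: assumes frame index + cumulative time (ms) per row.
import csv

def fps_stats(path, target_fps=120):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                                     # skip the header row
        times_ms = [float(row[1]) for row in reader if len(row) >= 2]
    frame_ms = [b - a for a, b in zip(times_ms, times_ms[1:])]
    fps = [1000.0 / d for d in frame_ms if d > 0]
    print(f"avg {sum(fps) / len(fps):.0f} fps, min {min(fps):.0f} fps "
          f"({'holds' if min(fps) >= target_fps else 'drops below'} "
          f"the {target_fps} fps target)")

# fps_stats("frametimes.csv")   # path is hypothetical
[/code]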
Co-founder of helixmod.blog.com
If you like one of my helixmod patches and want to donate, you can send it through PayPal - eqzitara@yahoo.com
[quote="eqzitara"]The processor is EXTREMELY important as well. You want to be able to run games at 120Hz, which roughly doubles the GPU and CPU load compared to 60Hz.
If you want to test whether your CPU meets requirements, open a few games at the lowest resolution possible, turn vsync off, then launch FRAPS. You want around 120 fps. At one point I had a crazy video card setup and was still bottlenecked by the CPU.[/quote]
Right now I've got an i7-950 clocked to 3.8GHz. Will a GTX 680 be bottlenecked by that?
The thing is, I would upgrade to an i7-3770K; it's not even that expensive since I've got a Microcenter near me - I could pick up a new mobo and that CPU for $300ish. But I'd rather just wait for Haswell at this point (supposedly coming out in June), assuming my i7-950 @ 3.8GHz isn't bottlenecking.
[quote="eqzitara"]An i7 is no worries.
If you had something AMD or a low-end i5 it may have been an issue, but you're fine.[/quote]
While I agree with you that the i7-950 is a great CPU, it might still be bottlenecking in many games even at 3.8GHz. My i7-930, when overclocked from 2.8 to 3.8GHz, gained almost 50% higher minimum frames per second in Batman. I know the maths doesn't quite add up, but I did multiple tests, and visually it was night and day. I later lowered the clock to 3.57GHz for stability at a low core voltage.
With each overclock I ran FRAPS, and the fps increase was quite linear across a good selection of games, tailing off only a little towards the end. I wouldn't advise him to upgrade, since the benefit will be small, but if I go SLI I think an upgrade from the 900 series will be hugely beneficial.
So even though the CPU is not as important as the GPU, and considering most of us here have a top GPU setup, minimum and average fps in my experience are hugely affected by CPU speed. For anyone building now, my advice is to get a top Ivy Bridge (or wait for Haswell) and give it a moderate overclock.
[quote="Partol"]3D Vision does not increase VRAM usage?? Are you all sure about that?[/quote]
I was pretty sure that was the case, but I think these results indicate there is an impact. It is likely game dependent.
You really take a hit from bumping the resolution, of course - much more than from the 3D itself. Thanks to a tip from DMan-11, I ran the extra case of 1920x1080.
[code]Trine1:
  1280x720:  576M (2D) -> 596M (3D)   +3.5%
  1920x1080: 749M (2D) -> 766M (3D)   +2.3%
Bioshock2:
  1280x720:  796M (2D) -> 856M (3D)   +7.5%
  1920x1080: 856M (2D) -> 962M (3D)  +12.4%
[/code]
I think to do this test you need to drive around a little. When I ran Bioshock it would start at 450M and slowly creep up to 850M. The steady state is really the only number of interest, which is why I'm showing graphs: to capture steady state.
I'm surprised the 3D driver requires a reboot to clear it, but that definitely matters for the minimum number.
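The percentages in the table are just (3D - 2D) / 2D on the readings shown; recomputing them as a quick script:
[code]
# Recompute the increases from the 2D -> 3D readings above (values in MB).
measurements = {
    ("Trine1", "1280x720"): (576, 596),
    ("Trine1", "1920x1080"): (749, 766),
    ("Bioshock2", "1280x720"): (796, 856),
    ("Bioshock2", "1920x1080"): (856, 962),
}
for (game, res), (flat, stereo) in measurements.items():
    pct = (stereo - flat) / flat * 100
    print(f"{game:<10} {res:<9}: {flat}M -> {stereo}M  (+{pct:.1f}%)")
[/code]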
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
The problem with moving/driving is that each test may not be the same.
Try running a GPU benchmark (such as [url]http://unigine.com[/url]) in 2D and in 3D.
The reason I say restart the PC is mainly to ensure that starting conditions are identical (to avoid data accumulating in VRAM). Restart your PC before each test, and do not run two or more tests consecutively.
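If you want one comparable number per run rather than eyeballing a graph, something like the sketch below works: poll the driver for used VRAM while the benchmark runs, once with 3D off and once with 3D on. It uses nvidia-smi as a stand-in for the Afterburner graph, and the polling interval and 120-second duration are arbitrary assumptions.
[code]
# Poll used VRAM while the benchmark runs; report the peak for that run.
import subprocess
import time

def log_vram(duration_s=120, interval_s=1.0):
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        samples.append(int(out.strip().splitlines()[0]))   # MiB, first GPU
        time.sleep(interval_s)
    print(f"peak {max(samples)} MiB, mean {sum(samples) / len(samples):.0f} MiB")
    return samples

# log_vram()   # run once with 3D off, reboot, run again with 3D on
[/code]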
Probably best to combine both ideas: restart, load a specific save, don't move, but wait for it to stabilize.
That big bump from 450 to 850 just from waiting is at least as big a problem as moving around and varying the data.
On the graphs the time scale is 2 minutes, so that's fairly stable VRAM usage across 2 minutes of fighting and looking around. It doesn't vary nearly as much as GPU usage itself.
In general, for this sort of test I only really care about [b][i]maximum[/i][/b] usage, not average.
Similarly for frame rates, I only care about [b][i]minimum[/i][/b] frame rate; I couldn't give a damn about the average.
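In script form, that policy is just "report the peak VRAM and the worst frames". The sample lists below are placeholders for whatever your logger actually recorded, not real measurements.
[code]
# Summarise a run by the numbers that matter here: peak VRAM and worst frames.
def summarise(vram_mb, fps):
    fps_sorted = sorted(fps)
    one_pct_low = fps_sorted[max(0, len(fps_sorted) // 100 - 1)]
    print(f"VRAM: peak {max(vram_mb)} MB, avg {sum(vram_mb) / len(vram_mb):.0f} MB")
    print(f"FPS : min {min(fps):.0f}, 1% low {one_pct_low:.0f}, "
          f"avg {sum(fps) / len(fps):.0f}")

summarise(vram_mb=[749, 755, 760, 766, 766], fps=[118, 121, 96, 120, 119])
[/code]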