I am considering adding another GTX 580.
Normal reviews never really test 3D Vision that much.
Should I be expecting 30%, 70%, or 100% scaling from SLI?
What happens if there is no SLI profile?
Because of 3D Vision vsync, single-card performance would have to be <35 FPS to be able to judge scaling.
If this has been discussed a lot already, point me in the right direction.
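To illustrate the vsync constraint in the question, here is a rough sketch. It assumes vsync caps the delivered rate at 60 FPS per eye on a 120 Hz 3D Vision display; the cap value and function name are illustrative assumptions, not anything from the driver.

```python
# Illustrative sketch only: if a single card already runs near the vsync cap,
# different SLI scaling factors become indistinguishable, which is why the
# single-card baseline needs to be well below the cap to judge scaling.
VSYNC_CAP_FPS = 60  # assumed cap: 120 Hz display, 60 FPS per eye

def observed_fps(single_card_fps, scaling_factor, cap=VSYNC_CAP_FPS):
    """FPS you would actually see after SLI scaling, clamped by vsync."""
    return min(single_card_fps * scaling_factor, cap)

# At 35 FPS single-card, 70% and 100% scaling both read as roughly 60 FPS:
print(observed_fps(35, 1.7))  # ~59.5, just under the cap
print(observed_fps(35, 2.0))  # capped at 60
# At 25 FPS single-card, the difference is actually visible:
print(observed_fps(25, 1.7))
print(observed_fps(25, 2.0))  # 50.0
```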
100% scaling.
3D Vision Automatic doesn't need/use SLI profiles, and the game doesn't have to be SLI compatible either.
Games with a native stereo 3D mode have to support SLI and need an SLI profile.
Vsync is forced with Fermi but not with Kepler.
NVIDIA TITAN X (Pascal), Intel Core i7-6900K, Win 10 Pro,
ASUS ROG Rampage V Edition 10, G.Skill RipJaws V 4x 8GB DDR4-3200 CL14-14-14-34,
ASUS ROG Swift PG258Q, ASUS ROG Swift PG278Q, Acer Predator XB280HK, BenQ W710ST
Okay, well I wasn't expecting that to be right (100%), but I just turned off my second card (GTX 660), and tested against both cards in Tomb Raider and Witcher 2. On both, my framerate approximately doubled.
I wouldn't count on that though, as it will vary based on your CPU, RAM, shoe size, and eye colour. But in my case, it does appear to be close to 80%-100%.
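The test above boils down to simple arithmetic; a minimal sketch of it (the function name and sample numbers are illustrative, not from any benchmarking tool):

```python
# Sketch of quantifying SLI scaling: measure average FPS with one card,
# then with both cards enabled, and report the percentage gain.
def sli_scaling_percent(single_fps, sli_fps):
    """Percent performance gain from the second card: 100.0 means doubled."""
    return (sli_fps / single_fps - 1.0) * 100.0

print(sli_scaling_percent(30, 60))  # 100.0 -> framerate doubled
print(sli_scaling_percent(40, 70))  # 75.0  -> a decent real-world result
```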
I would expect Kepler to use vsync during 3D Vision. Anything else would be really strange.
It's not like in 2D where you would get infrequent tearing.
I pretty much get 95 to 100 percent for games with a 3D Vision or SLI profile. SLI is optimized for 3D Vision: each card can render its own full frame for each eye, rather than splitting the scan lines like in 2D, so it's basically like the game running at full speed in 2D on both cards. That was kind of a crappy explanation, but that's how I understand NVIDIA's take on it, from back when they were promoting 3D Vision 1 years ago.
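The per-eye split described above can be sketched like this. This is a toy model of the idea (not NVIDIA's actual scheduler; the function and labels are made up for illustration):

```python
# Toy model of per-eye SLI work splitting: each stereo frame produces a left
# and a right view, and each view goes to its own GPU, so both GPUs render
# full frames in parallel instead of splitting scan lines as in 2D SLI.
def split_stereo_work(num_stereo_frames):
    """Return (gpu, eye, frame) work items for a naive per-eye split."""
    work = []
    for frame in range(num_stereo_frames):
        work.append(("GPU0", "left", frame))
        work.append(("GPU1", "right", frame))
    return work

for item in split_stereo_work(2):
    print(item)
```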
ASRock X58 Extreme6 mobo
Intel Core i7-950 @ 4GHz
12GB Corsair Dominator DDR3-1600
ASUS DirectCU II GTX 780 3GB
Corsair TX 950W PSU
NZXT Phantom Red/Black Case
3D Vision 1 w/ Samsung 2233RZ Monitor
3D Vision 2 w/ ASUS VG278HE Monitor
I'm thinking about getting a second 670. What's the SLI scaling like with the 600 series?
Thanks!
Lord, grant me the serenity to accept the things I cannot change, the courage to change the things I can, and the wisdom to know the difference.
-------------------
Vitals: Windows 7 64bit, i5 2500 @ 4.4ghz, SLI GTX670, 8GB, Viewsonic VX2268WM
Yes, similar results here. Nearly doubled performance with a 2nd GTX 580.
This also allows me to do super scaling with larger virtual screens.
This also exposed my CPU bottleneck, which I've since addressed by overclocking.
You don't need the same make of card, right? SLI noob here...
It needs to be the same model. A 650 and a 660 won't work together. A 660 and a 660 Ti won't work together.
But it doesn't have to be the same manufacturer. Though it's probably best if you can get perfectly matching cards.
[quote="Flugan"]I am considering adding another GTX 580.
Normal reviews never really test 3D Vision that much.
Should I be expecting 30%, 70%, or 100% scaling from SLI?
What happens if there is no SLI profile?
Because of 3D Vision vsync, single-card performance would have to be <35 FPS to be able to judge scaling.
If this has been discussed a lot already, point me in the right direction.[/quote]
In a perfect world, where the game itself is not CPU bound and is coded to make use of SLI (AND a good SLI profile exists), you can expect to see up to a 100% performance improvement, with the GPU load being split evenly between your two GPUs.
So if you were getting 30 FPS with 90% GPU load on a single GPU, SLI could get you 60 FPS at 45%/45% load across your two GPUs.
Using the above example as a baseline, the worst-case examples of SLI NOT working include:
1. 25 FPS and 90%/90% = worse FPS with double the GPU load (or higher)
2. 30 FPS and 90-100%/0% = same (or lower) FPS with the same or higher GPU use (second GPU not used at all)
In the real world, what you'll get is somewhere in between. As an example, Guild Wars 2 is considered to be CPU bound: throwing 2x-4x SLI Titans at it won't improve its FPS over a single Titan. However, if you have a fast enough CPU and a 4xx/5xx series GPU, you might see a slight improvement in FPS, smoother gameplay, and a 90-100% single-GPU load turn into a 50-70%/50-70% load when enabling a second card in SLI.
In this case (and others), SLI is only buying you some GPU headroom and possibly less GPU cooling fan noise compared to a single card.
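The perfect-world numbers above work out like this; a quick sketch under the same idealized assumptions (no CPU bottleneck, perfect SLI profile, identical cards):

```python
# Idealized best case from the example above: adding a second identical GPU
# doubles the framerate and halves the per-GPU load.
def ideal_sli(single_fps, single_load_percent):
    """Best-case (fps, per-GPU load) when adding a second identical GPU."""
    return single_fps * 2, single_load_percent / 2

fps, per_gpu_load = ideal_sli(30, 90)
print(fps, per_gpu_load)  # 60 45.0
```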
Thanks to everybody using my assembler it warms my heart.
To have a critical piece of code that everyone can enjoy!
What more can you ask for?
donations: ulfjalmbrant@hotmail.com
Handy Driver Discussion
Helix Mod - community fixes
Bo3b's Shaderhacker School - How to fix 3D in games
3dsolutionsgaming.com - videos, reviews and 3D fixes
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
i7-2600K-4.5Ghz/Corsair H100i/8GB/GTX780SC-SLI/Win7-64/1200W-PSU/Samsung 840-500GB SSD/Coolermaster-Tower/Benq 1080ST @ 100"