Where are the gtx 1080 3d vision reviews? what about cpu bottlenecking for high FPS?
Everything I've seen about the GTX 1080 is about 4K gaming and supersampling. It's understandable, but no reviews even consider that some of us might be shooting for 120fps at 1080p, or even 240fps for that matter. This ties into the single-pass rendering of stereo images that NVidia talked so much about, which should eliminate the need for ridiculously high fps. I did see that written off in another discussion as being part of VRWorks and requiring developer implementation. To me it looked like a driver-level feature for multiple monitors, plus profiles for different VR headsets to pre-adjust the image for the display. I haven't seen how this will actually work, but it certainly SHOULD be able to greatly improve 3D Vision performance by rendering a larger-resolution image in a single pass and then displaying each half on a 3D Vision display, just like a side-by-side 3D video works. Then again, that would require NVidia actually working to support 3D Vision, and that's another discussion.

I also haven't found any real benchmark information about CPU bottlenecking on the new GTX 1080 boards. I have heard that it is a real concern, but no actual data to confirm or deny it. This is very annoying because most people don't have the newest i7, which ALL benchmarks use. Mostly I've seen forum posts that are opinion rather than actual testing showing where the throttling kicks in. I'm looking at games like ARK: Survival Evolved and PlanetSide 2, which are already crushing my CPU. I have a 4670K @ 4.2GHz, and the last thing I want to do is upgrade my CPU, as I haven't seen anything that says I wouldn't have to move to a new socket to get a noticeable improvement. I'm upgrading from dual 660 SLI, which is a MUCH needed upgrade. I almost never use 3D Vision because there is either an SLI or 3D Vision problem with a game. I'm hoping to finally have a single card that can actually make 3D Vision shine. This is the only place I can expect to get real information, and I'm hoping people can give first-hand experience or real benchmarks rather than opinion.

- Dual gtx 660 SLI
- I5-4670K @ 4.1GHZ (dog chip, very unstable overclocking)
- 8gigs 1600 ram
- Asrock z87m extreme 4
- Asus VG278HE 3d vision LCD monitor
- Viewsonic PJD6211 3d vision DLP projector
- 3dvision glasses w/ emitter. (usually run games in 2D at 100hz with lightboost and forced adaptive vsync)
- Windows 8.1 64bit
- Silverstone SG10

#1
Posted 06/08/2016 10:39 AM   
The new GPUs from NVidia will work just fine with 3D Vision. I don't expect there to be any difference from the 980 Ti or the Titan X. To be honest, most new games in 3D Vision run smooth and fine at Ultra settings with those cards.

Intel Core i7-3820, 4 x 3.60 GHz overclocked to 4.50 GHz; EVGA Titan X 12GB VRAM; 16 GB Corsair Vengeance DDR-1600 (4x 4 GB); Asus VG278H 27-inch incl. 3D Vision 2 glasses, integrated transmitter; Xbox One Elite wireless controller; Windows 10; HTC VIVE 2.5 m² roomscale
3D VISION GAMERS - VISIT ME ON STEAM and feel free to add me: http://steamcommunity.com/profiles/76561198064106555
YOUTUBE: https://www.youtube.com/channel/UC1UE5TPoF0HX0HVpF_E4uPQ
STEAM CURATOR: https://store.steampowered.com/curator/33611530-Streaming-Deluxe/

#2
Posted 06/08/2016 11:28 AM   
With my 4K 3D display (LG passive EDID mod), Dark Souls 3 was the first game I have played that REQUIRED 980 Ti SLI to get acceptable frame rates (above 25 fps). Note that the 4K display processing load is more than Surround setups, even though half the display pixels are thrown away during interlace formatting. The EDID mod forces 3840x2160 Desktop (60 Hz), and in-game resolution changes do not produce good 3D for any other resolution in some games like Dark Souls 3.

SLI scaling was good for Dark Souls 3, but the game could shift into a stutter mode (especially when traveling between bonfires), dropping to 5 fps or less, occasionally requiring stopping/restarting the game to get normal frame rates again. Chassis heat and noise were an issue, also. With an i7-4960X CPU, I have not found CPU limitations in any recent games (Assassin's Creed 3 remains a notable exception).

Unfortunately, Surround and 4K setups will need at least a 1080 to play some games - a 1080 Ti would be better. More intelligent display processing for SBS, TB, and interlace 3D formats would be great, also.

#3
Posted 06/08/2016 01:42 PM   
whyme466 said:With my 4K 3D display (LG passive EDID mod), Dark Souls 3 was the first game I have played that REQUIRED 980 Ti SLI to get acceptable frame rates (above 25 fps).


I wonder if an overclocked 1080 would be enough for Dark Souls 3 at 2560x1440 at 60fps per eye. Assuming you get 30fps as the average, that SLI scaling is perfect, and that the 1080 is 25% more powerful than a 980 Ti:

30*0.5*((3840*2160)/(2560*1440))*1.25 = 42.1875fps

With OC it could be higher. Although a 980Ti gets 37fps average in 2D at 4k, and with SLI 60fps average (hitting the cap, so it should be a bit more) according to this: http://gamegpu.com/images/stories/Test_GPU/MMO/DARK_SOULS_III/test/ds3_3840.jpg

And that must be at stock clocks.

Well, I guess I can use a lower custom resolution with black bars.
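For anyone who wants to plug in their own numbers, that back-of-the-envelope estimate is easy to script. To be clear, the 0.5 SLI factor, the perfect-scaling assumption, the inverse-with-pixel-count FPS scaling, and the 25% 1080-over-980 Ti figure are all guesses for illustration, not measurements:

```python
# Rough FPS scaling estimate. Assumes FPS scales inversely with pixel
# count, perfect SLI scaling, and a guessed per-card speedup -- these
# are assumptions for illustration, not benchmark results.
def estimate_fps(base_fps, base_res, target_res, speedup=1.0, sli_factor=1.0):
    base_px = base_res[0] * base_res[1]
    target_px = target_res[0] * target_res[1]
    return base_fps * sli_factor * (base_px / target_px) * speedup

# 30fps average at 4K on 980 Ti SLI -> one 1080 (assumed 25% faster
# than a 980 Ti) at 2560x1440:
print(estimate_fps(30, (3840, 2160), (2560, 1440), speedup=1.25, sli_factor=0.5))
# -> 42.1875
```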

CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: Gainward Phoenix 1080 GLH
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com

#4
Posted 06/08/2016 08:39 PM   
jabuki said: I almost never use 3d vision because there is either an sli or 3d vision problem with a game. I'm hoping to finally have a single card that can actually make 3d vision shine. This is the only place I can expect to get real information and I'm hoping people can give first hand experience or real benchmarks rather than opinion.

When you say problems with 3D, are you using HelixMod fixes? As a general rule, anything default from the driver is going to be terrible, but we've fixed some 350 games.


We have seen CPU throttling in a few games, but I'm not sure this is a driver problem rather than a game problem. AC3 is a good reminder that older games often use only two threads, 3D or not. In 3D, the extra drawing cost caps the frame rate, and nothing short of raw GHz lifts that cap.

GTA5 is another example. We thought this was driver capping it for 3 cores, but I now believe this is a GTA bug, not a driver bug. Remember, GTA is running the 3D here, not the usual 3D Vision Automatic.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#5
Posted 06/08/2016 10:12 PM   
I've got the GTX 1080 - definitely no issues with bottlenecking at all. On everyday software and benchmarks it's just marginally better than the 980 Ti, and at max OC the 980 Ti and GTX 1080 are on par (for comparison, the R9 295X2 is about 60% faster than a 1080 OC'd to 2.0GHz on almost everything). Where the GTX 1080 shines is Unigine https://unigine.com/en/ where they implemented the GameWorks API; there it gets a whopping 50% boost over the 980 Ti, and you can use SMP for multi-monitor correction (it's more of a preview, but you can try it in the Heaven benchmark).

3D Vision runs on the GTX 1080 with Elite Dangerous and movies (it doesn't work in Unigine and CryEngine for some reason), and you can even use it on non-certified projectors at 3840x1080 @ 120Hz (but you can do that on a 980 Ti as well).

#6
Posted 06/09/2016 01:04 AM   
whyme466 said:With my 4K 3D display (LG passive EDID mod), Dark Souls 3 was the first game I have played that REQUIRED 980 Ti SLI to get acceptable frame rates (above 25 fps). Note that the 4K display processing load is more than Surround setups, even though half the display pixels are thrown away during interlace formatting. The EDID mod forces 3840x2160 Desktop (60 Hz), and in-game resolution changes do not produce good 3D for any other resolution in some games like Dark Souls 3.

SLI scaling was good for Dark Souls 3, but the game could shift into a stutter mode (especially when traveling between bonfires), dropping to 5 fps or less, occasionally requiring stopping/restarting the game to get normal frame rates again. Chassis heat and noise were an issue, also. With an i7-4960X CPU, I have not found CPU limitations in any recent games (Assassin's Creed 3 remains a notable exception).

Unfortunately, Surround and 4K setups will need at least a 1080 to play some games - a 1080 Ti would be better. More intelligent display processing for SBS, TB, and interlace 3D formats would be great, also.


In Dark Souls 3, if you use SLI:
- Make sure you disable 3D Vision (Ctrl+T) before ANY loading screen (including death). Once the scene is loaded, you can enable it again. This way you work around the FPS-drop problem. I think it is a GAME problem, as it only appears at resolutions higher than 1080p in 3D Vision. It might be a driver issue, but I haven't seen behaviour like this before (Rise of the Tomb Raider has something similar, where the FPS drops from 60 to 45 in the main screen and in game. The solution was to start the game with "3D Vision enabled on startup" = OFF and enable it after a scene is loaded - it needs to be disabled during loading screens).

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

#7
Posted 06/09/2016 11:10 AM   
helifax said:
whyme466 said:With my 4K 3D display (LG passive EDID mod), Dark Souls 3 was the first game I have played that REQUIRED 980 Ti SLI to get acceptable frame rates (above 25 fps). Note that the 4K display processing load is more than Surround setups, even though half the display pixels are thrown away during interlace formatting. The EDID mod forces 3840x2160 Desktop (60 Hz), and in-game resolution changes do not produce good 3D for any other resolution in some games like Dark Souls 3.

SLI scaling was good for Dark Souls 3, but the game could shift into a stutter mode (especially when traveling between bonfires), dropping to 5 fps or less, occasionally requiring stopping/restarting the game to get normal frame rates again. Chassis heat and noise were an issue, also. With an i7-4960X CPU, I have not found CPU limitations in any recent games (Assassin's Creed 3 remains a notable exception).

Unfortunately, Surround and 4K setups will need at least a 1080 to play some games - a 1080 Ti would be better. More intelligent display processing for SBS, TB, and interlace 3D formats would be great, also.


In Dark Souls 3, if you use SLI:
- Make sure you disable 3D Vision (Ctrl+T) before ANY loading screen (including death). Once the scene is loaded, you can enable it again. This way you work around the FPS-drop problem. I think it is a GAME problem, as it only appears at resolutions higher than 1080p in 3D Vision. It might be a driver issue, but I haven't seen behaviour like this before (Rise of the Tomb Raider has something similar, where the FPS drops from 60 to 45 in the main screen and in game. The solution was to start the game with "3D Vision enabled on startup" = OFF and enable it after a scene is loaded - it needs to be disabled during loading screens).

Another notable game with that problem is GTA4 (not 5). Whenever 3D was on, it would take like 5 minutes to launch. You know, programmers. And bad assumptions.


#8
Posted 06/09/2016 11:15 AM   
helifax - thanks for the recommendation. When I must use SLI again (I finished DS3), I will remember your helpful guidance. By the way, the DS3 fix looks SUPERB on a 4K OLED - thanks for all the great work, also!

In my PC chassis, the second 980 Ti adds 5-10 C to CPU and main 980 Ti temperatures, even when not used in SLI - so I remove it for all other games (and Vive/Rift). This also avoids SLI quirks...

#9
Posted 06/09/2016 01:57 PM   
whyme466 said:helifax - thanks for recommendation. When I must use SLI again (finished DS3), I will remember your helpful guidance. By the way, the DS3 fix looks SUPERB on 4K OLED - thanks for all the great work, also!

In my PC chassis, the second 980 Ti adds 5-10 C to CPU and main 980 Ti temperatures, even when not used in SLI - so I remove it for all other games (and Vive/Rift). This also avoids SLI quirks...


Yes, with air coolers you get that effect. That is why I watercool both my GPUs and the CPU, to avoid this type of scenario. Running both GPUs at 99% dissipates a lot of heat, which must be taken care of ;)
It's funny how big the difference between air and water cooling is (at 99% load on both GPUs, the maximum temperature is 65 degrees Celsius after countless hours of continuous use; on air, that is the temperature I got when the cards were idling :) )

If you use SLI and see strange behaviour like in DS3, always try the above workaround and see if it improves things ;) 99% of the time it will make the issue go away ;)


#10
Posted 06/09/2016 02:28 PM   
In the PC Perspective interview, Tom Petersen implied that 3D Vision would not support SMP unless it is updated.

Inno3D RTX 2080 Ti iChill Black (330W Power Limit / +50 MHz Core / +750 MHz Memory)
Intel Core i9-9900X (4.6 GHz Core / 3.0 GHz Mesh)
Corsair Vengeance LPX 32 GB (4 x 8 GB) DDR4 CMK32GX4M4Z3200C16 (4000 MHz 18-20-20-40-1T)
MSI MEG X299 Creation
Asus ROG Swift PG27VQ / Dell S2417DG / 3D Vision 2 / Oculus Rift / Marantz SR6012 / LG OLED55B7T
Intel Optane 900P 280 GB / Tiered Storage Space (Samsung 950 PRO 512 GB / Seagate IronWolf Pro 10 TB)
Windows 10 Pro 64-bit

#11
Posted 06/11/2016 11:51 PM   
Hmm, did I miss anything? What does "SMP" stand for, exactly? ^_^


#12
Posted 06/12/2016 12:02 AM   
He's referring to Simultaneous Multi-Projection, that was introduced with Nvidia's Pascal.

#13
Posted 06/12/2016 12:05 AM   
D-Man11 said:He's referring to Simultaneous Multi-Projection, that was introduced with Nvidia's Pascal.



We’ve taken the new Simultaneous Multi-Projection architecture of NVIDIA Pascal-based GPUs to create two major new techniques for tackling the unique performance challenges VR creates: Lens Matched Shading and Single Pass Stereo.

Lens Matched Shading improves pixel shading performance by rendering more natively to the unique dimensions of VR display output. This avoids rendering many pixels that would otherwise be discarded before the image is output to the VR headset.

Single Pass Stereo turbocharges geometry performance by allowing the head-mounted display’s left and right displays to share a single geometry pass. We’re effectively halving the workload of traditional VR rendering, which requires the GPU to draw geometry twice — once for the left eye and once for the right eye.

Both techniques allow developers to increase performance and visual detail of their VR applications. Combined with the performance of GTX 1080 GPUs, Simultaneous Multi-Projection delivers a dramatic 2x VR performance improvement over the GeForce GTX TITAN X.*

#14
Posted 06/12/2016 12:10 AM   
Right!
Thanks for the clarification!

I'm actually interested in how they "only render the geometry once" but still get stereo...

The definition of stereo states:
- You need 2 cameras (one for each eye).
- You need to render the GEOMETRY using each camera.

So... how do they do it? The only approach where you render the geometry ONCE is a Z-buffer (depth buffer) reprojection... and we all know how those look...

I guess Single Pass Stereo works hand in hand with the SMP technique, but I still fail to understand what it effectively takes to make it "happen" and how GOOD the results are...
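For what it's worth, here is my understanding as a purely conceptual sketch - this is NOT real driver or hardware code, and every name in it is made up. The idea seems to be that each vertex still gets a true per-eye projection (so it is not a Z-buffer reprojection trick); what is saved is walking the geometry pipeline a second time:

```python
# Conceptual illustration only -- not actual driver/hardware code; all
# names are invented. Contrasts classic two-pass stereo with the idea
# behind Pascal-style Single Pass Stereo: the geometry is walked once,
# and each vertex is projected with BOTH eye matrices in that one pass.

def mat_vec(m, v):
    """4x4 matrix times 4-vector (row-major lists)."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def two_pass_stereo(vertices, left_vp, right_vp):
    # Traditional stereo: the entire geometry pipeline runs twice,
    # once per eye camera.
    left = [mat_vec(left_vp, v) for v in vertices]
    right = [mat_vec(right_vp, v) for v in vertices]
    return left, right

def single_pass_stereo(vertices, left_vp, right_vp):
    # Single Pass Stereo (as I understand it): each vertex is fetched
    # and its shared shading work done ONCE; the shader then emits a
    # position per eye and the hardware broadcasts the primitive to
    # both viewports.
    left, right = [], []
    for v in vertices:              # geometry traversed a single time
        left.append(mat_vec(left_vp, v))
        right.append(mat_vec(right_vp, v))
    return left, right
```

Both paths produce identical per-eye positions; the saving is the duplicated vertex fetch and setup work, not the second projection itself. How much that wins in practice presumably depends on how geometry-bound the game is.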


#15
Posted 06/12/2016 01:51 AM   