Bioshock: Infinite to have native 3D support
[quote name='chiz' date='08 February 2012 - 01:09 PM' timestamp='1328735372' post='1367045']
This is just the usual anti-Nvidia rhetoric from you, unfortunately its nonsense. There's plenty of games that demonstrate devs have total control over stereo rendering and output method to bypass the Nvidia auto stereo driver and use Nvidia's quad buffer stereo mode to output to any non-Nvidia 3D display or format:

Trine 2
Crysis 2
Battlefield 3
Avatar the Game
Deus Ex HR

Probably missed a few in there, but you get the point, there's some great examples of 3D Vision that don't have any issues with API or drivers. If anything we've seen this creates a new problem as it shifts the rendering bottleneck to the game engine for dual camera views, which adversely impacts performance.
[/quote]

You're absolutely right that games can be programmed to be agnostic to the 3D solution being used - by properly adhering to standardized geometric rendering and by having their shaders respect the proper z-depth. The fact that we can emulate Nintendo 64 games - which were never designed for stereoscopy - in perfect 3D is testament to this. But that's beside the point: games are being programmed for compatibility with specific 3D vendors and technologies, when the standard belongs at the shared rendering abstraction layer (in this case, DirectX 11.1). Your argument makes absolutely no sense - how does a game explicitly supporting Nvidia's quad-buffered stereo mode help users with AMD cards?
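
To put a finer point on why shader depth matters: an auto-stereo driver essentially applies a shift like the one below to every vertex. This is only a rough sketch of the usual separation/convergence model - the type and function names are made up for illustration, not taken from any actual driver:

[code]
// Rough sketch of the per-eye shift an auto-stereo driver applies to a
// clip-space vertex, assuming the common separation/convergence model.
// 'separation' and 'convergence' are the user-tunable driver settings;
// the names here are placeholders, not real driver code.
struct ClipPos { float x, y, z, w; };

ClipPos shiftForEye(ClipPos p, float separation, float convergence, bool rightEye)
{
    // Geometry at w == convergence stays at screen depth; everything else is
    // pushed left or right in proportion to its distance from the screen plane.
    // If a shader writes a bogus depth (w), this shift lands in the wrong place,
    // which is exactly why effects that ignore z-depth break in stereo.
    float eye = rightEye ? 1.0f : -1.0f;
    p.x += eye * separation * (p.w - convergence);
    return p;
}
[/code]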

That being said, between the proprietary hardware (despite industry plans to standardize) and 3D compatibility restricted to "3D Vision approved" monitors (isn't it sad watching Nvidia dance around the simple question, "can we please just hide the warning message?"), it's inarguable that Nvidia's interest lies in locking people into their ecosystem to sell hardware. Can you imagine if, 10 years ago, you had to buy a driver for your video card to output over DVI instead of VGA? Nvidia's decision to sell you a driver to enable what is a freely available, universal output spec (HDMI 1.4a) is undeniable proof that they monetize their 3D driver. None of this would even be so bad if Nvidia didn't use their monopoly to force the use of horrible drivers that lack basic functionality and have terrible game support. I've spent $1k+ on a 3D Vision setup, so once again, your argument that I'm "anti-Nvidia" is laughable. I'm not anti-Nvidia, I'm anti-bull****. And there's a whole lot of that going around at Nvidia these days.

#16
Posted 02/09/2012 04:55 AM   
[quote name='supcaj' date='08 February 2012 - 11:55 PM' timestamp='1328763328' post='1367193']
You're absolutely right in that games can be programmed to be agnostic to the 3D solution being used - by proper adherence to standardized geometric rendering, as well as having their shaders be respective of the proper z-depth. The fact that we can emulate nintendo 64 games in perfect 3D, which were so far from being designed for stereoscopy, is testament to this fact. This is completely irrelevant to the discussion that games are being programmed to be compatible with specific 3D vendors and technologies, and that the standard belongs at the shared rendering abstraction layer (in this case, directx 11.1). Your argument makes absolutely no sense - how is it that games explicity supporting nvidia's quad-buffered stereo mode help users with AMD cards? [/quote]
No, DX11.1 as an API is not going to automagically make every DX11.1 game render in 3D and replace auto-stereo drivers, if that is what you are implying. All it does is provide a quad-buffer stereo output mode so the API can interface directly with the IHV's driver, instead of devs having to conform to each IHV's own version of quad-buffer stereo, which is the case now. Game engines that handle 3D natively still need to handle all the stereo 3D work themselves - dual cameras and double the draw calls - and this will not change with DX11.1.

This helps AMD the same way console 3D helps Nvidia/AMD: it provides a standard format for any native 3D game engine, where the driver/hardware simply renders the stereo frames as dictated by the engine instead of having to create stereo images from a single image with an auto-stereo middleware driver. There is no "fighting" the drivers any more than there is when rendering 2D, because the driver/hardware simply renders what it is told into those quad buffers, then outputs them in whatever 3D hardware format it's told to.
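
To put that in concrete terms, the engine-side flow looks roughly like the sketch below - assuming a DXGI 1.2 swap chain created with Stereo = TRUE; DrawSceneFromEye() and the other names are placeholders for the engine's own code, not anything from a shipping title:

[code]
// Rough sketch of the DX11.1 / DXGI 1.2 stereo path from the engine's side.
// Assumes the swap chain was created with DXGI_SWAP_CHAIN_DESC1::Stereo = TRUE
// and a flip-model swap effect. DrawSceneFromEye() stands in for the engine's
// own scene submission; error handling and view caching are omitted.
#include <d3d11.h>
#include <dxgi1_2.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void DrawSceneFromEye(bool rightEye); // engine-specific, placeholder

void RenderStereoFrame(ID3D11Device* device, ID3D11DeviceContext* context,
                       IDXGISwapChain1* swapChain, ID3D11DepthStencilView* depth)
{
    // The stereo back buffer is a two-slice texture array:
    // slice 0 = left eye, slice 1 = right eye. Build one view per eye.
    ComPtr<ID3D11Texture2D> backBuffer;
    swapChain->GetBuffer(0, IID_PPV_ARGS(&backBuffer));

    D3D11_RENDER_TARGET_VIEW_DESC rtv = {};
    rtv.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    rtv.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2DARRAY;
    rtv.Texture2DArray.ArraySize = 1;

    ComPtr<ID3D11RenderTargetView> leftRtv, rightRtv;
    rtv.Texture2DArray.FirstArraySlice = 0; // left eye
    device->CreateRenderTargetView(backBuffer.Get(), &rtv, &leftRtv);
    rtv.Texture2DArray.FirstArraySlice = 1; // right eye
    device->CreateRenderTargetView(backBuffer.Get(), &rtv, &rightRtv);

    // Two cameras, two full passes over the scene, one Present - this is where
    // the dual-camera / double-draw-call cost lives, regardless of the API.
    context->OMSetRenderTargets(1, leftRtv.GetAddressOf(), depth);
    DrawSceneFromEye(false);
    context->OMSetRenderTargets(1, rightRtv.GetAddressOf(), depth);
    DrawSceneFromEye(true);
    swapChain->Present(1, 0);
}
[/code]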

[quote]That being said, with proprietary hardware despite industry plans to standardize, 3d compatibility with only "3d vision approved" monitors (it really isn't sad to you watching nvidia dance around the simple question "can we please just hide the warning message?"), it's inarguable that nvidia's interests lie in locking people into their ecosystem to sell hardware. Can you imagine if 10 years ago, you had to buy a driver for your video card to output over DVI instead of VGA? Nvidia's decision to sell you a driver to enable what is a freely available universal output spec (hdmi 1.4a) is undeniable proof of their monetization strategy of their 3D driver. None of this would even be so bad if nvidia didn't use their monopoly to force the usage of horrible drivers lacking basic functionality and terrible game support. I've spent $1k+ on a 3d vision setup, so once again, your argument that I'm "anti-nvidia" is laughable. I'm not anti-nvidia, I'm anti-bull****. And there's a whole lot of that going around at nvidia these days.
[/quote]
Nvidia isn't running a charity, sorry. Support costs money, and 3D Vision is clearly considered a value-add premium feature at this point. 3D Vision users pay for their support in the cost of their 3D Vision hardware. 3DTV Play was added much later to support hardware on which Nvidia doesn't collect a single penny in royalties, so they charge the end user a software fee instead. Any third-party software like DDD or iZ3D costs about the same. "Optimized for GeForce" is probably paid for up front by the OEM and ultimately passed on to the end user.

Nvidia also has the right to control what hardware they support within their ecosystem. Surely you aren't suggesting they're required to support a direct competitor's product, are you? You might as well expect them to write drivers for AMD graphics cards at that point. But yes, it makes sense for them to promote their own hardware from both a business and a QA perspective. It helps ensure the continued success of their 3D Vision program and justifies the cost of continued development.

But best of all, if you don't like the options within Nvidia's ecosystem, you are free to use whatever 3D solution suits you. Nvidia graphics cards work with the various solutions out there - DDD and iZ3D, with both their hardware and software. Or you can go with AMD's ecosystem top to bottom, then wonder why they don't support whatever hardware you think they should. But yes, if you think there are better options out there, you're free to pursue them; to say Nvidia locks you into anything is laughable.

#17
Posted 02/09/2012 05:39 AM   
[quote name='chiz' date='08 February 2012 - 09:39 PM' timestamp='1328765992' post='1367202']
No DX11.1 as an API is not going to automagically start making games 3D for any game using DX11.1 and replace auto-stereo drivers if that is what you are implying. All it does is provide a quad buffer stereo output mode for the API to directly interface the IHV's driver instead of the devs having to conform to each IHV's own version of quad buffer stereo, which is the case now. Game engines that handle 3D natively still need to handle all the stereo 3D functions with dual camera and double draw calls for 3D and this will not change with DX11.1.

This helps AMD the same way console 3D helps Nvidia/AMD because it provides a standard format for any native 3D game engine where the driver/hardware simply renders the stereo frames as dictated by the game engine instead of having to create stereo images from a single image with an auto-stereo middleware driver. There is no "fighting" drivers any more than fighting drivers to render 2D, because the driver/hardware simply renders what it is told in those quad buffers, then outputs them to whatever 3D hardware format its told to.
[/quote]

Obviously nobody is arguing that games will automatically become perfect 3D with DirectX 11.1. The argument is that there is now a set of standardized guidelines, best practices, and interfaces for rendering geometry and shaders compatible with stereoscopy. Why are you saying "no," complaining about all the misinformation, and then agreeing word for word with everything everyone is saying is great about DX11.1?

[quote name='chiz' date='08 February 2012 - 09:39 PM' timestamp='1328765992' post='1367202']
Nvidia isn't running a charity sorry. Support costs money and 3D Vision is clearly considered to be a value add premium feature at this point. 3D Vision users pay for their support in the cost of their 3D Vision hardware. 3DTV Play was added much later on to support hardware that Nvidia doesn't collect a single penny in royalties, so they go to the end-user for the software fee. Any 3rd party software like DDD or iZ3D costs similar. Optimized for GeForce is probably paid up front by the OEM and ultimately passed on to the end-user.

Nvidia also has the right to control what hardware they support within their ecosystem. Surely you aren't suggesting they're required to support a direct competitor's product are you? Might as well expect them to write drivers for AMD graphics cards at that point. But yes it makes sense for them to promote their hardware from both a business and QA perspective. It helps ensure the continued success of their 3D Vision program and justifies the cost of continued development.

But best of all, if you don't like the options with Nvidia's ecosystem, you are free to use whatever 3D solution suits you. Nvidia graphics cards work with all of the various solutions out there, DDD, iZ3D with both their hardware and software. Or you can just go with AMD's ecosystem top to bottom, then wonder why they don't support whatever hardware you think they should support. But yes if you think there's better options out there, you're free to pursue them, to say Nvidia locks you into anything is laughable.
[/quote]

Back to my earlier example - imagine Nvidia had been a year earlier to market with a card that had a DVI port. In the same year, dozens of DVI monitors were released, but only one paid Nvidia to be an "Nvidia DVI compatible monitor," and therefore it was the only one that worked. All the other monitors, although perfectly capable of accepting DVI input, displayed a big red error message across the screen - for absolutely no reason - that couldn't be disabled. Oh, and you also had to buy the official Nvidia DVI cable, which would be entirely incompatible with any future DVI monitors despite being the exact same technology.

This is exactly the scenario playing out with Nvidia right now. We're here because Nvidia was years earlier to market with 3D technology. They're using that advantage to lock you into the exact technologies and displays that they either own or receive money for specifically supporting. Sorry, but there's a big difference between needing to be profitable and taking advantage of your market position to limit consumers' choices and the quality of their experience.

#18
Posted 02/09/2012 06:56 AM   
[quote name='supcaj' date='09 February 2012 - 01:56 AM' timestamp='1328770576' post='1367217']
Obviously nobody is arguing that games will automatically become perfect 3D with directx 11.1. The argument is that there is now a set of standardized guidelines, best practices and interfaces for rendering geometry and shaders compatible with stereoscopy. Why are you saying "no", complaining about all of the misinformation, and then agreeing word-for-word with everything that everyone is saying is great about DX11.1?[/quote]
Because it won't do any of that. All it does is present a quad buffer to the game or app through the DX API so that it doesn't have to use the IHV's own proprietary quad-buffer implementation - that's the only "standard" involved: http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/5

Devs are free to continue implementing their good and bad S3D as they see fit, or not at all, as is the case for most. The only misinformation here is you claiming that native 3D games to date suck without a standard API because "their solutions always fight with the drivers and end up sucking," when that's clearly not the case. The drivers have no hand in the quality of the 3D in these cases, and neither will DX11.1; all it will do is provide a standard stereo render target that both IHVs can handle natively through DX.
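
In practice, the whole "standard" boils down to the app being able to ask DXGI for stereo instead of calling a vendor API first - something like the sketch below, assuming DXGI 1.2 headers; the function name is made up for illustration:

[code]
// Minimal sketch: querying DXGI 1.2 (the DX11.1-era runtime) for stereo support
// instead of going through a vendor-specific API. Assumes Windows 8-era headers;
// error handling kept to a minimum.
#include <dxgi1_2.h>
#include <wrl/client.h>
#pragma comment(lib, "dxgi.lib")
using Microsoft::WRL::ComPtr;

bool StereoOutputAvailable()
{
    ComPtr<IDXGIFactory2> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return false; // pre-DXGI 1.2 system: no standard stereo path available

    // TRUE when the OS and driver can present a stereo swap chain; the same
    // factory is then used to create it (DXGI_SWAP_CHAIN_DESC1::Stereo = TRUE),
    // whichever IHV's hardware is underneath.
    return factory->IsWindowedStereoEnabled() == TRUE;
}
[/code]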

[quote]Back to my earlier example - what if nvidia happened to be a year earlier to the market with a card that has a DVI port. In the same year, dozens of DVI monitors were released, but only one paid nvidia to be a "nvidia dvi compatible monitor", and therefore it was the only that one worked. All the other monitors, although perfectly capable of using DVI input, displayed a big red error message across the screen for absolutely no reason that couldn't be disabled. Oh and also, you have to buy the official nvidia DVI cable which will be entirely incompatible with any future DVI monitors despite being the exact same technology.

This is exactly the scenario going on with nvidia right now. We're here because nvidia was years earlier to the market with 3d technology. They're using this advantage to lock you into the exact technologies and displays that they either own or receive money for specifically supporting. Sorry, there's a big difference between needing to be profitable, and taking advantage of your market position to limit the choices and quality of the consumer experience.
[/quote]
Well, in the case of DVI, Nvidia does have a claim to do whatever they want with it, since they were the driving force behind bringing 120Hz DL-DVI LCD monitors to market. None existed before Nvidia brought them to market with various tech partners, and Nvidia holds patents on those technologies. But you obviously don't give them any credit for that. Instead you probably expect them to support their competitors' products, like Samsung's - a company that clearly chose to divorce itself from its business dealings with Nvidia.

But your specific example with HDMI isn't even relevant, since Nvidia only ever set out to support HDMI 1.4 output modes with 3DTV Play, despite whatever output modes you and others think they should support after the fact. They're not locking you into anything; they support certain output modes on certain products. If that meets your needs when you buy, great. If not, there are alternatives (but not really). I'm sure they have their reasons for limiting support, but there's little doubt they will expand it as the market dictates.

#19
Posted 02/09/2012 08:27 AM   