3D Vision and the GTX Titan X
For me, one of the biggest annoyances of PC gaming right now is the frequent lack of good AA options. I don't know what it is about DX11 and AA, but they just don't seem to get along well.

Many games ship with post-processing AA and that's it. Even when it's there, it doesn't always work very well. For example, The Crew comes with various AA options including TXAA x4 and MSAA x8, yet they all look shit. There are a LOT of thin fences, wires, and other lines in The Crew, and for some reason none of the AA options do much to deter the deluge of jaggies that results.

Getting a 4K monitor would slightly improve this situation, but not much, and only if you choose a smallish monitor that gives you the benefit of smaller pixels.

As far as I can tell, there's only one way to successfully smooth out the jaggies and pixel creep in games like The Crew, and that's to use a 4K (or similarly high) resolution and downsample (even downsampling from a modest 1.2x resolution gives better results to my eyes than TXAA x4 or MSAA x8).

So, personally, I'm not itching to get a 4K monitor at all. I'd wager that The Crew looks better at 4K downsampled to 1080p than it would at 4K native.

vulcan78 said:
Oh god, I hope I never have to do an OS re-install because of new hardware. :(
You'll want to stay away from motherboard upgrades then. :D

VolnaPC.com - Tips, tweaks, performance comparisons (PhysX card, SLI scaling, etc)

#16
Posted 03/28/2015 09:12 AM   
@vulcan78

Well, it's a case of who to believe and also what comparative setups they're using for testing. I'm just taking one example, that being Digital Storm, who somehow managed to snap up four Titan X's for 3-way and 4-way SLI benchmarking. 4-way was lousy, straight off. I think DS do state that they max out all graphical effects as standard for all their tests, so given that, they reckon Crysis 3 at 4K ran at 50fps on a single Titan X, 92fps with two cards, and 128fps with three. That obviously differs markedly from the guy you linked to, and who's to say which of them is right at this early stage. I'd imagine the more benchmarking that's done by different testers, the clearer the trend will be.

It should be said that some testers have been caught out by using a CPU that's not powerful enough, resulting in a number of tested games that were actually CPU-limited, and they've said they'd come back with another review to test those particular games again with an upgraded CPU just to be sure. The fastest Intel CPU currently, the i7 5960X, may have to be the first thing to go in, at least for bench-testing purposes.

https://www.youtube.com/watch?v=zsZROZjUL-I



This is all conjecture of course, as neither of us are going to get one of these anytime soon, if at all, but if I did for instance, I'd be running my CPU and GPUs on a water-cooled dual loop as I currently do now. When water cooling goes wrong, it goes really wrong, which to be fair is not very often. When it works it's blissfully quiet, and it's possible to get a pretty decent overclock and yet remain stable, which is the case for my Titans, where I'm able to safely squeeze just that little bit more performance out of them.

I'm glad to know that the ROG Swift has worked out for you with your 780ti's, and that you're getting the most out of it. It's certainly nice to hear some good news. As has been said, it's a shame that the 9XX series with SLI using the ROG Swift appears to have been somewhat problematic of late, but I know nothing more than that, unless you know differently.

CD Projekt are an awesome outfit at the moment. Their public relations stance is excellent on the one side regarding DLC and other potential bugbears, and on the other, they provide a great game with excellent eye candy. I do have a soft spot for them as they haven't given up on NVidia's VisualFX and PhysX rendering suites. It's just nice to have that third card put to some good use!

Intel Core i7 4770k @ 4.4Ghz, 3x GTX Titan, 16GB Tactical Tracer LED, CPU/GPU Dual-Loop Water-Cooled - Driver 331.82, DX11.0

#17
Posted 03/28/2015 09:03 PM   
[quote="vulcan78"]As far as The Witcher 3, given CD Projekt's excellent history, it is pretty much a foregone conclusion that this title will have 3D Vision at launch. [/quote] No it's not. They recently stated in an interview that running with "recommended" specs on ultra settings will only yield 30fps. If their optimisation is that poor, there's no way they'll sacrifice the extra development time they'd need to implement 3D Vision support (and performance would be awful). Since then, they've apparently made optimisations and are now able to hit 60 on high end hardware. But that's still by no means a guarantee for 3D support.
vulcan78 said:As far as The Witcher 3, given CD Projekt's excellent history, it is pretty much a foregone conclusion that this title will have 3D Vision at launch.
No it's not. They recently stated in an interview that running with "recommended" specs on ultra settings will only yield 30fps. If their optimisation is that poor, there's no way they'll sacrifice the extra development time they'd need to implement 3D Vision support (and performance would be awful). Since then, they've apparently made optimisations and are now able to hit 60 on high end hardware. But that's still by no means a guarantee for 3D support.

#18
Posted 03/29/2015 04:37 AM   
@Volnaiskra

I'm curious as to what you'd consider the best size for a 4K monitor, i.e. the definition of smallish. Is it a case of too big and the pixels are too large for decent AA, but too small and the text becomes too difficult to read?

@Pirateguybrush

Risen 3, of course, taught me that particular lesson after Risen 2.

@vulcan78

Nice rig and nice vids too. They're very informative, good job. You've obviously done well to keep those cards as cool as you have done without water assistance. Just so that you know, I didn't put my own water-cooled rigs together. That's a little out of my league. I just know what I want. These days I view my gaming hobby as a device to relieve stress, rather than create more of it.

@new_parad1gm

Sorry for taking your thread off topic a little. I look forward to hearing of any progress you make. If it all works out and a driver update does correct things, it would be nice if you could give a brief performance overview of stereoscopic 3D in 3D Surround at 1080p, and whether you think it was worth the investment.

Intel Core i7 4770k @ 4.4Ghz, 3x GTX Titan, 16GB Tactical Tracer LED, CPU/GPU Dual-Loop Water-Cooled - Driver 331.82, DX11.0

#19
Posted 03/29/2015 09:28 PM   
[quote="Pirateguybrush"][quote="vulcan78"]As far as The Witcher 3, given CD Projekt's excellent history, it is pretty much a foregone conclusion that this title will have 3D Vision at launch. [/quote] No it's not. They recently stated in an interview that running with "recommended" specs on ultra settings will only yield 30fps. If their optimisation is that poor, there's no way they'll sacrifice the extra development time they'd need to implement 3D Vision support (and performance would be awful). Since then, they've apparently made optimisations and are now able to hit 60 on high end hardware. But that's still by no means a guarantee for 3D support.[/quote]As a Pole, I'm very proud of CDProjekt and am in love with the Witcher games. But I'd hesitate to ever use the words "Witcher" and "at launch" together in any sentence. They're just not good games to play at launch. Initial reviews of Witcher 1 all complain about performance, loading times, crashes, and poor UI, among other things. By the time I got the game, it was the "Extended edition" a year after launch, and all of those issues were non-existent, either fixed, or redone from scratch. I also waited over a year to play Witcher 2, and by then it was version 1.9 or something, with loads of improvements since launch. It turns out that even that wasn't long enough to enjoy the game at its best, because a few months later, one of the developers made a semi-official mod that overhauled and greatly improved all of the combat. One of the reasons CDProjekt are so awesome is because they aim so high. They tried to match the likes of Bioware with their first ever game (and they just about did). With their second ever game, they had the audacity to not only attempt to match Bioware, but to attempt making the best-looking RPG in history (and succeeded). With only their third ever game, they're going to try and outdo Skyrim. The sky's the limit for these guys. And they'll probably pull it off. But I'm sure it'll have just as many bumps as the first two games did in the early months of the game, if not more. If Bethesda still releases a buggy mess each time after 20 years in the business, how will Witcher 3 look when CDProjekt try and make an open world more impressive than Skyrim on their first attempt? But another thing that makes CDProjekt awesome is that they keep doing hardcore improvements to their games a year after launch. If Witcher 3 runs at 30fps, I won't touch it even in 2D. But what I'm more interested in is how it'll end up looking and running in mid 2016 :D I might still buy the Witcher 3 at launch, purely to support a game developer who I see as embodying many of the best qualities a PC game developer should have. But I doubt I'll play it till next year.
Pirateguybrush said:
vulcan78 said:As far as The Witcher 3, given CD Projekt's excellent history, it is pretty much a foregone conclusion that this title will have 3D Vision at launch.
No it's not. They recently stated in an interview that running with "recommended" specs on ultra settings will only yield 30fps. If their optimisation is that poor, there's no way they'll sacrifice the extra development time they'd need to implement 3D Vision support (and performance would be awful). Since then, they've apparently made optimisations and are now able to hit 60 on high end hardware. But that's still by no means a guarantee for 3D support.
As a Pole, I'm very proud of CDProjekt and am in love with the Witcher games. But I'd hesitate to ever use the words "Witcher" and "at launch" together in any sentence. They're just not good games to play at launch.

Initial reviews of Witcher 1 all complain about performance, loading times, crashes, and poor UI, among other things. By the time I got the game, it was the "Extended edition" a year after launch, and all of those issues were non-existent, either fixed, or redone from scratch.

I also waited over a year to play Witcher 2, and by then it was version 1.9 or something, with loads of improvements since launch. It turns out that even that wasn't long enough to enjoy the game at its best, because a few months later, one of the developers made a semi-official mod that overhauled and greatly improved all of the combat.

One of the reasons CDProjekt are so awesome is because they aim so high. They tried to match the likes of Bioware with their first ever game (and they just about did). With their second ever game, they had the audacity to not only attempt to match Bioware, but to attempt making the best-looking RPG in history (and succeeded). With only their third ever game, they're going to try and outdo Skyrim. The sky's the limit for these guys.

And they'll probably pull it off. But I'm sure it'll have just as many bumps in its early months as the first two games did, if not more. If Bethesda still releases a buggy mess each time after 20 years in the business, how will Witcher 3 look when CDProjekt try to make an open world more impressive than Skyrim on their first attempt?

But another thing that makes CDProjekt awesome is that they keep doing hardcore improvements to their games a year after launch.

If Witcher 3 runs at 30fps, I won't touch it even in 2D. But what I'm more interested in is how it'll end up looking and running in mid 2016 :D

I might still buy the Witcher 3 at launch, purely to support a game developer who I see as embodying many of the best qualities a PC game developer should have. But I doubt I'll play it till next year.

VolnaPC.com - Tips, tweaks, performance comparisons (PhysX card, SLI scaling, etc)

#20
Posted 03/30/2015 02:16 AM   
[quote="ToThePoint"]@Volnaiskra I'm curious as to what you'd consider the best size that a 4K monitor should be, i.e.; the definition of smallish. Is it a case of too big and the pixels are too large for decent AA, too small, and text becomes just too difficult to read?[/quote] I don't have any specific opinions on the matter in terms of the most optimal sizes. But in general, just find jaggies and (especially) pixel creep to be very undesirable. Both jaggies and pixel creep have way more to do with how big the pixels are than with how many of them there are. So it's a bit silly how the industry talks so breathlessly about resolutions, without ever even mentioning pixel size. It's kind of like 15 years ago when huge HD TVs would proudly boast that they had 4x the pixels of standard TVs. It was like "so what? it's also physically 8x bigger, so the pixels end up fatter than ever, and the image looks like it's made of lego." My monitor and my smartphone are both 1920x1080. On my smartphone, text and all other curves and lines look beautiful, and jaggies are non-existent. On my monitor, which is more than 5 times larger, everything looks 5 times jaggier. As I type this, I can see every individual pixel in each letter. And of course things are much worse in games when you get diagonal lines and curves in motion (pixel creep). My phone has a pixel density of 441ppi, which seems to be near the perfect limit to my eyes. I can't see jaggies on my phone with normal viewing (whereas on Apple's so-called "retina" sreens, which are 264ppi, I can easily see jaggies, though they are small). (you can compare the specs of many devices at www.screensiz.es ) For reference, the ROG Swift is a paltry 109ppi, while the 4K PB287Q monitor is still a measly 157ppi. If my 27" monitor were to have the same great smoothness as my phone (ie. have a 441 pixel density), it would need to have a resolution [color="orange"]of more than 9600 x 5400[/color]. By the time we have screens like that, and the GPUs to push them, we'll all be driving hovercars. Therein lies the core issue. Our monitors are nowhere near that kind of pixel density, and for all the hyperbolic marketing hype, 4K resolutions take us only marginally closer (and only if the monitors stay the same size). So for now, and in the the foreseeable future, the only way to sucessfully combat jaggies and pixel creep is to use AA. And by far the best form of AA is to downsample (especially nowadays, where many recent games don't have good in-built AA). So what I'm saying is that many games will end up looking smoother and nicer on a 1080p screen downsampled from 3K or 4K, than on a native 4K screen. Of course, games would look even better on a 4K screen downsampled from 6K or 8K, but that level of brute-force antialiasing at playable framerates is unfortunately still science fiction.
ToThePoint said:@Volnaiskra

I'm curious as to what you'd consider the best size for a 4K monitor, i.e. the definition of smallish. Is it a case of too big and the pixels are too large for decent AA, but too small and the text becomes too difficult to read?


I don't have any specific opinions on the matter in terms of the most optimal sizes. But in general, I just find jaggies and (especially) pixel creep to be very undesirable.

Both jaggies and pixel creep have way more to do with how big the pixels are than with how many of them there are. So it's a bit silly how the industry talks so breathlessly about resolutions, without ever even mentioning pixel size. It's kind of like 15 years ago when huge HD TVs would proudly boast that they had 4x the pixels of standard TVs. It was like "so what? it's also physically 8x bigger, so the pixels end up fatter than ever, and the image looks like it's made of lego."

My monitor and my smartphone are both 1920x1080. On my smartphone, text and all other curves and lines look beautiful, and jaggies are non-existent. On my monitor, which is more than 5 times larger, everything looks 5 times jaggier. As I type this, I can see every individual pixel in each letter. And of course things are much worse in games when you get diagonal lines and curves in motion (pixel creep).

My phone has a pixel density of 441ppi, which seems to be near the perfect limit to my eyes. I can't see jaggies on my phone with normal viewing (whereas on Apple's so-called "retina" screens, which are 264ppi, I can easily see jaggies, though they are small). (You can compare the specs of many devices at www.screensiz.es)

For reference, the ROG Swift is a paltry 109ppi, while the 4K PB287Q monitor is still a measly 157ppi.

If my 27" monitor were to have the same great smoothness as my phone (ie. have a 441 pixel density), it would need to have a resolution of more than 9600 x 5400. By the time we have screens like that, and the GPUs to push them, we'll all be driving hovercars.

Therein lies the core issue. Our monitors are nowhere near that kind of pixel density, and for all the hyperbolic marketing hype, 4K resolutions take us only marginally closer (and only if the monitors stay the same size).

So for now, and in the foreseeable future, the only way to successfully combat jaggies and pixel creep is to use AA. And by far the best form of AA is to downsample (especially nowadays, when many recent games don't have good in-built AA).

So what I'm saying is that many games will end up looking smoother and nicer on a 1080p screen downsampled from 3K or 4K, than on a native 4K screen. Of course, games would look even better on a 4K screen downsampled from 6K or 8K, but that level of brute-force antialiasing at playable framerates is unfortunately still science fiction.

VolnaPC.com - Tips, tweaks, performance comparisons (PhysX card, SLI scaling, etc)

#21
Posted 03/30/2015 03:00 AM   
[quote="Volnaiskra] Therein lies the core issue. Our monitors are nowhere near that kind of pixel density, and for all the hyperbolic marketing hype, 4K resolutions take us only marginally closer (and only if the monitors stay the same size). So for now, and in the the foreseeable future, the only way to sucessfully combat jaggies and pixel creep is to use AA. And by far the best form of AA is to downsample (especially nowadays, where many recent games don't have good in-built AA).[/quote] Thanks for your excellent explanation on the subject, but it's raised more questions than answers. So if I understand you correctly, the use of downsampling which essentially overclocks the monitor to resolutions beyond it's native one in order to mimic the effect of AA, still runs the risk of shortening the monitor's life? I appreciate the theory behind it, but would you necessarily advise doing so, even to slightly higher than native resolutions that are not overly ambitious? In other words, are you doing so now? I'm guessing not as you said you're looking at your monitor at 1080p. For reference, my monitor is an Asus VG278H.
Volnaiskra said:

Therein lies the core issue. Our monitors are nowhere near that kind of pixel density, and for all the hyperbolic marketing hype, 4K resolutions take us only marginally closer (and only if the monitors stay the same size).

So for now, and in the foreseeable future, the only way to successfully combat jaggies and pixel creep is to use AA. And by far the best form of AA is to downsample (especially nowadays, when many recent games don't have good in-built AA).


Thanks for your excellent explanation on the subject, but it's raised more questions than answers. So if I understand you correctly, the use of downsampling, which essentially overclocks the monitor to resolutions beyond its native one in order to mimic the effect of AA, still runs the risk of shortening the monitor's life? I appreciate the theory behind it, but would you necessarily advise doing so, even to slightly higher than native resolutions that are not overly ambitious? In other words, are you doing so now? I'm guessing not, as you said you're looking at your monitor at 1080p. For reference, my monitor is an Asus VG278H.

Intel Core i7 4770k @ 4.4Ghz, 3x GTX Titan, 16GB Tactical Tracer LED, CPU/GPU Dual-Loop Water-Cooled - Driver 331.82, DX11.0

#22
Posted 03/30/2015 10:47 AM   
[quote="ToThePoint"]downsampling which essentially overclocks the monitor to resolutions beyond it's native one[/quote] That's not how it works. Downsampling renders the image at a resolution beyond the monitor resolution, then shrinks it down to fit on the monitor. As far as the monitor is concerned, it's being fed a normal image at native resolution.
ToThePoint said:downsampling, which essentially overclocks the monitor to resolutions beyond its native one
That's not how it works. Downsampling renders the image at a resolution beyond the monitor resolution, then shrinks it down to fit on the monitor. As far as the monitor is concerned, it's being fed a normal image at native resolution.
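
To make that concrete, here's a minimal sketch in Python/NumPy of the "render big, shrink down" idea, using a plain 2x2 box filter. The actual filter NVIDIA uses for DSR is fancier (a Gaussian-style filter), so treat this purely as an illustration of why shrinking smooths edges, not as how the driver really does it.

import numpy as np

def downsample_2x(image):
    # Average each 2x2 block of the high-res render into one output pixel.
    h, w = image.shape
    blocks = image[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2)
    return blocks.mean(axis=(1, 3))

# A hard black/white diagonal edge rendered at 4x4...
hi_res = np.array([[0., 0., 0., 1.],
                   [0., 0., 1., 1.],
                   [0., 1., 1., 1.],
                   [1., 1., 1., 1.]])
print(downsample_2x(hi_res))   # ...becomes [[0, 0.75], [0.75, 1]]: the jagged edge turns into intermediate greys

The monitor never sees the big image; it just receives the already-smoothed native-resolution result.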

#23
Posted 03/30/2015 11:55 AM   
[quote="Pirateguybrush"][quote="ToThePoint"]downsampling which essentially overclocks the monitor to resolutions beyond it's native one[/quote] That's not how it works. Downsampling renders the image at a resolution beyond the monitor resolution, then shrinks it down to fit on the monitor. As far as the monitor is concerned, it's being fed a normal image at native resolution.[/quote] Assuming that is correct, it just sounds too good to be true, even if it is. Have you downsampled your monitor, and if so to what resolution? I've been reading a both informative and amusing guide to downsampling, who's author has pushed the resolution of his 1080p monitor to 3600x2025, using manual timings, but at 60Hz only. I realise that you use a projector, I just wondered if you had a monitor as well.
Pirateguybrush said:
ToThePoint said:downsampling, which essentially overclocks the monitor to resolutions beyond its native one
That's not how it works. Downsampling renders the image at a resolution beyond the monitor resolution, then shrinks it down to fit on the monitor. As far as the monitor is concerned, it's being fed a normal image at native resolution.


Assuming that is correct, it just sounds too good to be true, even if it is. Have you downsampled your monitor, and if so, to what resolution? I've been reading a both informative and amusing guide to downsampling, whose author has pushed the resolution of his 1080p monitor to 3600x2025 using manual timings, but at 60Hz only. I realise that you use a projector; I just wondered if you had a monitor as well.

Intel Core i7 4770k @ 4.4Ghz, 3x GTX Titan, 16GB Tactical Tracer LED, CPU/GPU Dual-Loop Water-Cooled - Driver 331.82, DX11.0

#24
Posted 03/30/2015 12:19 PM   
Yep, like pirate says. It makes your GPU calculate a larger image that it then shrinks to your native resolution.

In the process of shrinking the image, it smooths out really nicely. And unlike most other forms of anti aliasing, it smooths out every pixel and every object. Nothing is left behind.

You can try it easily. Enable DSR in the "global" part of the Nvidia control panel. For example to 1.5x resolution.

Don't worry, this won't actually do anything to your games on its own. What it does is enable higher resolutions to be selectable in in-game menus.

Load up a game, go to the graphics options, and choose a high, previously unavailable resolution. Bask in the peerless anti-aliasing, and cry at the performance hit!
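
One more thing worth knowing when you pick a factor: as far as I know, the DSR factors multiply the total pixel count rather than the width and height, so each dimension scales by the square root of the factor. Here's a rough sketch in Python of what a few example factors (including the 1.2x I mentioned earlier and the 1.5x above) work out to on a 1080p panel; the driver may round the exact figures slightly differently.

import math

native_w, native_h = 1920, 1080
for factor in (1.20, 1.50, 2.25, 4.00):
    scale = math.sqrt(factor)
    print(f"{factor:.2f}x -> {round(native_w * scale)} x {round(native_h * scale)}")

# 4.00x renders a full 3840x2160 ("4K") image and downsamples it back to the 1080p panel.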

VolnaPC.com - Tips, tweaks, performance comparisons (PhysX card, SLI scaling, etc)

#25
Posted 03/30/2015 12:23 PM   
@Volnaiskra

Okay, I'll try that. ;)

Intel Core i7 4770k @ 4.4Ghz, 3x GTX Titan, 16GB Tactical Tracer LED, CPU/GPU Dual-Loop Water-Cooled - Driver 331.82, DX11.0

#26
Posted 03/30/2015 12:27 PM   
We have the same monitor by the way. I've been using DSR recently, and I love it. But I only use it sparingly, since it has a sizeable performance impact (it is after all rendering way more pixels, even if it doesn't end up displaying them all).

VolnaPC.com - Tips, tweaks, performance comparisons (PhysX card, SLI scaling, etc)

#27
Posted 03/30/2015 12:34 PM   
@Volnaiskra

Alas, my driver's too old for DSR to be an available option, which explains why I hadn't experimented with it. I tried changing the resolution, but even at 1440 @ 120Hz there's a bandwidth issue. It was worth a try. My current driver gives me 3D, SLI and SLI scaling for all DX9 and DX11.0 games that I've tried except Shadow of Mordor. I took the decision a while back that I wasn't going to risk undermining my system's stereoscopic 3D functionality with a driver update just to get that one game going in S3D. Hence why I already consider my rig an archive, unfortunately.

Intel Core i7 4770k @ 4.4Ghz, 3x GTX Titan, 16GB Tactical Tracer LED, CPU/GPU Dual-Loop Water-Cooled - Driver 331.82, DX11.0

#28
Posted 03/30/2015 01:57 PM   
[quote="ToThePoint"]Have you downsampled your monitor, and if so to what resolution?[/quote] Yes, to a range of different resolutions [quote="ToThePoint"]I realise that you use a projector, I just wondered if you had a monitor as well. [/quote] I do, but DSR doesn't work on my projector. It only downsamples to the native resolution (1080p), but I need it to go down to 720p in order to get 60hz 3d.
ToThePoint said:Have you downsampled your monitor, and if so to what resolution?
Yes, to a range of different resolutions.
ToThePoint said:I realise that you use a projector, I just wondered if you had a monitor as well.
I do, but DSR doesn't work on my projector. It only downsamples to the native resolution (1080p), but I need it to go down to 720p in order to get 60Hz 3D.

#29
Posted 03/30/2015 04:45 PM   
I just received my Titan X and 3D is working just as it did on the 980 GTX.

Games Tested
Far Cry 4 with patch
Dying Light
Lords of the Fallen
Hard Reset
Shadow of Mordor
Thief

Gigabyte Z370 Gaming 7 32GB Ram i9-9900K GigaByte Aorus Extreme Gaming 2080TI (single) Game Blaster Z Windows 10 X64 build #17763.195 Define R6 Blackout Case Corsair H110i GTX Sandisk 1TB (OS) SanDisk 2TB SSD (Games) Seagate EXOs 8 and 12 TB drives Samsung UN46c7000 HD TV Samsung UN55HU9000 UHD TV - Currently using ACER PASSIVE EDID override on 3D TVs LG 55

#30
Posted 03/30/2015 05:21 PM   