About to buy a 3D-ready and G-Sync monitor. So many questions! Help please
Greetings.
I'm about to buy a new monitor. The point is to replace my old Samsung 2233RZ with a 27" model in order to have a better 3D experience (the 2233RZ does NOT have LightBoost) and take advantage of G-Sync along the way.

I'm almost sure that the winner is the Acer XB270HA:

http://www.newegg.ca/Product/Product.aspx?Item=N82E16824009657


I can purchase it here in Spain for 479€.

The Asus ROG Swift is too expensive, and the upcoming BenQ XL2720G still has no release date (and it'll be expensive too).

NOW, the questions.

1) I do know that I can't use 3D and G-Sync at the same time. But "old generation" 3D 27" monitors like the Asus VG278HE or BenQ XL2720Z still cost 400€, so for 79€ more I can get a 144Hz, 3D-ready AND G-Sync monitor. What do you think?

2) About LIGHTBOOST AND ULMB: one of the things I'm looking for the most is the LightBoost feature. With my 2233RZ the 3D experience is good but very dark. The thing is that the new G-Sync monitors don't have LightBoost but ULMB. Do these do the same job? I mean, the 3D experience will be brighter than on my current monitor, I hope, right? Or not?

3) I have a GTX 680 SLI rig, and I'm reading MANY complaints about G-Sync not working properly with SLI. Has anyone experienced this?

Anyway, the fact is that I want a good Nvidia 3D-ready monitor, but along the way, and for just 79€ more, I can get one with G-Sync. Should I do it? Or will it be a pain in the ass? (I mean: 3D, G-Sync, ULMB, SLI, and (I forgot) a second monitor, all thrown into the mix.)


Thanks for reading. I hope you can help me. And sorry for my bad "Google-translated" English. ;-)

#1
Posted 12/26/2014 04:21 PM   
1. My personal, very subjective (!) point of view: not only can you not use 3D with G-Sync, you also get no ULMB/low-persistence mode alongside it. For me G-Sync is pointless. If I had such a monitor, I wouldn't EVER even turn this feature on. Low persistence, when you achieve a stable 60fps (or 2x60fps in 3D), gives so much more than G-Sync. And if you don't want to play with lag, there's always the option of playing at 144Hz or even with V-Sync off, which doesn't necessarily give you worse overall picture quality than a G-Synced but blurred image. That's a different topic, and a big one too (mouse rates, 160 or 480fps for a 120Hz display, and such). For me anyway, it's not worth adding even $10 to the monitor's price.
Besides, I don't know what you are comparing prices against. 400€ = BenQ XL2720Z, but 479€ = ???
2. It's the same thing, although it might be handled a little differently on your particular monitor. I mean, ULMB mode and LightBoost mode might be treated as one thing by your monitor, or as two separate things. I don't have such a monitor so I don't want to guess, but I do have a BenQ monitor, and in that case BenQ's Blur Reduction mode differs from LightBoost: there are separate menu options. Physically it's the same thing, though.

3. I wouldn't worry about G-Sync and SLI, as you might guess by now, but beware: there is a HUGE problem with 3D and SLI right now, to the point that some people prefer to stay on their older cards instead of buying GTX 970/980s, just because they wouldn't be able to use the older drivers, the only ones that work well with SLI.


Back to the monitor dilemma.
The Asus 1440p ROG would cost 700-800€. If you choose it, be sure to read the thread about this monitor model having a very common flaw that makes the 3D image somewhat broken/worse. I don't know the details, but I saw a discussion here; search the forum for it, or go back a few weeks to older posts, and you'll find it.
Aside from this problem, which might be avoidable, this monitor has a much better image. It's still a TN panel, don't forget that, but it's a well-calibrated one, and one of the most expensive, contrary to what you'll find in BenQ's offerings. But of course, TN and "expensive" don't exactly match. Let's say cheap, and cheapest. ;)
The Asus monitor doesn't support a 60Hz low-persistence mode, and there's no low persistence above 120Hz.
Whether that's important or not is highly subjective.

The last difference between those monitors is the resolution. All in all, you'll be much more comfortable playing Battlefield 4 on the Asus at 1440p than on the BenQ at 1080p, but only when playing in 2D.
How exactly 3D looks on the Asus monitors, you'd have to ask the guys discussing it in the thread I mentioned.

As to image brightness: nope, it's not going to be better. It's strobing, plus shutter glasses. It will stay a little too dark.
But you know what I do in games where I really want a brighter image? I play with slightly tuned Nvidia Control Panel settings: I add a little more color, I set LightBoost to 100% (motion clarity is noticeably worse, but still good enough in most cases, still awesome), and then there's contrast. It depends on the scene. If it's cloudy weather, I turn it up, sometimes all the way to 80%. If contrast artifacts are visible, for example in sunny locations, I lower it to the point where it's acceptable. Sometimes to 60%, sometimes to 10%.

It's not that bad. I just turned the light in my room off and played a highly colorful, high-contrast level of Sonic Racing Transformed. The image quality, colors, blacks and all were amazing, even in 3D.
And once again, I had to go looking for my jaw. One flying level above the ocean, and the last unlockable level in this game: they look simply A W E S O M E. 10/10, A+, etc. :)
I'm on a BenQ 2411Z + Vision 2 glasses, by the way.

edit: I thought you already had a LightBoost monitor, but now I see it's an old Samsung model.
So you'll have a noticeably brighter image. It would still be a little too dark, for example in Battlefield 4 (I played about 10 hours in 3D, but not across many different levels, mostly 3 maps plus 3 others occasionally, so your opinion may vary).

#2
Posted 12/26/2014 06:49 PM   
Wow, RonsonPL. What an elaborate response. Thanks a lot.

The 479€ is the price of the Acer (with G-Sync). "Only" 79€ more than other similar monitors (gaming, 27", 1080p, 144Hz, etc.). So I think it's a good investment, even if I'd only use G-Sync when playing in 2D (which is less often).

I know that the 1440p of the Asus ROG Swift is a clear winner (BTW, Philips and Acer, again, are preparing 1440p G-Sync gaming monitors for next year), but I can't spend that much money :-( AND I prefer that my GPU rig (2x GTX 680 SLI) runs smoother.

About LightBoost: right, I don't have it now, so I'm glad to hear that I'll get a brighter image. I already knew the Nvidia Control Panel trick, btw ;-)
I think that ULMB mode is the way to activate the strobing system IN 2D, in order to reduce motion blur, while in 3D mode the LightBoost system works as usual (I'm not sure about this; does anyone know?)

And about SLI: I'm using 3D Vision with SLI right now and most of the time it's fine, ALTHOUGH I'm on the 337.88 drivers.

Thanks again for your response

#3
Posted 12/26/2014 07:23 PM   
Stay away from the ROG Swift. It's nice and all, but it has inversion problems in 3D which make the image look like it's interlaced. I own one, and for the money it costs, I wish I'd never bought it. When something else comes along with the same specs, I think I'll get rid of it. Asus support for it is pathetic.

i7 4930K @ 4.4GHz
Asus P9X79 Pro
3 Way SLI Titan Black @ 1400mhz skyn3t VBIOS (Hardvolt Mod)
Mushkin Redline @ 2200MHz 32GB
Asus Xonar U7 Echelon Soundcard
Samsung Pro 256 GB SSD Games
Samsung Evo 256 GB SSD Windows 8.1 Pro
Samsung Evo 256 GB SSD Windows 7 Ultimate
Asus ROG Swift 1440p 144hz G-Sync
PSU Corsair AX1500i
Astro A50 Wireless Headset
Corsair 800D Case Custom Waterloop

#4
Posted 12/26/2014 08:01 PM   
I am in the same situation: I have a Samsung 2233RZ too, and I wish I had a clear idea of the perfect monitor to buy and switch to once and for all, but... I don't. These past days I have been reading about the new OLED tech now available in TVs, and I am really asking myself whether that might be a reason to move away from 3D (in case OLED is not compatible with Nvidia 3D Vision). I really wonder which is better: 3D, or a perfect black pixel. I love 3D, but with my TN monitor I really do not enjoy watching movies and other kinds of things, and of course playing in 2D is not a good idea either.

I think that today there is no clear winner to replace my monitor with, and maybe the better idea is to wait until something happens. I am looking forward to hearing good news about OLED monitors (3D compatible or not), even though it seems that for the moment there are no plans to launch any.

Another thing is that I still have my GTX 660 Ti (2 GB RAM), and it is starting to not be enough. I have read good things about the GTX 970, but... it's not cheap; I think it's about 350€.

- Windows 7 64bits (SSD OCZ-Vertez2 128Gb)
- "ASUS P6X58D-E" motherboard
- "MSI GTX 660 TI"
- "Intel Xeon X5670" @4000MHz CPU (20.0[12-25]x200MHz)
- RAM 16 Gb DDR3 1600
- "Dell S2716DG" monitor (2560x1440 @144Hz)
- "Corsair Carbide 600C" case
- Labrador dog (cinnamon edition)

#5
Posted 12/26/2014 09:23 PM   
I've got the VG248 with the DIY G-Sync kit. I've had it for about 6 months now, and I have to say that I couldn't give a damn about G-Sync because it does not work with 3D. There is no technological reason why; they are not incompatible, it's a lack of imagination on NVidia's part.

Realistically, G-Sync is nice, but I have absolutely zero interest in playing anything in 2D, because I already have so many epic 3D games in my backlog. For the cyclops gamers, it's a great game-changer, but for 3D gamers, it's a NOP. I wouldn't spend the money on it.


For a new screen, I still encourage people to bite the bullet and get a DLP projector. The 3D experience is dramatically superior to any LCD monitor.

People always dismiss projectors because of low resolution, without taking into account how far you sit from the screen. Look up the pixel-arc-seconds of visual acuity, and understand why it doesn't matter. The other huge advantage of 720p is the lower performance cost, so you don't need to upgrade your GPUs.


A 720p DLP projector still provides the best 3D picture today.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#6
Posted 12/27/2014 05:37 AM   
bo3b

I almost agree. The parts where I don't:
1.
bo3b said: For the cyclops gamers, it's a great game changer

This is subjective, but for me, even in 2D, low persistence beats g-sync mode easily.

2. The part about DLP projectors is probably true in most cases, but not all. The difference in size obviously makes a huge difference, but you aren't forced to lose it if you choose a monitor.
So here's a case where a monitor would beat a DLP projector:
A gamer who sits close to the monitor, or who bought an extension arm so he can place the monitor over his comfortable armchair or bed, or wherever he feels most comfortable. This way the screen doesn't look any smaller than a projection screen. Of course you need to use high separation and adjust convergence, but I do it, and after comparing to my friend's 3D projector (a good one, with a good, expensive 110" screen), I can't say I'm losing in the size department.
Besides, there is no low-persistence 1080p120 (true 60Hz per eye) projector on the market.
And I get you; I played on a projector with a resolution of 800x600 pixels. But on the other hand, when I play from a close distance on my monitor, I am annoyed by the low (1080p) resolution. It would really help to have at least 1440p. I wouldn't want to go back to 720p. And I have tons of great games that run at 60fps in 3D at 1080p on a single GTX 760. In the age of the GTX 970, I don't think we should consider lower requirements (720p) a significant argument. And as I've said many times, being a low-persistence addict, I'd rather have low details at 1080p than ultra with a blurred image. And most DLP projectors don't have a low-persistence mode.

So it mostly depends on the person. For some - DLP might be better. For others, monitors.

#7
Posted 12/27/2014 10:14 AM   
[quote="RonsonPL"]This is subjective, but for me, even in 2D, low persistence beats g-sync mode easily.[/quote] That's only true if you can sustain high frame rates. The entire point of g-sync is to give you that same experience with low frame rates. The very best hardware today has no prayer of running even 1440p at high enough frame rates to get into low persistence/motion blur. Running at 144Hz at that resolution is not really viable yet. [quote="RonsonPL"]2. The part about DLP projectors is probably true for most cases. But not for all. The difference in size obviously makes a huge difference, but you aren't forced to loose it, if you choose a monitor. So here's a case, when monitor would beat DLP projector: A gamer that sits close to the monitor or bought an extension arm, so he can place his monitor over his comfortable armchair of bed, or wherever he feels most comfortable. This way you can watch the screen not smaller than the projection screen. Of course you need to use high separation and adjust convergence, but I do it, and after comparing to my friend's 3D projector (a good one, and with a good, expensive 110" screen), I cannot say I'm loosing in size department.[/quote] I'm less sure. I have a 1080p monitor on an arm, and whenever I play it instead of projector, it's not nearly as immersive. I'm not sure why, because it is roughly the same field of view. My pure speculation is that with my focus on an up-close thing, my brain doesn't consider it 'real' because it knows nothing can be that big that close. [quote="RonsonPL"]Besides - there is no low persistence 1080p120 (true 60Hz per eye) projector on the market. And I get you - I played on a projector with resolution of 800x600 pixels. But on the other hand - when I play from a close distance on my monitor, I am annoyed by the low (1080p) resolution. It would really help to have at least 1440p. I wouldn't want to go back to 720p. And I have tons of great games that run in 60fps in 3D at 1080p on a single gtx760. In the times of GTX970, I don't think we should consider lower requirements (720p) as a significant argument. And as I said many times, being an addict of low persistence, I'd rather have low details at 1080p instead of ultra, but with blurred image. And most DLP projectors don't have low persistence mode. So it mostly depends on the person. For some - DLP might be better. For others, monitors.[/quote] You are still confusing 720p projector with monitors. Of course no one would use a 720p monitor up close, but 720p at 10 feet is not even remotely comparable. You seem to have a bias against the resolution, without considering whether you can see the pixels or not. I said it before, and I'll say it again: pixel-arc-seconds. It matters. [url]http://en.wikipedia.org/wiki/Visual_acuity[/url] As far as persistence on DLP projector, I'm not sure what you are thinking of, but DLP technology has zero-persistence, and pixel switch times in microseconds, not milliseconds. It's a micromirror- once it's turned, the light is out, there is no persistence. I still think 720p on DLP is a giant, giant win. I can turn up ALL of the settings in nearly every game, all the eye candy, and still have a killer frame rate. Playing at 1440p up close and having to turn everything down to get sustainable frame rates is an inferior experience in my judgment. Secondly, I don't presently have any need to upgrade past SLI 760. So $400 for two 4G cards gives me the best experience possible today.
RonsonPL said: This is subjective, but for me, even in 2D, low persistence beats g-sync mode easily.

That's only true if you can sustain high frame rates. The entire point of g-sync is to give you that same experience with low frame rates. The very best hardware today has no prayer of running even 1440p at high enough frame rates to get into low persistence/motion blur. Running at 144Hz at that resolution is not really viable yet.


RonsonPL said: 2. The part about DLP projectors is probably true in most cases, but not all. The difference in size obviously makes a huge difference, but you aren't forced to lose it if you choose a monitor.
So here's a case where a monitor would beat a DLP projector:
A gamer who sits close to the monitor, or who bought an extension arm so he can place the monitor over his comfortable armchair or bed, or wherever he feels most comfortable. This way the screen doesn't look any smaller than a projection screen. Of course you need to use high separation and adjust convergence, but I do it, and after comparing to my friend's 3D projector (a good one, with a good, expensive 110" screen), I can't say I'm losing in the size department.

I'm less sure. I have a 1080p monitor on an arm, and whenever I play on it instead of the projector, it's not nearly as immersive. I'm not sure why, because it is roughly the same field of view. My pure speculation is that with my focus on something up close, my brain doesn't consider it 'real' because it knows nothing can be that big that close.


RonsonPL said: Besides, there is no low-persistence 1080p120 (true 60Hz per eye) projector on the market.
And I get you; I played on a projector with a resolution of 800x600 pixels. But on the other hand, when I play from a close distance on my monitor, I am annoyed by the low (1080p) resolution. It would really help to have at least 1440p. I wouldn't want to go back to 720p. And I have tons of great games that run at 60fps in 3D at 1080p on a single GTX 760. In the age of the GTX 970, I don't think we should consider lower requirements (720p) a significant argument. And as I've said many times, being a low-persistence addict, I'd rather have low details at 1080p than ultra with a blurred image. And most DLP projectors don't have a low-persistence mode.

So it mostly depends on the person. For some - DLP might be better. For others, monitors.

You are still confusing a 720p projector with a 720p monitor. Of course no one would use a 720p monitor up close, but 720p at 10 feet is not even remotely comparable. You seem to have a bias against the resolution without considering whether you can actually see the pixels. I said it before, and I'll say it again: pixel arc-seconds. It matters.

http://en.wikipedia.org/wiki/Visual_acuity
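A rough back-of-the-envelope sketch of that point, with purely illustrative numbers (say a 90" 16:9 projected image at 1280x720, viewed from about 10 feet; your screen size and distance will differ):

- a 90" 16:9 image is roughly 2000 mm wide, so one pixel is about 2000 / 1280 ≈ 1.6 mm across;
- at 10 feet (about 3050 mm) that pixel subtends arctan(1.6 / 3050) ≈ 0.03 degrees, i.e. about 1.8 arcminutes;
- normal 20/20 acuity resolves detail down to roughly 1 arcminute, so the pixel grid sits right at the edge of visibility, and it disappears entirely with a slightly smaller image or a slightly longer viewing distance.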


As far as persistence on a DLP projector goes, I'm not sure what you are thinking of, but DLP technology has zero persistence, with pixel switch times in microseconds, not milliseconds. It's a micromirror: once it's turned, the light is out; there is no persistence.


I still think 720p on DLP is a giant, giant win. I can turn up ALL of the settings in nearly every game, all the eye candy, and still have a killer frame rate. Playing at 1440p up close and having to turn everything down to get sustainable frame rates is an inferior experience in my judgment.

Secondly, I don't presently have any need to upgrade past SLI 760s. So $400 for two 4GB cards gives me the best experience possible today.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#8
Posted 12/27/2014 10:52 AM   
Lol this is getting too technical for me xD

#9
Posted 12/27/2014 11:54 AM   
bo3b

Feel free to skip it. I won't mind.
I should make it 3x shorter. I'm not so good at short, "get to the point" texts, sorry. ;)
I'm starting to feel guilty about taking up your precious time (which could be better spent playing, or working for the good of the 3D community :) ), sorry for that.


If I were biased toward high resolution, I wouldn't say that I'd prefer 1080p with low persistence over 2160p without it. As I said, I do know that resolution isn't everything, because I did use an 800x600 projector with a 100" display area, and played like that from quite a close distance.

About DLP: it's irrelevant how fast the pixels switch. It has to be supported. As far as I know (I got the info about a year ago, but if the situation had changed, I would know about it, for example from the blurbusters.com news/Twitter pages), there is no DLP projector that supports a low-persistence mode at 1080p60, which means accepting a 1080p120 (full frame, progressive) signal.

You're right about 1440p requiring a lot, but there's 1080p between 720p and 1440p. I've learned that in most cases you can lower the unimportant detail settings to achieve a stable 60fps. Sure, probably not in the newest games while gaming on a single GTX 760, but I don't even know if I'll get through my "must play" list of games from 2003-2014 this year: games that run great on my hardware and will bring me tons of fun. It might be an important argument for someone buying a display for 3D gaming.


"whenever I play it instead of projector, it's not nearly as immersive"

I have no idea why, or how this can even happen. Are you sure you play from a very close distance, with maximum separation and adjusted convergence settings? I was actually shocked at how immersive my monitor remained after a few weeks of playing only on the Oculus Rift DK2. I was afraid I would lose the "wow!" moments if I went back from the DK2 to this, but I was searching for my jaw even yesterday. Again. It's still SO immersive. :)


""There is no technology reason why, they are not incompatible, it's lack of imagination on NVidia's part. ""

There is. We discussed this with the "Chief Blur Buster" on the blurbusters.com page, we listened to what Carmack said when asked about this, and the conclusion is this: it's really difficult, and it might even turn out to be impossible. A variable refresh rate would give you a flickering image, since the brightness levels would differ. Imagine one frame with full persistence and the next one with LP; in LP the display is switched off for most of the time. And that's not the only technical difficulty there is. Carmack hopes that one day it can be achieved, but it could take a few years, IF it can be achieved at all.
VR needs low latency, low persistence, and low requirements in terms of processing power - all of that. If it were easy to achieve, they would already have done it at Oculus. But it is now a full year since that Carmack comment, and still no sign of anyone even trying to combine LP with a variable refresh rate.
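To put rough, purely illustrative numbers on the brightness problem: suppose the strobe pulse is fixed at about 1.5 ms per frame, in the ballpark of what LightBoost/ULMB use.

- at 144Hz a frame lasts about 6.9 ms, so the backlight is lit roughly 22% of the time;
- at 60Hz a frame lasts 16.7 ms, so the lit fraction drops to about 9%;
- at 40Hz (25 ms per frame) it's about 6%.

With a variable refresh rate the frame time changes constantly, so the average brightness would swing by a factor of 3-4 as the framerate wanders, and at the low end the widely spaced pulses turn into visible flicker. Stretching the pulse at low refresh rates would keep brightness constant, but then the motion blur comes right back.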


And as for low persistence requiring a stable 60fps (or more in 2D modes), that's true. G-Sync was invented to help with the drops, but it's still broken gaming if you ask me. I don't care if 40fps looks better with G-Sync. It still looks like crap. It looks like crap even at 60fps without an LP mode. So why bother? There are so many games to choose from. Why play the ones that can't sustain a stable 60fps? Either lower the details, upgrade the PC, or simply wait and play the game some other year, when you'd be using a more powerful PC anyway.
Luckily for me, the only game I'd really like to play right now that requires more than my PC can handle in true 3D mode is The Crew. Unless one of you superheroes makes a fix, I can happily wait and play other games in the meantime. The beta worked well in fake/compatibility mode, but true 3D mode did not. It could end up forcing another upgrade, for which I definitely don't have money right now, so I'm kinda glad there's no fix for The Crew yet. ;)

#10
Posted 12/27/2014 01:06 PM   
@RonsonPL: no worries, these are really interesting topics to me, and I appreciate other viewpoints.


For the DLP persistence, I think we might be using the word 'persistence' in different ways. I'm not sure the BlurBusters definition makes sense; it seems to me that they are simply talking about duration, not persistence.
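In their usage, 'persistence' is that per-frame duration: how long each frame stays lit within its refresh interval, with the visible motion blur being roughly that duration times the speed your eye is tracking. With purely illustrative numbers:

- a sample-and-hold display at 120Hz keeps each frame lit for the full ~8.3 ms, so something moving at 1000 pixels per second smears across roughly 8 pixels while you track it;
- a strobed backlight (LightBoost/ULMB style) lights the same frame for only ~1.5 ms, cutting that smear to about 1.5 pixels.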

This H5360 DLP runs at 120Hz, 60fps per eye. That's where the extra resolution comes in, making it effectively 1280x720x2. This is why it doesn't seem like 720p when using the glasses. In 2D, no glasses, 720p at wall distance, I can make out individual pixels. In 3D, I cannot.

Maybe you mean that 60 fps is not sufficient (60 frames for each eye). There is no persistence (using the dictionary definition) of any image on DLP. The micromirrors switch in microseconds, so after a frame is done, there is no residual/persistent image like on a CRT or LCD.

I don't know what you mean by 'low persistence mode' on a DLP. I agree, there are no 1080p projectors that do 120Hz, but from my perspective that is irrelevant because 720p@120 projectors do exist. As noted in depth above, I see no need to chase resolution.


With regard to g-sync in 3D, I hate to argue with someone like Carmack, but I really think that answer was a classic 'looking for reasons why not', instead of looking for solutions.

I completely disagree with the idea that it will flicker or change brightness. The point of g-sync is to give you a better range of quality, not force it to a specific spot. So, if we determine that 50Hz per eye is the absolute minimum we will accept to avoid flicker, then g-sync can do that. That's the low bound. If it's shuttering at no less than 100Hz, then I don't see how you'd get any 'persistence' problems or brightening or darkening. There isn't enough time per frame to cause a problem.

Maybe that range from 100-144Hz is not enough to matter for 3D, but I'd really like to try it to see. For me, I know I'm not susceptible to flicker and I don't care about 'persistence', so I'd like to try 60Hz (30 per eye) for a range of 60-144Hz. I believe this could dramatically improve the smoothness of 3D during big fights when we get dips.

It's sort of a moot point; none of these guys give a damn about 3D anyway, so G-Sync in 3D is never going to happen.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#11
Posted 12/27/2014 03:19 PM   
Some doubts I have about G-Sync and tearing:

Is it true that tearing ONLY happens when the fps is HIGHER than the monitor's refresh rate? Because if that's correct, isn't the problem solved by simply adding a frame cap in the game?

SO, if tearing only happens at HIGHER framerates, and a cap is the solution, G-Sync makes no sense, right?
AND if the GPU framerate is LOWER than the refresh rate, I don't need any solution to avoid tearing, neither a cap, V-Sync nor G-Sync, right?

Enlighten me, please.

#12
Posted 12/27/2014 05:41 PM   
Funny thing is, I went away from 3D for a bit, and as soon as I came back to it, it still looked awesome!!! One game I have to finish in 3D is Black Flag; I got about halfway through, then lost my saved game. Nvidia is strange: they like to bring out new technologies all the time but have no interest in supporting the older ones. PhysX is a waste of time now for anyone that has a dedicated card; that's what made me go for 3-way SLI. But yeah, if you're looking for a monitor for 680 SLI, stick to 1080p and don't go higher. 1440p requires a lot of graphics power for 3D.

i7 4930K @ 4.4GHz
Asus P9X79 Pro
3 Way SLI Titan Black @ 1400mhz skyn3t VBIOS (Hardvolt Mod)
Mushkin Redline @ 2200MHz 32GB
Asus Xonar U7 Echelon Soundcard
Samsung Pro 256 GB SSD Games
Samsung Evo 256 GB SSD Windows 8.1 Pro
Samsung Evo 256 GB SSD Windows 7 Ultimate
Asus ROG Swift 1440p 144hz G-Sync
PSU Corsair AX1500i
Astro A50 Wireless Headset
Corsair 800D Case Custom Waterloop

#13
Posted 12/27/2014 06:44 PM   
I disagree with some things. I have an Acer H5360 projector and I hardly ever play on it; instead I usually play on my Samsung 2233RZ monitor. Why?... The main reason is comfort, and I can think of other reasons that I prefer not to go into, so as not to stray from the main point of this topic.

Anyway, the purpose of this thread, I think, was to discuss monitors, and I would like to have a clear idea of the real differences among the best monitors, and what exactly is the best choice today (if that is possible), in case anybody wants to buy right now.

It would also be great to have an opinion on OLED in relation to Nvidia 3D Vision (if that is possible), because OLED apparently offers a real difference in terms of visual quality.

- Windows 7 64bits (SSD OCZ-Vertez2 128Gb)
- "ASUS P6X58D-E" motherboard
- "MSI GTX 660 TI"
- "Intel Xeon X5670" @4000MHz CPU (20.0[12-25]x200MHz)
- RAM 16 Gb DDR3 1600
- "Dell S2716DG" monitor (2560x1440 @144Hz)
- "Corsair Carbide 600C" case
- Labrador dog (cinnamon edition)

#14
Posted 12/27/2014 07:37 PM   
Anyone know if any site is selling the IR emitter alone?

#15
Posted 12/28/2014 05:13 AM   