All G-Sync monitors are 3D Vision compatible
It's an architecture change, so Maxwell will be huge. 680 to 780 wasn't, and look at the difference; 8xx will be even greater. I'm not saying it will be cheap, it will conform to a price that's a little better than what AMD has out at the minute, but Maxwell will mean you'll only need a card or two for 4K rather than putting 780s in quad SLI as you would now.

4K is coming fast, and I'm sure Nvidia wants to sell cards to people who want 4K; it would be stupid not to put out cards fast enough to support it. G-Sync 4K displays practically confirm this: why bother putting a gaming feature on a 4K display if no one can game at that resolution?

You'd be surprised at what you can get away with even now. I've messed around with downsampling and such to force 3840x2160 in Crysis 3, and performance was more than acceptable once settings were knocked down to medium or so. In a year or so 4K will be the new 1440p; resolution moves fast. Look at reviews for cards from 4 years ago: 1080p was as high as benchmarks got. 2 years ago you started to get 1440p benchmarks as the new standard, and it won't be long before 4K is the new 'how shitting fast is your rig' benchmark.

#31
Posted 01/12/2014 11:47 PM   
[quote="Cookybiscuit"]Hows this. Build a 4K, Gsync, PLS/IPS display, and have it do passive 3D. The vertical resolution being cut in half wont matter seeing as you have more pixels to play with, sure there may be ghosting but its better than having a shitty TN display for your balls to the walls 4K screen that is meant to be beautiful.[/quote] I'm using a passive IPS monitor and the 3D crosstalk is unplayably bad. So I use a projector for gaming, which is awesome but less convenient and only 720p. It's true I generally don't care that much about response time, but I guess for 3D the faster response time of TN is what enables it to display with minimal crosstalk, correct? I would possibly settle for a TN monitor if I was going to use it for gaming only, but that isn't the case. Any other potential tech out there that could do acceptable active 3D besides TN and DLP?
Cookybiscuit said:How's this: build a 4K, G-Sync, PLS/IPS display, and have it do passive 3D. The vertical resolution being cut in half won't matter seeing as you have more pixels to play with. Sure, there may be ghosting, but it's better than having a shitty TN display for your balls-to-the-walls 4K screen that is meant to be beautiful.

I'm using a passive IPS monitor and the 3D crosstalk is unplayably bad. So I use a projector for gaming, which is awesome but less convenient and only 720p.

It's true I generally don't care that much about response time, but I guess for 3D the faster response time of TN is what enables it to display with minimal crosstalk, correct? I would possibly settle for a TN monitor if I was going to use it for gaming only, but that isn't the case.

Any other potential tech out there that could do acceptable active 3D besides TN and DLP?
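
For what it's worth, the resolution half of that quoted idea does check out on paper. Here's a minimal sketch of the per-eye pixel math (purely illustrative; crosstalk, as above, is the separate problem):

# Per-eye pixel counts for the display schemes under discussion.
# Passive (film-pattern retarder) 3D halves vertical resolution per eye;
# active (shutter) 3D keeps the full panel resolution per eye.
def per_eye(width, height, passive):
    eye_h = height // 2 if passive else height
    return width, eye_h, width * eye_h

for name, w, h, passive in [
    ("1080p active",  1920, 1080, False),
    ("1080p passive", 1920, 1080, True),
    ("4K passive",    3840, 2160, True),
]:
    ew, eh, n = per_eye(w, h, passive)
    print(f"{name}: {ew}x{eh} per eye = {n:,} pixels")

# 4K passive gives 3840x1080 per eye (twice the pixels of full-HD active 3D),
# which is the "more pixels to play with" point in the quote.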

#32
Posted 01/13/2014 12:30 AM   
Guys, all the monitors from now on will have G-Sync integrated:
http://3dvision-blog.com/9157-the-new-nvidia-g-sync-technology-will-support-3d-vision-as-well/

What you need to be worrying about and demanding now is:
- 3D gaming monitors with HIGHER resolutions - 1440p and 4K!
- 3D monitors with higher refresh rates - 144Hz (instead of 120Hz)
- Nvidia to improve their 3D Vision technology
- 3D capture support in ShadowPlay
Lastly - the new Oculus Rift consumer display should come armed with a 3D screen of at least 1080p, supporting 3D Vision and, most importantly, with G-Sync integrated!!! Remember, this will be launching Q3 this year!!!

#33
Posted 01/13/2014 04:01 PM   
/\/\

I don't get the 3D capture support thing with ShadowPlay. Did they change something? Last time I used it, it recorded SBS @ 1080p. Do you mean full resolution, 3840x1080, support?

The hype about G-Sync has gotten a little out of hand IMO. People are demanding it for the Rift on every message board in existence. G-Sync is the solution to sample-and-hold on stationary displays. The Rift is not a stationary display (you're inside the world), and it's low persistence: the Rift is flashing the image and going blank. G-Sync with this type of display would be awful. Motion when moving your head would be choppy (at low Hz), and strobing the rendering at 35Hz would be a flickering mess.

G-Sync is great, but it's really not applicable, at all, to an OLED screen that's outputting at low persistence. It would not be desirable at all.

#34
Posted 01/13/2014 05:51 PM   
Everyone wants everything technology-related, especially concerning PC... till it's released, and then price is the factor. That's probably why Kickstarter is doing so well. It grabs you by the balls beforehand.

Co-founder of helixmod.blog.com

If you like one of my helixmod patches and want to donate. Can send to me through paypal - eqzitara@yahoo.com

#35
Posted 01/13/2014 06:57 PM   
An increase in vertical resolution is always a good thing. For 2D I game on a 2560x1600 monitor, so I'm sort of midway between 1080p and 4K (yes, different aspect ratio... just sayin'), and more modern games with much more detail, especially in the textures, see a huge improvement. The effect from ambient occlusion becomes less subtle too.

If I throw on my heavily modded Skyrim there is a jaw-dropping increase in image quality. It's hard to do it justice by just explaining it. Throw on Half-Life: Source and it's pretty much the same image.

Super high-res monitors are going to bring out the quality of more modern games as time goes on and rendering techniques become more advanced. Yes, it'll be a while before we're able to render at that resolution with any justice in 3D, but I'm sure we'll get there.

#36
Posted 01/13/2014 08:22 PM   
[quote="Paul33993"]/\/\ I don't get the 3D capture support thing with Shadowplay. Did they change something? Last time I used it, it recorded SBS @1080p. Do you mean full resolution? 3840*1080 support? The hype about G-Sync has gotten a little out of hand IMO. People are demanding this for the Rift on every message board in existence. G-Sync is the solution to sample and hold on stationary displays. The Rift is neither a stationary display (you're inside the world) and it's low persistence. Rift is flashing the image and going blank. G-sync with this type of display would be awful. Motion when moving your head would be choppy (at low hz) and strobing the rendering at 35hz would be a flickering mess. G-Sync is great, but it's really not applicable, at all, to an OLED screen that's outputting at low persistence. It would not be desirable at all.[/quote] They did not change anything in ShadowPlay, they are about to add it, i just don't know when, they said in future...ShadowPlay is not yet completed as a capturing software... You are maybe right for that G-sync won't be good to be in the Rift, but who knows? Remember that VR is very new technology, Oculus knows their way of VR engineering...Even if people want G-Sync in there, they might not implement it but find another solution if something's wrong. It's their own display... Oculus very well knows about G-Sync and what it can do, so it will be really their choice. However, cuz everyone is gone crazy with it, that's why they demand G-Sync everywhere - everyone wants ripping and stuttering solved ones of for all. I realize that 3D was ones a big thing just like G-Sync now. They wanted to try it on TVs, Monitors...everywhere. Now a lot of people have 3D TVs and I guess only 1/3 actually use it to watch 3D stuff... For the gaming, it's not for competitive, but for MMORPGs or even for Horror games, it's just perfect. Any MMORPG will go great with 3D and that's why 3D should be continued to be improving. All genre of games will evolving however different people will have different preferences... I wish I could play Guild Wars 2 on 1440p 3D monitor with G-Sync. It will look awesome from whatever angle I look at it. I simply wish Nvidia could go back a little and try improve what they've got as 3D Vision technology in some way and other companies like Asus to start implementing 3D in higher resolution monitors like 1440p and 4K. Even after Oculus Rift VR display is released, there are still not many games which support it however many of them CAN be played 3D and PC gaming on 2D will also continue even when VR is around...
Paul33993 said:/\/\

I don't get the 3D capture support thing with ShadowPlay. Did they change something? Last time I used it, it recorded SBS @ 1080p. Do you mean full resolution, 3840x1080, support?

The hype about G-Sync has gotten a little out of hand IMO. People are demanding it for the Rift on every message board in existence. G-Sync is the solution to sample-and-hold on stationary displays. The Rift is not a stationary display (you're inside the world), and it's low persistence: the Rift is flashing the image and going blank. G-Sync with this type of display would be awful. Motion when moving your head would be choppy (at low Hz), and strobing the rendering at 35Hz would be a flickering mess.

G-Sync is great, but it's really not applicable, at all, to an OLED screen that's outputting at low persistence. It would not be desirable at all.


They did not change anything in ShadowPlay; they are about to add it, I just don't know when - they said in the future... ShadowPlay is not yet complete as capture software...

You may be right that G-Sync won't be good in the Rift, but who knows? Remember that VR is very new technology, and Oculus knows their way around VR engineering... Even if people want G-Sync in there, they might not implement it, but find another solution if something's wrong. It's their own display... Oculus knows very well what G-Sync can do, so it will really be their choice. But everyone has gone crazy over it, and that's why they demand G-Sync everywhere - everyone wants tearing and stuttering solved once and for all.

I realize that 3D was once a big thing, just like G-Sync is now. They tried it on TVs, monitors... everywhere. Now a lot of people have 3D TVs, and I guess only a third actually use them to watch 3D content... For gaming, it's not for competitive play, but for MMORPGs or even horror games it's just perfect. Any MMORPG will go great with 3D, and that's why 3D should keep being improved. All genres of games will keep evolving, though different people will have different preferences... I wish I could play Guild Wars 2 on a 1440p 3D monitor with G-Sync. It would look awesome from whatever angle I looked at it.

I simply wish Nvidia would go back a little and try to improve what they've got in 3D Vision technology, and that other companies like Asus would start implementing 3D in higher-resolution monitors like 1440p and 4K. Even after the Oculus Rift VR display is released there still won't be many games that support it, while many CAN be played in 3D, and 2D PC gaming will also continue even when VR is around...

#37
Posted 01/14/2014 01:03 AM   
[quote="Paul33993"]The hype about G-Sync has gotten a little out of hand IMO. People are demanding this for the Rift on every message board in existence. G-Sync is the solution to sample and hold on stationary displays. The Rift is neither a stationary display (you're inside the world) and it's low persistence. Rift is flashing the image and going blank. G-sync with this type of display would be awful. Motion when moving your head would be choppy (at low hz) and strobing the rendering at 35hz would be a flickering mess. G-Sync is great, but it's really not applicable, at all, to an OLED screen that's outputting at low persistence. It would not be desirable at all.[/quote]Based on my understanding of GSync, I don't agree with this. If anything, GSync inside Rift will be even more important than on a monitor. On a monitor, you can also think of it as a non-stationary display if you are wildly whipping around all the time. Every frame, every pixel changes. Very similar to moving your head. In Rift, and on monitor, if you can [i]sustain[/i] 120fps, you won't need GSync. But just like a monitor, if you dip below 120Hz for any reason, that last frame is stale. On a monitor you have the option to disable vSync, but in Rift screen tearing is particularly jarring and probably a non-starter for nearly everyone. In the vSync ON case, in Rift, if you dip below the refresh rate of 120Hz, then you have to wait another entire frame before updating the screen. Whether this is by flashing the pixel or sample and hold is really irrelevant- the data is stale, and it will look like smear in the Rift. It will also add perceived latency, because as you move your head you get slow updates which will damage the immersion. GSync helps this case, because instead of waiting for an entire refresh cycle, we can pause the current frame until the data arrives and then put it on screen. Instead of a full frame drop, we get a partial frame drop, which can only help. At my hypothetical 120Hz, if we dip to 115Hz for some firefight, instead of pixel being stale for 8.3ms, we are stale for only 400us.
Paul33993 said:The hype about G-Sync has gotten a little out of hand IMO. People are demanding it for the Rift on every message board in existence. G-Sync is the solution to sample-and-hold on stationary displays. The Rift is not a stationary display (you're inside the world), and it's low persistence: the Rift is flashing the image and going blank. G-Sync with this type of display would be awful. Motion when moving your head would be choppy (at low Hz), and strobing the rendering at 35Hz would be a flickering mess.

G-Sync is great, but it's really not applicable, at all, to an OLED screen that's outputting at low persistence. It would not be desirable at all.
Based on my understanding of GSync, I don't agree with this. If anything, GSync inside Rift will be even more important than on a monitor.

On a monitor, you can also think of it as a non-stationary display if you are wildly whipping around all the time. Every frame, every pixel changes. Very similar to moving your head.

In Rift, and on monitor, if you can sustain 120fps, you won't need GSync. But just like a monitor, if you dip below 120Hz for any reason, that last frame is stale. On a monitor you have the option to disable vSync, but in Rift screen tearing is particularly jarring and probably a non-starter for nearly everyone.

In the vSync ON case, in Rift, if you dip below the refresh rate of 120Hz, then you have to wait another entire frame before updating the screen. Whether this is by flashing the pixel or sample and hold is really irrelevant: the data is stale, and it will look like smear in the Rift. It will also add perceived latency, because as you move your head you get slow updates, which will damage the immersion.

GSync helps this case, because instead of waiting for an entire refresh cycle, we can pause the current frame until the data arrives and then put it on screen. Instead of a full frame drop, we get a partial frame drop, which can only help. At my hypothetical 120Hz, if we dip to 115Hz for some firefight, instead of the pixel being stale for 8.3ms, it's stale for only ~400us.
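
A quick sketch of that arithmetic, assuming the hypothetical 120Hz panel (numbers are illustrative):

# Illustrative staleness math for a missed frame on a 120Hz display.
REFRESH_HZ = 120.0
refresh_ms = 1000.0 / REFRESH_HZ  # 8.33 ms per refresh cycle

def extra_staleness_ms(actual_fps):
    """Extra time the old frame stays up when rendering at actual_fps < 120."""
    frame_ms = 1000.0 / actual_fps
    # vSync: the new frame waits for the next refresh boundary, so the old
    # frame is shown for one full extra cycle.
    vsync_extra = refresh_ms
    # G-Sync: the panel just holds until the frame arrives, so the old frame
    # is stale only for the overrun itself.
    gsync_extra = frame_ms - refresh_ms
    return vsync_extra, gsync_extra

v, g = extra_staleness_ms(115.0)
print(f"vSync: +{v:.2f} ms stale")   # ~8.33 ms
print(f"G-Sync: +{g:.2f} ms stale")  # ~0.36 ms, i.e. the ~400us above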

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#38
Posted 01/14/2014 04:47 AM   
I'm not worried about tearing, because I know these guys know it's a problem that needs to be accounted for, so I'm sure they'll have some kind of solution. It's just not going to be G-Sync, and that's perfectly fine.

One thing I do wonder about is this: an OLED draws its image many times faster than an LCD, and it's not using sample and hold. So I'm not even sure how applicable mismatched draws would be. I would assume the problem is greatly reduced just because of the fundamental differences, and maybe a couple of tweaks here could get rid of the rest of the issues.

I mean, if draw times are almost instant, and you're not leaving images on the screen (just pulsing once), how can you even have tearing? I'm sure you could, but because of the display techniques they're using, it seems like it's fairly easily addressed at that stage.

#39
Posted 01/14/2014 03:27 PM   
[quote="bo3b"][quote="Paul33993"]The hype about G-Sync has gotten a little out of hand IMO. People are demanding this for the Rift on every message board in existence. G-Sync is the solution to sample and hold on stationary displays. The Rift is neither a stationary display (you're inside the world) and it's low persistence. Rift is flashing the image and going blank. G-sync with this type of display would be awful. Motion when moving your head would be choppy (at low hz) and strobing the rendering at 35hz would be a flickering mess. G-Sync is great, but it's really not applicable, at all, to an OLED screen that's outputting at low persistence. It would not be desirable at all.[/quote]Based on my understanding of GSync, I don't agree with this. If anything, GSync inside Rift will be even more important than on a monitor. On a monitor, you can also think of it as a non-stationary display if you are wildly whipping around all the time. Every frame, every pixel changes. Very similar to moving your head. In Rift, and on monitor, if you can [i]sustain[/i] 120fps, you won't need GSync. But just like a monitor, if you dip below 120Hz for any reason, that last frame is stale. On a monitor you have the option to disable vSync, but in Rift screen tearing is particularly jarring and probably a non-starter for nearly everyone. In the vSync ON case, in Rift, if you dip below the refresh rate of 120Hz, then you have to wait another entire frame before updating the screen. Whether this is by flashing the pixel or sample and hold is really irrelevant- the data is stale, and it will look like smear in the Rift. It will also add perceived latency, because as you move your head you get slow updates which will damage the immersion. GSync helps this case, because instead of waiting for an entire refresh cycle, we can pause the current frame until the data arrives and then put it on screen. Instead of a full frame drop, we get a partial frame drop, which can only help. At my hypothetical 120Hz, if we dip to 115Hz for some firefight, instead of pixel being stale for 8.3ms, we are stale for only 400us.[/quote] Not to poke a hole in this, but the Oculus Rift runs at 60hz, not 120hz. The way the rift does 3D is one screen, at 60hz, where the image is split in two parts. So working on an assumption of a 1080p screen, it is rendering two 960x1080 images at 60hz, rather than a typical 3D vision-like display which is rendering two 1920x1080 images and requires a 120hz display.
bo3b said:
Paul33993 said:The hype about G-Sync has gotten a little out of hand IMO. People are demanding it for the Rift on every message board in existence. G-Sync is the solution to sample-and-hold on stationary displays. The Rift is not a stationary display (you're inside the world), and it's low persistence: the Rift is flashing the image and going blank. G-Sync with this type of display would be awful. Motion when moving your head would be choppy (at low Hz), and strobing the rendering at 35Hz would be a flickering mess.

G-Sync is great, but it's really not applicable, at all, to an OLED screen that's outputting at low persistence. It would not be desirable at all.
Based on my understanding of GSync, I don't agree with this. If anything, GSync inside Rift will be even more important than on a monitor.

On a monitor, you can also think of it as a non-stationary display if you are wildly whipping around all the time. Every frame, every pixel changes. Very similar to moving your head.

In Rift, and on monitor, if you can sustain 120fps, you won't need GSync. But just like a monitor, if you dip below 120Hz for any reason, that last frame is stale. On a monitor you have the option to disable vSync, but in Rift screen tearing is particularly jarring and probably a non-starter for nearly everyone.

In the vSync ON case, in Rift, if you dip below the refresh rate of 120Hz, then you have to wait another entire frame before updating the screen. Whether this is by flashing the pixel or sample and hold is really irrelevant: the data is stale, and it will look like smear in the Rift. It will also add perceived latency, because as you move your head you get slow updates, which will damage the immersion.

GSync helps this case, because instead of waiting for an entire refresh cycle, we can pause the current frame until the data arrives and then put it on screen. Instead of a full frame drop, we get a partial frame drop, which can only help. At my hypothetical 120Hz, if we dip to 115Hz for some firefight, instead of the pixel being stale for 8.3ms, it's stale for only ~400us.


Not to poke a hole in this, but the Oculus Rift runs at 60Hz, not 120Hz.

The way the Rift does 3D is one screen, at 60Hz, where the image is split in two parts.

So working on the assumption of a 1080p screen, it is rendering two 960x1080 images at 60Hz, rather than a typical 3D Vision-like display, which renders two 1920x1080 images and requires a 120Hz display.
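
For rough scale, a back-of-the-envelope sketch of the rendering load under those two schemes (illustrative, and assuming the 60Hz figure holds):

# Rendering load: Rift-style split-screen stereo vs 120Hz alternating 3D Vision.
def render_px_per_sec(eye_w, eye_h, eye_updates_per_sec):
    # total pixels the GPU renders per second, both eyes combined
    return eye_w * eye_h * eye_updates_per_sec * 2

rift = render_px_per_sec(960, 1080, 60)     # half-width per eye, 60 updates/s each
vision = render_px_per_sec(1920, 1080, 60)  # full 1080p per eye, 60 each on a 120Hz panel

print(f"Rift-style SBS @ 60Hz: {rift:,} px/s")   # ~124 million
print(f"3D Vision @ 120Hz:     {vision:,} px/s") # ~249 million, roughly double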

#40
Posted 01/14/2014 03:51 PM   
[quote="Alo81"][quote="bo3b"][quote="Paul33993"]The hype about G-Sync has gotten a little out of hand IMO. People are demanding this for the Rift on every message board in existence. G-Sync is the solution to sample and hold on stationary displays. The Rift is neither a stationary display (you're inside the world) and it's low persistence. Rift is flashing the image and going blank. G-sync with this type of display would be awful. Motion when moving your head would be choppy (at low hz) and strobing the rendering at 35hz would be a flickering mess. G-Sync is great, but it's really not applicable, at all, to an OLED screen that's outputting at low persistence. It would not be desirable at all.[/quote]Based on my understanding of GSync, I don't agree with this. If anything, GSync inside Rift will be even more important than on a monitor. On a monitor, you can also think of it as a non-stationary display if you are wildly whipping around all the time. Every frame, every pixel changes. Very similar to moving your head. In Rift, and on monitor, if you can [i]sustain[/i] 120fps, you won't need GSync. But just like a monitor, if you dip below 120Hz for any reason, that last frame is stale. On a monitor you have the option to disable vSync, but in Rift screen tearing is particularly jarring and probably a non-starter for nearly everyone. In the vSync ON case, in Rift, if you dip below the refresh rate of 120Hz, then you have to wait another entire frame before updating the screen. Whether this is by flashing the pixel or sample and hold is really irrelevant- the data is stale, and it will look like smear in the Rift. It will also add perceived latency, because as you move your head you get slow updates which will damage the immersion. GSync helps this case, because instead of waiting for an entire refresh cycle, we can pause the current frame until the data arrives and then put it on screen. Instead of a full frame drop, we get a partial frame drop, which can only help. At my hypothetical 120Hz, if we dip to 115Hz for some firefight, instead of pixel being stale for 8.3ms, we are stale for only 400us.[/quote] Not to poke a hole in this, but the Oculus Rift runs at 60hz, not 120hz. The way the rift does 3D is one screen, at 60hz, where the image is split in two parts. So working on an assumption of a 1080p screen, it is rendering two 960x1080 images at 60hz, rather than a typical 3D vision-like display which is rendering two 1920x1080 images and requires a 120hz display. [/quote] Except it's not rendering at 60hz. Every CES article has stated it's running at a high HZ OLED screen. And if you watch this video: http://www.youtube.com/watch?v=3Gs8iy6AdD8&feature=youtu.be Palmer is specifically asked about framerate and while he won't say exactly what the hz is, he plainly declares it's not 60hz. That it's "significantly" higher than 60hz. Significantly is open to interpretation, but it's definitely not 60.
Alo81 said:
bo3b said:
Paul33993 said:The hype about G-Sync has gotten a little out of hand IMO. People are demanding it for the Rift on every message board in existence. G-Sync is the solution to sample-and-hold on stationary displays. The Rift is not a stationary display (you're inside the world), and it's low persistence: the Rift is flashing the image and going blank. G-Sync with this type of display would be awful. Motion when moving your head would be choppy (at low Hz), and strobing the rendering at 35Hz would be a flickering mess.

G-Sync is great, but it's really not applicable, at all, to an OLED screen that's outputting at low persistence. It would not be desirable at all.
Based on my understanding of GSync, I don't agree with this. If anything, GSync inside Rift will be even more important than on a monitor.

On a monitor, you can also think of it as a non-stationary display if you are wildly whipping around all the time. Every frame, every pixel changes. Very similar to moving your head.

In Rift, and on monitor, if you can sustain 120fps, you won't need GSync. But just like a monitor, if you dip below 120Hz for any reason, that last frame is stale. On a monitor you have the option to disable vSync, but in Rift screen tearing is particularly jarring and probably a non-starter for nearly everyone.

In the vSync ON case, in Rift, if you dip below the refresh rate of 120Hz, then you have to wait another entire frame before updating the screen. Whether this is by flashing the pixel or sample and hold is really irrelevant: the data is stale, and it will look like smear in the Rift. It will also add perceived latency, because as you move your head you get slow updates, which will damage the immersion.

GSync helps this case, because instead of waiting for an entire refresh cycle, we can pause the current frame until the data arrives and then put it on screen. Instead of a full frame drop, we get a partial frame drop, which can only help. At my hypothetical 120Hz, if we dip to 115Hz for some firefight, instead of the pixel being stale for 8.3ms, it's stale for only ~400us.


Not to poke a hole in this, but the Oculus Rift runs at 60Hz, not 120Hz.

The way the Rift does 3D is one screen, at 60Hz, where the image is split in two parts.

So working on the assumption of a 1080p screen, it is rendering two 960x1080 images at 60Hz, rather than a typical 3D Vision-like display, which renders two 1920x1080 images and requires a 120Hz display.


Except it's not rendering at 60Hz. Every CES article has stated it's running a high-refresh-rate OLED screen. And if you watch this video:

http://www.youtube.com/watch?v=3Gs8iy6AdD8&feature=youtu.be

Palmer is specifically asked about framerate, and while he won't say exactly what the Hz is, he plainly declares it's not 60Hz: that it's "significantly" higher than 60Hz. Significantly is open to interpretation, but it's definitely not 60.

#41
Posted 01/14/2014 07:41 PM   
He was talking to hiphopgamer... he may have had an aneurysm. [joke. maybe]
It's a prototype though; who knows what's final. I would be shocked/displeased if they had to use 120Hz though. They've got to do what they've got to do, I suppose. That really cuts into the future of higher-res displays.

Co-founder of helixmod.blog.com

If you like one of my helixmod patches and want to donate. Can send to me through paypal - eqzitara@yahoo.com

#42
Posted 01/14/2014 08:29 PM   
I saw somewhere that they were speculating Eve Valkyrie was rendering at 78fps (supposedly visible in a video when they switched modes between sample-and-hold and low persistence).

Also, VentureBeat put up a really good interview today:


What low persistence does — because we’re only going to be able to get latency of motion down to 15 or 20 milliseconds – it helps avoid that problem. When you get the image, it’s great for the first one or two milliseconds, and then instead of keeping it on the screen, we turn off the screen. Normally it would be good for a few milliseconds, then bad, bad, bad, then good again when you got another image, then bad. Now it’s good, then off, and then you get another good one, then off. You’re getting this image, and then it goes dark on the screen for the next 10 or 11 milliseconds until you get the new image. We do it fast enough, at a really high refresh rate, that you don’t see it. You can’t see the flicker that’s caused.

It’s the same latency you would have gotten. It’s just that the persistence of that bad image is no longer there. You could avoid low persistence if you could run the screen at a few thousand hertz and only have, say, one millisecond of time between frames. But that’s not practical to tell game developers, “Hey, if you want to make VR games, you have to run at 1,000 FPS.” We want to say, “You can make great virtual reality, and you only need to run at 40, 50, 60 on your game engine.” The rendering engine will need to run a little bit faster, in sync with the refresh rate of the screen. But it’s very practical. People shouldn’t have too hard a time with where they are today.


I don't really understand how they're pulling that off, but I'll give them the benefit of the doubt.
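
If it helps, here's a minimal sketch of the duty cycle that quote describes (the 1-2ms lit and 10-11ms dark figures are the interview's; the resulting ~80Hz refresh is my inference):

# Low-persistence duty cycle from the interview's rough figures.
lit_ms = 2.0                 # frame is actually displayed
dark_ms = 10.5               # screen blanked until the next frame
cycle_ms = lit_ms + dark_ms  # 12.5 ms per cycle -> ~80 Hz refresh

print(f"~{1000 / cycle_ms:.0f} Hz refresh, lit {lit_ms / cycle_ms:.0%} of each cycle")
# Sample-and-hold would leave the increasingly stale frame up for the full
# 12.5 ms; low persistence swaps that smear for a brief flash, counting on
# the refresh rate being high enough that the flicker is invisible.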

#43
Posted 01/14/2014 09:42 PM   
The only thing I am confused about is how much "performance power" 3D needs to stay at 60FPS or more. Some people in EVGA's community said that even a GTX 780 Ti won't be able to keep those rates in all games maxed out at 1080p once all the Nvidia techs are combined.
So, for that reason and some others, I decided to wait for Haswell-E CPUs and Maxwell GPUs, to make sure performance is enough for this and the next 2 generations of games in 3D on 1080p and 1440p monitors. For 4K I will wait; the prices won't come down soon...

#44
Posted 01/15/2014 05:31 PM   
At last, Asus has finally confirmed the PG278Q will be 3D Vision capable!! Read the last post here by the Asus rep:


http://rog.asus.com/forum/showthread.php?42516-ROG-SWIFT-PG278Q-Nvidia-3D-Vision


I'll take 3 thanks!!

#45
Posted 01/15/2014 08:55 PM   