[quote="SAproX1"][quote="xXxStarManxXx"][quote="SAproX1"]Hello guys,
I just installed the NaturalVision Remastered graphics mod and tested it in 3D mode (the quality is insanely good), but my GPU sits at only 50-60% usage; in 2D mode the GPU is at 99%.
Has anyone experienced the same situation, or have any idea how to fix it?
Thanks![/quote]
3D Vision causes games to only utilize 3 cores; it's known colloquially as the "3 Core Bug". This introduces severe CPU bottlenecking in titles that are heavily CPU dependent, GTA 5 being one of them.
A newer processor with higher IPC and frequency helps: my 3D Vision FPS in GTA 5, overlooking Los Santos from Franklin's balcony, went from around 35 to 55 going from a 4930K @ 4.5 GHz to an 8700K @ 5.0 GHz. That said, I still had brutal dips down to 50 FPS in that area, and the game ran so stuttery that I opted for fluid 120+ FPS with G-Sync over 3D Vision. A lot of titles are like this, like Batman: Arkham Knight, where flying around causes the game to dip down to 40 FPS in certain areas with severe hitching because of the same 3 Core Bug.
But honestly, GTA 5 just isn't fun anymore, especially the multiplayer side. Once you're through the single player there really isn't much point in keeping the game.[/quote]
Good information, thanks. I'm trying mods and finding them really amazing; testing some car mods with NVR is something I'm actually enjoying. Maybe I'm late, but performance-wise I think we've just reached (with the 2080 Ti) the point where these mods can be played at max settings with 8x MSAA (the difference between 4x and 8x MSAA is visible at 2K, by the way) at 120+ FPS in 2K and above.[/quote]
Please see my new thread about GTAV settings....
Asus Maximus X Hero Z370
MSI Gaming X 1080Ti (2100 mhz OC Watercooled)
8700k (4.7ghz OC Watercooled)
16gb DDR4 3000 Ram
500GB SAMSUNG 860 EVO SERIES SSD M.2
[quote="RAGEdemon"]Possible alleviation of the problem to a degree:
A common misconception is that low FPS causes stuttering - to an extent it does, but the majority of the time this is not the case. The perceived stuttering actually comes from frames produced out of sync because of some bottleneck, and this is what G-Sync / FreeSync were designed to combat - it's why you can have low FPS with G-Sync or FreeSync and still perceive the gameplay as fluid.
Of course we can't have this with 3D Vision (well, you could, but the driver doesn't implement my personal idea of an improved "3D Vision" that would allow for such a thing, including 120 FPS in 3DV on 120Hz+ monitors - proven to work).
We can, however, have something simple which does the job - if 98% of your frames are above 50 FPS, then lock your max FPS to 50, e.g. by using Nvidia Inspector (the option next to the VSync settings, IIRC). What this does is give your gaming system some breathing room so it can generate frames in sync.
This will give you the smooth 3D Vision experience you want, albeit at a slightly lower FPS :) [/quote]
I've actually tried this in the past (The Witcher 3) without success. Setting the refresh rate to 100 Hz (PG278Q), which makes for 50 Hz per eye in 3D Vision, did work, however, and it was fluid like butter, albeit with a subtle flicker on the horizon and a noticeably dimmer image (the higher the refresh rate, the brighter the image).
In the end I couldn't take the flicker, which was especially noticeable in the skybox in The Witcher 3, and went back to 120 Hz in the Control Panel and just turned a few settings down.
But yeah, for whatever reason, artificially limiting the FPS via RTSS never worked for me. What sucks is that the FPS can be 55 and it will still stutter. If it's not a solid 60 FPS there's stutter, and it's really aggravating - probably the number one reason I opt for 2D G-Sync over 3D Vision in a lot of my games. Being able to turn down or disable settings that affect GPU load is one thing, but when you end up with a CPU bottleneck there's not a whole lot you can do to mitigate it, and that's extremely frustrating.
Honestly, I don't know why this 3 Core Bug is even still an issue. Why doesn't Nvidia do something about it? Are they trying to kill off 3D Vision? Because that's what it looks like.
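(Aside for anyone who wants to verify the bottleneck on their own machine rather than take the FPS numbers on faith: a minimal sketch that samples per-core CPU load while the game is running. It assumes the third-party psutil package is installed, and the 50% "busy" threshold is arbitrary - if only three or four logical cores ever show real load while the GPU sits well below 99%, you're looking at the same CPU-side limit described above.)
[code]
# Rough check for a "3 Core Bug"-style bottleneck: sample per-core CPU usage
# while the game runs and count how many cores are doing meaningful work.
# Assumes: pip install psutil; the 50% "busy" threshold is arbitrary.
import psutil

SAMPLES = 10           # number of one-second samples
BUSY_THRESHOLD = 50.0  # percent usage above which a core counts as "busy"

busy_counts = []
for _ in range(SAMPLES):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # % per logical core
    busy = sum(1 for usage in per_core if usage >= BUSY_THRESHOLD)
    busy_counts.append(busy)
    print(f"busy cores: {busy:2d}  per-core usage: {per_core}")

print(f"typical busy cores over {SAMPLES}s: {sorted(busy_counts)[SAMPLES // 2]}")
[/code]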
i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703
If we can manage to keep it Civil, xXxStarManxXx, this thread might serve as a valuable info resource for posterity :)
What we want to do is keep the refresh rate at 120Hz but, at the same time, limit the FPS to e.g. 50. I have never tried RTSS, but I can confirm Nvidia Profile Inspector works flawlessly (it looks complicated but it's actually extremely easy to use!):
[url]https://github.com/Orbmu2k/nvidiaProfileInspector/releases[/url]
Detailed Instructions here:
[url]https://forums.guru3d.com/threads/nvidia-inspector-introduction-and-guide.403676/[/url]
Another factor is that nVidia's marketing team always show G-Sync videos compared against double-buffered VSync, which makes it look fantastic. This is an unfair comparison because most games nowadays, OpenGL and DX alike, are coded with triple buffering support. Regardless, 3DV forces triple buffering by default, so simply capping the FPS ensures fluidity in almost all cases, assuming your FPS does not drop below the cap!
Try it out - you might find that the result is good enough vs real G-Sync for you to fall in love with 3DV again.
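(To make the "breathing room" idea concrete, here is a minimal toy frame limiter in Python - a sketch of the general technique only, not how the driver, Inspector, or RTSS actually implement their caps. The frame times are simulated.)
[code]
# Toy frame limiter: render whenever, but only present on a fixed 1/CAP_FPS
# schedule, so presented frames stay evenly spaced as long as each frame
# finishes inside its budget. Game work is simulated with a random sleep.
import random
import time

CAP_FPS = 50
FRAME_BUDGET = 1.0 / CAP_FPS  # 20 ms per presented frame at a 50 fps cap

def render_frame() -> None:
    """Stand-in for the game's real work; takes a variable 10-18 ms."""
    time.sleep(random.uniform(0.010, 0.018))

last_present = time.perf_counter()
next_present = last_present + FRAME_BUDGET
for frame in range(100):
    render_frame()
    delay = next_present - time.perf_counter()
    if delay > 0:
        time.sleep(delay)                   # finished early: wait out the budget
    else:
        next_present = time.perf_counter()  # missed the budget: resync (this is the judder case)
    now = time.perf_counter()
    print(f"frame {frame:3d}: {(now - last_present) * 1000:5.1f} ms since previous present")
    last_present = now
    next_present += FRAME_BUDGET
[/code]
As long as every frame renders inside its 20 ms budget, presents land on a steady 20 ms grid; only frames that blow the budget show up as uneven pacing, which is why capping below your typical minimum FPS smooths things out.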
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
Capped 50fps at 60Hz has judder. Predictable judder, but it exists. I prefer the 100Hz 3D option.
If you have ULMB available, you can go even lower, down to 64Hz (32Hz per eye), but with reversed eyes and worse overdrive than Lightboost. I'm very resistant to flickering, but if 50Hz bothers you... yeah, don't go lower.
Btw, 100Hz shouldn't be dimmer. Lightboost uses a higher persistence level to compensate (so it has more motion blur).
I wish I could try a 240Hz monitor to see what the limit of the emitter and glasses is. 165Hz works.
I can confirm masterotaku's observations in GTA5 - stuttering at both 50fps and 48fps locked, as well as 3DV anomalies. Also tried FastSync, VSync off, etc.
30 FPS is fluid as expected, but it's intolerably low and laggy.
+ The game displays 3DV anomalies accompanying the stuttering. I think the game's implementation of native 3DV is fundamentally different in some ways from autostereo games... Fascinating.
[quote="masterotaku"]Capped 50fps at 60Hz has judder. Predictable judder, but it exists. I prefer the 100Hz 3D option.
If you have ULMB available, you can go even lower, down to 64Hz (32Hz per eye), but with reversed eyes and worse overdrive than Lightboost. I'm very resistant to flickering, but if 50Hz bothers you... yeah, don't go lower.
Btw, 100Hz shouldn't be dimmer. Lightboost uses a higher persistence level to compensate (so it has more motion blur).
I wish I could try a 240Hz monitor to see what's the limit of the emitter and glasses. 165Hz works.[/quote]
It is absolutely dimmer.
Think of a light bulb. A 100 Hz light bulb is noticeably brighter than a 60 Hz one.
Also, running a custom resolution of -10% in NVCP is dimmer as well, because there are fewer lines of resolution.
You can't see how resolution and Hz affect how bright the image is?
I can absolutely notice it, it's not something that is intolerable but it's there.
Just to clarify, this is only something you can see by limiting the actual refresh rate to 50 Hz. If you have a 60 Hz monitor and the FPS dips to 50 FPS it isn't the same thing as having the refresh rate itself at 50 Hz.
I was unaware that 3D Vision works at 165 Hz. So that's around 80 FPS per eye? What is that like with 3D Vision? Yeah, I'm curious whether the glasses and emitter can do 3D Vision with a 240 Hz panel. Honestly, I wouldn't mind picking up a 240 Hz 1080p panel solely for 3D Vision if it works; they're relatively cheap.
[quote="RAGEdemon"]If we can manage to keep it Civil, xXxStarManxXx, this thread might serve as a valuable info resource for posterity :)
What we want to do is keep the refresh to 120Hz but at the same time, limit the FPS to e.g. 50fps. I have never tried RTSS but can confirm Nvidia Profile Inspector works flawlessly (it looks complicated but it's actually extremely easy to use!):
[url]https://github.com/Orbmu2k/nvidiaProfileInspector/releases[/url]
Detailed Instructions here:
[url]https://forums.guru3d.com/threads/nvidia-inspector-introduction-and-guide.403676/[/url]
Another factor is that nVidia's marketing team always show GSync videos compared to double buffered VSync which makes it look fantastic. This is an unfair comparison because most games nowadays, OpenGL and DX, are coded with triple buffering support. Regardless, 3DV forces triple buffering by default, so simply capping the FPS ensures fluidity in almost all cases, assuming one's FPS does not drop below the set FPS!
Try it out - you might find that the result is good enough vs real G-Sync for you to fall in love with 3DV again.
[/quote]
I've been with Nvidia and AMD hardware since 2011, I know how to use Nvidia Inspector, thanks though!
[quote="xXxStarManxXx"][quote="masterotaku"]Capped 50fps at 60Hz has judder. Predictable judder, but it exists. I prefer the 100Hz 3D option.
If you have ULMB available, you can go even lower, down to 64Hz (32Hz per eye), but with reversed eyes and worse overdrive than Lightboost. I'm very resistant to flickering, but if 50Hz bothers you... yeah, don't go lower.
Btw, 100Hz shouldn't be dimmer. Lightboost uses a higher persistence level to compensate (so it has more motion blur).
I wish I could try a 240Hz monitor to see what's the limit of the emitter and glasses. 165Hz works.[/quote]
It is absolutely dimmer.
Think of a light bulb. A 100 Hz light bulb is noticeably brighter than a 60 Hz one.
...
You can't see how resolution and Hz affect how bright the image is?
....
[/quote]
I'm (naturally) skeptical about this. Wouldn't a given display run different timings for the input pulses based on frequency? If you're talking about ULMB/black frame insertion, then yeah, I can see how that dims the image. But for a given display running at, say, 50Hz rather than 100Hz, wouldn't the total 'on/off' or firing times be exactly the same?
Genuine question - it's a new one to me, and I've never heard the light bulb comparison either. It just doesn't ring true.
GTX 1070 SLI, i7-6700K ~ 4.4GHz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D Vision kit, Xpand X104 glasses, Corsair HX1000i, Win 10 Pro 64 / Win 7 64 - https://www.3dmark.com/fs/9529310
[quote="xXxStarManxXx"]
It is absolutely dimmer.
Think of a light bulb. A 100 Hz light bulb is noticeably brighter than a 60 Hz one.
Also, running custom resolution of -10% in NVCP is also dimmer because there's less lines of resolution.
You can't see how resolution and Hz affect how bright the image is?
I can absolutely notice it, it's not something that is intolerable but it's there.
Just to clarify, this is only something you can see by limiting the actual refresh rate to 50 Hz. If you have a 60 Hz monitor and the FPS dips to 50 FPS it isn't the same thing as having the refresh rate itself at 50 Hz.
I was unaware that 3D Vision works at 165 Hz. So it's 80 FPS? What is that like with 3D Vision? Yeah I'm curious as to whether or not the glasses and emitter can do 3D Vision with a 240 Hz panel. Honestly, I wouldn't mind picking up a 240 Hz 1080p panel solely for 3D Vision if it works, they are relatively cheap.
[/quote]
Maybe it's dimmer on your monitor. Or just a bit in general. However it's a fact that Lightboost at 100Hz has more persistence:
[img]https://www.blurbusters.com/wp-content/uploads/2013/06/motion-blur-graph.png[/img]
More or less compensating for the supposedly 20% greater brightness of 120Hz. In one second you get (at 100% Lightboost) 280ms worth of light at 100Hz and 288ms at 120Hz.
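(Spelling out the arithmetic implied by those figures - derived only from the 280 ms / 288 ms totals above, not from an independent persistence measurement - a quick sketch:)
[code]
# Implied per-refresh strobe length and backlight duty cycle, derived only
# from the 280 ms / 288 ms light-per-second totals quoted above.
for hz, lit_ms_per_second in ((100, 280), (120, 288)):
    strobe_ms = lit_ms_per_second / hz   # ms of light per refresh
    duty = lit_ms_per_second / 1000      # fraction of each second the backlight is lit
    print(f"{hz} Hz: {strobe_ms:.1f} ms per refresh, {duty:.1%} duty cycle")
# -> 100 Hz: 2.8 ms per refresh, 28.0% duty cycle
# -> 120 Hz: 2.4 ms per refresh, 28.8% duty cycle
[/code]
Going by those numbers, the backlight is lit for a nearly identical fraction of each second in both modes, which is why the two refresh rates should end up at roughly the same average brightness.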
About 3D at 165Hz, it's absolutely worthless. I mean, it works, but there isn't any kind of Lightboost or ULMB at that refresh rate so you see the full pixel transitions per eye. And it's baaad. Almost like looking at it without glasses. Except for the bottom 20% of the screen maybe.
For 240Hz monitors, there's ULMB at 144Hz, which can be pushed to 151-155Hz with custom timings (I don't remember the exact limit). I assume it won't be good for crosstalk anyway.
At any refresh rate other than 100Hz and 120Hz you get reversed eyes and the red Nvidia warning, though. And ULMB isn't as good as Lightboost for crosstalk. Maybe tolerable at low refresh rates.
I see starfraud has gone mysteriously quiet again when confronted with evidence.
Presumably he's off researching YouTube to see if 60 Hz bulbs in the US are brighter than 50 Hz bulbs in the EU.
[quote="masterotaku"][quote="xXxStarManxXx"]
It is absolutely dimmer.
Think of a light bulb. A 100 Hz light bulb is noticeably brighter than a 60 Hz one.
Also, running custom resolution of -10% in NVCP is also dimmer because there's less lines of resolution.
You can't see how resolution and Hz affect how bright the image is?
I can absolutely notice it, it's not something that is intolerable but it's there.
Just to clarify, this is only something you can see by limiting the actual refresh rate to 50 Hz. If you have a 60 Hz monitor and the FPS dips to 50 FPS it isn't the same thing as having the refresh rate itself at 50 Hz.
I was unaware that 3D Vision works at 165 Hz. So it's 80 FPS? What is that like with 3D Vision? Yeah I'm curious as to whether or not the glasses and emitter can do 3D Vision with a 240 Hz panel. Honestly, I wouldn't mind picking up a 240 Hz 1080p panel solely for 3D Vision if it works, they are relatively cheap.
[/quote]
Maybe it's dimmer on your monitor. Or just a bit in general. However it's a fact that Lightboost at 100Hz has more persistence:
[img]https://www.blurbusters.com/wp-content/uploads/2013/06/motion-blur-graph.png[/img]
More or less compensating the supposedly 20% more brightness of 120Hz. In one second, you get (at 100% Lightboost) 280ms worth of light at 100Hz and 288ms at 120Hz.
About 3D at 165Hz, it's absolutely worthless. I mean, it works, but there isn't any kind of Lightboost or ULMB at that refresh rate so you see the full pixel transitions per eye. And it's baaad. Almost like looking at it without glasses. Except for the bottom 20% of the screen maybe.
For 240Hz monitors, there's ULMB at 144Hz, which can be pushed to 151-155Hz with custom timings (I don't remember the exact limit). I assume it won't be good for crosstalk anyway.
At any refresh rate other than 100Hz and 120Hz you get reversed eyes and the red Nvidia warning, though. And ULMB isn't as good as Lightboost for crosstalk. Maybe tolerable at low refresh rates.[/quote]
To my eyes there's a difference in brightness; I encourage you to go back and forth between 100 and 120 Hz 3D Vision to see what I'm talking about. It was one of the first things I noticed. I'm using a PG278Q - maybe it's a limitation of Lightboost on this monitor - but yeah, there's a noticeable difference in brightness between 100 and 120 Hz 3D Vision on this panel.
[quote="rustyk21"]I see starfraud has gone mysteriously quiet again when confronted with evidence.
Presumably he's off researching youtube to see if 60hz bulbs in the US are brighter than 50hz bulbs in the EU. [/quote]
Shit-in-a-bowl attempts to refute scientific reality with anger driven opinion.
https://www.quora.com/Will-the-intensity-of-light-increase-with-increase-in-frequency
"Light and any form of electromagnetic radiation, such as radio waves, are really photons. A photon is the smallest possible quantum of light. In general when you turn up the intensity of light you are increasing the number of photons per second that are emitted by the light source. Therefore the intensity of the light can indeed be changed independently of the frequency (or color) of the light. So you can have high intensity (lots of photons), low frequency light or low intensity (few photons), high frequency light - they are independent variable.
However, one way of measuring the intensity of a light would be to measure the energy of the light. Now it turns out that the energy in a given photon is proportional to the frequency of the photon. In particular the equation is:
E=hν
Where E is the energy, h is Planck's constant and ν is the frequency of the photon. So this is one way in which you could say that the intensity does increase with frequency. If you have two light beams at different frequencies but with the same number of photons per second, the higher frequency light will have more energy in its light beam.
In particular if you are trying to make very weak beams of light, the least you can do is to have only a single photon of light. In that case if you compare two single photons, the higher frequency photon will have more energy and thus be more "intense" than the lower frequency photon."
So it's got nothing to do with wattage then? The light output (measured in lumens) of an energy source is a function of frequency? Well, for some reason you googled frequency and came back with some impressive-sounding science (which you don't understand), even though we were talking about displays, refresh rates and light bulbs.
Do you think the photons coming out a 50hz display have less energy than the photons coming out of a 100hz display?
Do you understand the relationship between frequency, wavelength and colour?
You do know we're talking about visible light don't you?
You don't know what you're talking about and have been found out again. Plus the insults... You act like a rude child. Now run over to the 2080 thread to post some more banal drivel.
And where's my apology? You'd have thought that as a 40-year-old combat veteran you'd have a bit more dignity and class.