Faking 120 Hz...? What if I were to output two frames per refresh?
Hello, I have a question. When you're watching a 120 Hz display, you're getting more frames per second, one per refresh. Now, if you were to take a 75 Hz monitor and output two frames per refresh, that is, the screen advances two frames per refresh cycle, do you think you could fake 150 Hz? This is just me doing some research; I don't want to go out and buy myself a 120 Hz monitor. I'd rather get something touchscreen.
So: 75 Hz at 2 frames per refresh = 150 Hz throughput?
Do you think I'm onto something, and that it would be possible to run something like Nvidia's 3D Vision glasses this way? Just a question.
What if they came up with a technology called "2-sync" or something? Where VSync syncs one frame to each refresh, what if you were outputting two frames per refresh instead?
If, for example, you made the graphics card's output flicker too, so it flickers along with the monitor, I may be a little off, but if you flicker in time between the monitor's flickers, won't that double your refresh rate?
Hmmm...
What I mean to say is: flicker from the graphics card, in tune with the monitor's refresh flicker. Do you think you could double the refresh rate that way?
Flickin' aye...
:P
And if there's a bottleneck between refresh rate and graphics output, the graphics card can definitely work faster than the bottleneck.
[quote name='maxmayhem' post='1103967' date='Aug 15 2010, 03:02 AM']What I mean to say is: flicker from the graphics card, in tune with the monitor's refresh flicker. Do you think you could double the refresh rate that way?[/quote]
OK, umm, huh? Trying to decipher what you're asking put my brain into a frenzy. However, after a reboot of my thoughts, it seems like what you're describing is the basic idea behind interlaced video, which causes its own issues that wreak havoc on picture quality. "Jaggies" and other video anomalies are the result of alternating scan lines, and similar syncing problems would be quite evident with an alternating-frame system, much like the "Alternate Frame Rendering" option for SLI users, which is known to increase tearing and micro-stutter.
In fact, this has been done before. I believe it's how some past 3D tech worked, such as the 3D glasses for the Sega Master System: those were also shutter glasses that could pair with any standard television.
The obvious problem with that is the flickering caused by the low refresh rate of the TV's already interlaced image.
Nvidia's current 3D offering requires a 120 Hz display to work correctly, but that's where the confusion comes in for some. When you game in 3D you aren't looking at a 120 Hz image in the traditional sense; what you're seeing is two images at 60 Hz each, configured in a manner similar to what you describe. The difference is that each eye is treated to a 60 Hz refresh rate, which provides a smooth framerate for stutter-free gaming and very low flicker while viewing in 3D.
Think of it this way: each of the two frames could be sent to its own standard, run-of-the-mill 60 Hz LCD screen. The 120 Hz capability of "3D Vision Ready" displays really functions as two separate 60 Hz streams, converged in such a way as to provide a single image when viewed through the shutter glasses.
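To put rough numbers on that, here's a quick sketch (just my own hypothetical Python, not anything out of Nvidia's driver) of how the refreshes of a frame-sequential 120 Hz panel get split between the two eyes:
[code]
# Minimal sketch (hypothetical Python, not Nvidia's driver code) of the
# frame-sequential idea described above: a 120 Hz panel alternates left- and
# right-eye frames, and the shutter glasses open the matching eye on each
# refresh, so each eye effectively sees a 60 Hz image.

DISPLAY_HZ = 120   # refresh rate of a "3D Vision Ready" panel
SECONDS = 1

left_eye_frames = 0
right_eye_frames = 0

for refresh in range(DISPLAY_HZ * SECONDS):
    if refresh % 2 == 0:
        # even refresh: panel shows the left-eye image, left shutter open
        left_eye_frames += 1
    else:
        # odd refresh: panel shows the right-eye image, right shutter open
        right_eye_frames += 1

print(f"Panel refreshes per second : {DISPLAY_HZ}")       # 120
print(f"Left-eye images per second : {left_eye_frames}")   # 60
print(f"Right-eye images per second: {right_eye_frames}")  # 60
[/code]
The panel is still only refreshing 120 times a second; the "doubling" is just in how those refreshes are divided between your eyes.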
Even at the graphics card level, then... I was thinking maybe you could squeeze a few extra frames in between a monitor's refreshes, therefore faking an extra 60 Hz through a 60 Hz monitor...
Hmmm.
[quote name='tritosine' post='1103984' date='Aug 15 2010, 01:15 PM']Dude, the LCD screen draws the picture upside down... you can't put in two frames while it's drawing. :))))
Get a CRT monitor, you might end up with better 3D.[/quote]
Hmm, I get it.
What Nvidia's 3D Vision does is sync to the LCD's refresh rate and the glasses' shutters, transmitting left and right images to the left and right eye respectively.
I could be wrong, apparently. But if graphics cards flickered their output too, along with the monitor and the 3D shutter glasses, all in sync, you'd think that would work.
The graphics card could refresh and feed the monitor, the monitor could squeeze in the extra refreshes, and the Nvidia 3D Vision glasses could catch it along the way.
You have to think a little beyond what current technology offers, but I think a driver could do it.
It's just a theory. It could work, I could be wrong, but it was worth posting about.
No, it couldn't. You can't "squeeze" an extra frame into the refresh cycle a monitor completes.
If your graphics card is playing a really old game and outputting 500 FPS, which can easily happen, the monitor still only displays 75 Hz, because that is all it's capable of. In some situations you get screen tearing because your current FPS doesn't divide evenly by 75, so the graphics card is midway through outputting a frame when the monitor refreshes; you get half of one frame on screen while the other half is a different frame, and they don't line up.
VSync solves this by limiting the graphics card's output to the same rate as the monitor.
You cannot trick, hack, or bypass the monitor's refresh rate in any way; you need to pay the money to get the features. That's all there is to it: you must upgrade.
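To put rough numbers on it, here's a quick sketch (hypothetical Python with made-up figures, not real driver code) showing why rendering 500 FPS on a 75 Hz monitor gets you nothing extra: the monitor only picks up the most recently completed frame at each of its 75 refreshes, and everything rendered in between is never shown.
[code]
# Minimal sketch (hypothetical Python, made-up numbers, not real driver code)
# of the bottleneck described above: the GPU can finish frames as fast as it
# likes, but the monitor only samples the latest completed frame once per
# refresh, so everything rendered in between is simply never displayed.

RENDER_FPS = 500    # e.g. an old game running uncapped
MONITOR_HZ = 75     # what the panel is physically capable of

displayed_frames = set()
for refresh in range(MONITOR_HZ):          # 75 refreshes in one second
    refresh_time = refresh / MONITOR_HZ    # when this refresh happens
    # index of the most recent frame the GPU had finished by this refresh
    latest_finished = int(refresh_time * RENDER_FPS)
    displayed_frames.add(latest_finished)

print(f"Frames rendered in one second  : {RENDER_FPS}")             # 500
print(f"Frames actually shown on screen: {len(displayed_frames)}")  # at most 75
print(f"Frames that were never visible : {RENDER_FPS - len(displayed_frames)}")
[/code]
With VSync on, the card simply waits and renders only those 75 frames in the first place, which is exactly the limiting described above.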