I am not sure how that would work with gaming. The problem with games is that unless you have an extra-beefy machine, there is no way to guarantee the same FPS throughout the game. Also, on a 120Hz monitor you may have to limit the FPS to 60 or some other factor of 120 to even try using that method. A 3D TV hooked up to a PC over HDMI receives a 60Hz input for regular 2D viewing, so the TV's hardware/software can actually interpolate that.
But remember, in the case of the TV it has to:
1) first receive frame 1 (it might display it then, but I think the TV actually buffers it)
2) wait 1/60 of a second and receive frame 2
3) calculate frame 1.5, the middle frame (I have no idea how long that takes)
4) display frame 1.5 while frame 3 is being buffered
5) repeat the process.
That adds a lot of lag; a rough sketch of the added latency follows below.
Take a look at two TVs side by side showing the same content, one with the anti-judder tech and one without. You should notice the one with anti-judder running a few frames behind.
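To put rough numbers on it, here is a minimal sketch of the latency that buffering imposes. All the constants are assumptions (a TV's actual compute time is unknown, and real sets often buffer more than one frame); the point is only that the output must always run at least one source frame behind.

```python
# Hypothetical latency model for a TV-style interpolator on a 60Hz input.
# INTERP_COMPUTE is a guess; the real value depends on the TV's processor.

FRAME_INTERVAL = 1 / 60   # seconds between incoming frames (~16.7 ms)
INTERP_COMPUTE = 0.005    # assumed time to calculate the middle frame

# Frame 1 cannot leave the buffer until frame 2 has arrived and
# frame 1.5 has been computed, so every output frame is delayed by:
added_latency = FRAME_INTERVAL + INTERP_COMPUTE

print(f"added latency: {added_latency * 1000:.1f} ms "
      f"(about {added_latency / FRAME_INTERVAL:.1f} source frames)")
```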
Intel Core i9-9820X @ 3.30GHz
32GB RAM
2 x EVGA RTX 2080 Ti Gaming
3 x ASUS ROG SWIFT 27" 144Hz G-SYNC Gaming 3D Monitor [PG278Q]
1 x ASUS VG278HE
Nvidia 3Dvision
Oculus Rift
HTC VIVE
Windows 10
I am also wondering why this feature is still not available. Input lag would be a small issue, but the GPU resource savings, which could be exchanged for a quality boost, could bring HUGE benefits. I would assume that a
120 FPS (60 rendered + 60 interpolated frames) high-quality game would look much better than a
120 FPS medium-quality game.
Gamers with native 120Hz/144Hz monitors like the ASUS VG278HE could benefit a lot from this technology; a rough budget sketch follows below.
The amount of realism that could be added to games by this technology is amazing. Motion interpolation offers a very cost-effective quality boost, while other options, like increasing the number of transistors, cost huge amounts of money. If proven correct, then whoever implements it first (nVidia, AMD or Intel) will get a significant temporary advantage in the market, and it would be crazy to ignore it.
nVidia have the resources, and they could simply use the help of the SmoothVideo Project developers, who have the knowledge and are doing this for free (if competitors don't snatch the opportunity first, of course).
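Here is a back-of-the-envelope sketch of the budget argument. All timings are made up for illustration; the point is that rendering half the frames and interpolating the rest could hit 120 FPS at settings where rendering every frame natively would be impossible.

```python
# Illustrative GPU budget per second of output (all costs are assumptions).

TARGET_HZ = 120
RENDER_COST = 0.014   # assumed seconds to render one frame at high settings (~71 FPS capability)
INTERP_COST = 0.002   # assumed seconds to interpolate one in-between frame

# Hybrid: render 60 real frames, interpolate the other 60.
hybrid = (TARGET_HZ // 2) * RENDER_COST + (TARGET_HZ // 2) * INTERP_COST
# Native: render all 120 frames at the same high settings.
native = TARGET_HZ * RENDER_COST

print(f"hybrid 60+60: {hybrid:.2f} s of GPU work per output second")  # 0.96 -> feasible
print(f"native 120:   {native:.2f} s of GPU work per output second")  # 1.68 -> not feasible
```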
Pre-recorded video is "static": each frame is pre-rendered. Interpolation works by comparing the previous frame with the next frame, estimating via an algorithm what a compatible in-between frame would look like, and inserting that frame between the two. Interpolation often leads to unwanted artifacts, but it has been greatly improved since its inception.
Video games are "dynamic": the next frame depends on what the player does, so the GPU doesn't know ahead of time what the next frame will be. The GPU would thus need to hold back the first frame until the second frame is known, then compute the interpolated frame, and finally send them out to the display. Think about it: can you see the inherent input delay (lag)?
The closest thing is frame buffering, which is already used depending on certain factors.
If you want interpolation in a video game, get a monitor that does it via post-processing of the input signal. But know this: there is an inherent lag associated with it. Most TVs have a Game Mode that turns off interpolation for a reason.
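For what it's worth, here is the crudest possible sketch of the "compare two frames, insert one between them" idea described above: a plain 50/50 blend in Python/NumPy. Real interpolators (SVP included) use motion-compensated estimation instead, so this is only to make the ordering constraint concrete, not how any shipping product does it.

```python
import numpy as np

def midpoint_frame(prev_frame: np.ndarray, next_frame: np.ndarray) -> np.ndarray:
    """Naive 'frame 1.5': average the two neighboring uint8 frames.

    Ghosts badly on fast motion; motion-compensated methods track pixel
    movement between the frames instead of blending in place."""
    avg = (prev_frame.astype(np.uint16) + next_frame.astype(np.uint16)) // 2
    return avg.astype(np.uint8)

# The ordering constraint that creates the lag: frame N cannot be sent to
# the display until frame N+1 has been rendered, because the in-between
# frame needs both. The pipeline therefore always runs a frame behind.
```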
I think it's a personal thing. I love frame interpolation and use it on my TV all the time.
IMO, the 'soap opera' effect is just a mental hangover from films always having been 24 fps. That rate is almost arbitrary, chosen for technological reasons as I understand it.
If you can let go of your conditioning, then it's a much more realistic way of watching things.
GTX 1070 SLI, i7-6700K @ ~4.4GHz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D Vision kit, Xpand X104 glasses, Corsair HX1000i, Win 10 Pro 64 / Win 7 64
https://www.3dmark.com/fs/9529310
I've always said that interpolation tickles my brain. I initially hated it, especially with early displays/firmware that introduced lots of artifacts.
Now I like it, because it puts a different spin on the content I'm viewing and seems to keep me more attuned.
I use SVP with my projector with great results.
The first DVD I used it on was King Kong with Jack Black. It was great with all of the fast-motion dinosaur chase scenes :P
http://www.svp-team.com/