Hey guys.
As some of you may know, there's a tool out and about called "GeDoSaTo", or "Generic DownSampling Tool" if you will.
Now, as the name implies, this is a tool that aims to be generic, that will potentially work with any DirectX game, and will supply much-needed features that should have been there from the start.
Think of the Dark Souls series on PC, and how it was fixed with mods.
With this tool, you wouldn't have to go hunting for said mods.
Recently the source code was made open source on GitHub ( https://github.com/PeterTh/gedosato ).
So my question is this: would it be possible to implement some generic 3D Vision fixes into this?
I've got to be honest, I know very little C++, and know nothing about DirectX.
I've only dabbled in OpenGL.
Tried it today with a few titles. Haven't gamed in a while but from what I see, everything just works. I'm quite surprised.
Right now, it's only DX9, but with some more time the dev is looking into DX11 too.
Impressive. It's a shame nVidia's down-sampling methods can't work half as well.
Durante is pulling off some incredible stuff with it.
[url=http://blog.metaclassofnil.com/?tag=gedosato]For anyone interested, here's a download link[/url].
It allows for
[list]
[.]Downsampling WHILE playing in 3D (so 3D and incredible image quality)[/.]
[.]Downsampling and still playing at 120fps[/.]
[.]Texture replacement[/.]
[.]Universal forced anti-aliasing (SMAA or FXAA, with options for low, medium, high, ultra)[/.]
[.]Integrated SweetFX[/.]
[.]Forced V-sync controls[/.]
[.]Universal Borderless Fullscreen[/.]
[.]Bicubic scaling for cleaner IQ[/.]
[.]Specially made configurations for games, to pull off effects similar to ENB, but open source so users can create them[/.]
[/list]
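For anyone curious what downsampling actually does under the hood, here's a tiny illustrative sketch in Python. It uses a naive box filter for simplicity; GeDoSaTo itself uses a bicubic filter (a weighted 4x4 kernel), but the principle is the same: many rendered pixels get averaged into one display pixel, which is what kills the aliasing.

```python
def downsample_box(pixels, src_w, src_h, factor):
    """Naive box-filter downsample of a flat grayscale buffer.

    Averages each factor x factor block of source pixels into one
    output pixel. GeDoSaTo uses a bicubic kernel instead, which
    weights a 4x4 neighbourhood for sharper results, but the idea
    is identical: sub-pixel detail gets blended into smooth edges.
    """
    dst_w, dst_h = src_w // factor, src_h // factor
    out = []
    for y in range(dst_h):
        for x in range(dst_w):
            total = 0
            for dy in range(factor):
                for dx in range(factor):
                    total += pixels[(y * factor + dy) * src_w + (x * factor + dx)]
            out.append(total // (factor * factor))
    return out, dst_w, dst_h

# A 4x4 black/white checkerboard (worst-case "aliasing") averages
# out to uniform gray when downsampled 2x:
checker = [(x + y) % 2 * 255 for y in range(4) for x in range(4)]
print(downsample_box(checker, 4, 4, 2))  # ([127, 127, 127, 127], 2, 2)
```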
The difference it makes in IQ is fantastic. Especially if you downsample with something like FXAA or SMAA layered on top, there's barely any aliasing to be seen at all. The image looks so clean.
Here's a comparison. [b][u]View them full-sized.[/u][/b] I cropped them to make it easier to compare.
Here it is rendered at normal 1080p
[url=http://fc08.deviantart.net/fs71/f/2014/165/8/5/legomarvel_2014_06_14_20_20_42_87_by_aloo81-d7mekn5.jpg][img]http://fc08.deviantart.net/fs71/f/2014/165/8/5/legomarvel_2014_06_14_20_20_42_87_by_aloo81-d7mekn5.jpg[/img][/url]
Here it is downsampled with FXAA in GeDoSaTo
[url=http://fc05.deviantart.net/fs71/f/2014/165/e/f/screenshot_2014_06_14_20_13_00_by_aloo81-d7mekmp.jpg][img]http://fc05.deviantart.net/fs71/f/2014/165/e/f/screenshot_2014_06_14_20_13_00_by_aloo81-d7mekmp.jpg[/img][/url]
Aside from those differences, max settings elsewhere.
View them full size, and look at the buildings, or Venom's teeth. The aliasing is really obvious in the one rendered at regular 1080p, but in the downsampled one you can't see an ounce of aliasing anywhere. GeDoSaTo also implements a bloom that looks really nice on the screens in the background (and you can even disable the bloom if it's not something you like).
Looking at Red Hulk's arm, you can see some really blatant aliasing throughout the top image, but in the GeDoSaTo one it's completely gone.
And provided you have a powerful enough GPU, you could play with that GeDoSaTo image quality in full 3D, or at 120fps.
It's a fantastic tool and absolutely everyone should give it a shot.
[quote="Pirateguybrush"]Wow, that looks fantastic. What's the impact on performance?[/quote]
It's REALLY, really customizable, so the performance impact varies a lot depending on what you enable.
For the example provided, it's a change from about 90-100fps to 60-70fps, but I basically flipped it to "all the bells and whistles + rendering at 4K" mode.
Using the post-processing (bloom, vibrance, stuff like that) can cause a reasonable hit, but it doesn't change the visuals TOO much, so if you disable it you'll still get the really great downsampling with bicubic scaling.
It currently works with DX9 games, and Durante is working on implementing DX11 support.
I have a moderately old GPU (GTX 570) so it's great for me to use on older and less demanding games to make them look great. I plan on testing it with Dead Space to see if I can get that game looking gorgeous downsampled in 3D. Anyone with a more powerful GPU can get some really beautiful visuals.
If you've got a more modern GPU, you can get visuals like this at around 60fps (I believe Durante has a 680, or 780, and these are his screenshots)
[img]http://abload.de/img/screenshot_2014-05-103ef9m.jpg[/img]
[img]http://abload.de/img/screenshot_2014-05-10nfiqq.jpg[/img]
[img]http://abload.de/img/screenshot_2014-05-10hwint.jpg[/img]
Excellent Live game compatibility list. I have added some of my own trials to it over the weekend.
[url]https://docs.google.com/spreadsheet/ccc?key=0AjiEnZ1RzqDMdGdmalZoX25nVUtOT2FOLUw3S0Fzenc&usp=sharing#gid=0[/url]
Keep in mind that the list absolutely isn't comprehensive, and as he's been updating it, compatibility has been getting better.
So if it's listed as working, it probably works perfectly, and if it's listed as not working, it may have been fixed recently, so you should still try it out.
Some quick things to note: if it's absolutely not working at all, disabling Force Borderless Fullscreen will often fix it, and if the mouse isn't working correctly in-game, there are options at the bottom of the INI that you should play around with to see if any fix the mouse issues.
It definitely works on laptops, and compatibility is definitely hit or miss. For starters, it currently only works with DX9 games (DX11 compatibility incoming), and it doesn't work with all DX9 games. It does, however, work with a LOT of games, and for those it does work with, the effect is damned great.
For free software that does something so amazing - something that even the almighty nVidia can't do after years of trying - it's a damn near miracle it works at all, let alone so well.
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
I'll have to give this a try with my new PJ (H6510BD).
With my old one (PLED-W500), I simply used Nvidia's custom resolution to input a "pseudo" 3840x2400@120 by switching to manual so that the "active pixels" would not change. It required a pixel clock patch that broke HDCP. I could get "pseudo" 2560x1600@120 or 1920x1080@120 without the patch.
It really improved the picture quality, but I think it had a lot to do with the algorithm that the PJ used because of the diamond pixel 0.045 DMD chipset. Some games wouldn't recognize the resolution, but it worked great in the ones that did. It was also great for video playback. Using this method on the new PJ does not provide acceptable results. Perhaps it might work on a different model or brand?
When using my Radeon GPU I used the Downsampling with AMD: Guide and Demonstration from Guru3D http://forums.guru3d.com/showthread.php?t=366244
FYI: The new Acer H6520BD is releasing. Also for those looking to spend a little more, Optoma is releasing HD50 and the HD36 with PureMotion plus new Ultra Detail II.
http://www.avsforum.com/forum/68-digital-projectors-under-3-000-usd-msrp/1567386-acer-h6520bd-dlp-full-hd-dc3-144hz-3d.html
[quote="D-Man11"]I'll have to give this a try with my new PJ (H6510BD).
With my old one (PLED-W500), I simply used Nvidia's custom resolution to input a "pseudo" 3840x2400@120 by switching to manual so that the "active pixels" would not change. It required a pixel clock patch that broke HDCP. I could get "pseudo" 2560x1600@120 or 1920x1080@120 without the patch.
It really improved the picture quality, but I think it had a lot to do with algorithm that the PJ used because of the diamond pixel 0.045 DMD chipset. Some games wouldn't recognize the resolution, but it worked great in the ones that did. It was also great for video playback. Using this method on the new PJ does not provide acceptable results. Perhaps it might work on a different model or brand?
When using my Radeon GPU I used the Downsampling with AMD: Guide and Demonstration from Guru3D http://forums.guru3d.com/showthread.php?t=366244
FYI: The new Acer H6520BD is releasing. Also for those looking to spend a little more, Optoma is releasing HD50 and the HD36 with PureMotion plus new Ultra Detail II.
http://www.avsforum.com/forum/68-digital-projectors-under-3-000-usd-msrp/1567386-acer-h6520bd-dlp-full-hd-dc3-144hz-3d.html
[/quote]
Thanks for the insight D-Man11.
Acer 144Hz 3D sounds interesting. I wonder if it will handle 144Hz input i.e. 72FPS 3D Vision gaming. Hopefully we can hack the driver for 3D Vision.
[quote="innuendo1231b"]Hello to all the devs! I think this is the place to ask my question...
How is the resolution override function supposed to work? I tried it in a few games but it didn't do anything.
Is it supposed to scale the game's rendering resolution to a different output resolution specified in the INI? Because that is what I'd need...
I am looking to render the game at 1920x1080, but my output resolution should be 3840x2160. GeDoSaTo can do this, but only for DX9. When I saw that 3Dmigoto's INI has this resolution override thing, I was hoping it could do it for DX11... If it worked, it would be a big help to 4K TV users.
[/quote]
You might try this...if I understand your post correctly
You can do that manually via Nvidia's control panel.
Go to the "Change Resolution" tab of the NVCP
Select "Customize" then select "Create Custom Resolution"
Now set "Horizontal pixels" to 1920 and "Vertical lines" to 1080
Where it says "Automatic", select "GTF" first, then select "CVT reduced blank"
(toggling between the two makes sure the values change; otherwise, they sometimes do not)
Where it now says "CVT reduced blank", you'll want to change it to "Manual" before doing anything else.
Now change "Horizontal pixels" from 1920 to 3840 and "Vertical lines" from 1080 to 2160.
Click "Test", then click "Yes"
If it says "Duplicate resolution" for some reason, start over and set the refresh rate to 59 or 61 Hz.
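For anyone wondering why the GTF vs. CVT-reduced-blank toggle matters: the pixel clock the driver has to generate is roughly total pixels per frame (active + blanking) times the refresh rate, and reduced blanking is what keeps a mode like this within the link's bandwidth. A quick back-of-the-envelope sketch in Python (the blanking figures here are illustrative approximations, not exact GTF/CVT-RB timings):

```python
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank, v_blank):
    """Pixel clock = total pixels per frame x frames per second.

    h_blank/v_blank are the non-visible pixels per line and lines per
    frame that the timing standard adds around the active image.
    """
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

# 3840x2160 @ 60 Hz with generous GTF-style blanking vs. tight
# reduced-blanking figures (both approximate):
print(pixel_clock_mhz(3840, 2160, 60, h_blank=1280, v_blank=80))  # ~688 MHz
print(pixel_clock_mhz(3840, 2160, 60, h_blank=160, v_blank=62))   # ~533 MHz
```

That gap is why a mode that fails with automatic (GTF) timings can work fine once you switch to reduced blanking.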