NVIDIA DLSS: Your Questions, Answered
https://www.nvidia.com/en-us/geforce/news/nvidia-dlss-your-questions-answered/


By Andrew Edelsten on February 15, 2019 | Featured Stories DLSS

Hi, I’m Andrew Edelsten, Technical Director of Deep Learning at NVIDIA. I’ve been working here since 2010, and for the last couple of years my team has been working with the folks at NVIDIA Research to create DLSS.

This week, we’re excited to launch DLSS for Battlefield™ V and Metro Exodus, following launches in Final Fantasy XV: Windows Edition and 3DMark Port Royal. There have been a lot of questions, and I wanted to get some answers out to you on the most popular ones.

Q: What is DLSS?

A: Deep Learning Super Sampling (DLSS) is an NVIDIA RTX technology that uses the power of AI to boost your frame rates in games with graphically-intensive workloads. With DLSS, gamers can use higher resolutions and settings while still maintaining solid framerates.

Q: How does DLSS work?

A: The DLSS team first extracts many aliased frames from the target game, and then for each one we generate a matching “perfect frame” using either super-sampling or accumulation rendering. These paired frames are fed to NVIDIA’s supercomputer. The supercomputer trains the DLSS model to recognize aliased inputs and generate high quality anti-aliased images that match the “perfect frame” as closely as possible. We then repeat the process, but this time we train the model to generate additional pixels rather than applying AA. This has the effect of increasing the resolution of the input. Combining both techniques enables the GPU to render the full monitor resolution at higher frame rates.
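The pairing described above (point-sampled aliased frames matched with super-sampled "perfect" targets) can be illustrated with a toy renderer. This is a conceptual sketch, not NVIDIA's actual pipeline; the `scene` function and sample counts are invented for illustration:

```python
import numpy as np

def scene(x, y):
    """Toy 'renderer': a high-frequency pattern that aliases when point-sampled."""
    return 0.5 + 0.5 * np.sin(40.0 * x) * np.cos(40.0 * y)

def aliased_frame(w, h):
    # One point sample per pixel centre -- produces the jagged input frame.
    ys, xs = np.mgrid[0:h, 0:w]
    return scene((xs + 0.5) / w, (ys + 0.5) / h)

def perfect_frame(w, h, ss=8):
    # ss*ss samples per pixel, averaged -- the super-sampled training target.
    big = aliased_frame(w * ss, h * ss)
    return big.reshape(h, ss, w, ss).mean(axis=(1, 3))
```

A training set would consist of many such `(aliased_frame, perfect_frame)` pairs; the network learns to map the former to the latter.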

Q: Where does DLSS provide the biggest benefit? And why isn’t it available for all resolutions?

A: The results of DLSS vary a bit, because each game has different characteristics based on the game engine, complexity of content, and the time spent on training. Our supercomputer never sleeps, and we continue to train and improve our deep learning neural network even after a game’s launch. When we have improvements to performance or image quality ready, we provide them to you via NVIDIA software updates.

DLSS is designed to boost frame rates at high GPU workloads (i.e. when your framerate is low and your GPU is working to its full capacity without bottlenecks or other limitations). If your game is already running at high frame rates, your GPU’s frame rendering time may be shorter than the DLSS execution time. In this case, DLSS is not available because it would not improve your framerate. However, if your game is heavily utilizing the GPU (e.g. FPS is below ~60), DLSS provides an optimal performance boost. You can crank up your settings to maximize your gains. (Note: 60 FPS is an approximation -- the exact number varies by game and what graphics settings are enabled)

To put it a bit more technically, DLSS requires a fixed amount of GPU time per frame to run the deep neural network. Thus, games that run at lower frame rates (where that fixed cost is a smaller share of each frame) or at higher resolutions (where the pixel-shading savings are greater) benefit more from DLSS. For games running at high frame rates or low resolutions, DLSS may not boost performance. When your GPU’s frame rendering time is shorter than the time it takes to execute the DLSS model, we don’t enable DLSS; we only enable it in cases where you will receive a performance gain. DLSS availability is therefore game-specific, and depends on your GPU and selected display resolution.
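This trade-off can be sketched with a simple frame-time model. All numbers below are illustrative assumptions (shading fraction, render scale, per-frame DLSS cost), not NVIDIA's published figures:

```python
def frame_time_with_dlss(base_ms, shading_fraction, render_scale, dlss_cost_ms):
    """Estimate frame time when DLSS shades fewer pixels, then pays a fixed model cost.

    base_ms          -- native frame time in milliseconds
    shading_fraction -- share of the frame spent on per-pixel shading (assumed)
    render_scale     -- linear render-resolution scale (0.71 ~ half the pixels)
    dlss_cost_ms     -- fixed per-frame cost of running the DLSS network (assumed)
    """
    shading = base_ms * shading_fraction
    other = base_ms - shading
    # Shading cost scales with pixel count (render_scale squared); the DLSS cost is fixed.
    return other + shading * render_scale ** 2 + dlss_cost_ms

# GPU-bound at 25 FPS (40 ms/frame): DLSS wins comfortably.
slow = frame_time_with_dlss(40.0, 0.8, 0.71, 3.0)   # ~27.1 ms
# Already at 145 FPS (6.9 ms/frame): the fixed cost eats the savings.
fast = frame_time_with_dlss(6.9, 0.8, 0.71, 3.0)    # ~7.2 ms, slower than native
```

This is why the driver only exposes DLSS when the estimated gain is positive: at low frame rates the fixed network cost is amortised over a long frame, at high frame rates it dominates.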

Q: Some users mentioned blurry frames. Can you explain?

A: DLSS is a new technology and we are working hard to perfect it.

We built DLSS to leverage the Turing architecture’s Tensor Cores and to provide the largest benefit when GPU load is high. To this end, we concentrated on high resolutions during development (where GPU load is highest), with 4K (3840x2160) being the most common training target. Running at 4K is beneficial when it comes to image quality, as the number of input pixels is high. Typically for 4K DLSS, we have around 3.5-5.5 million pixels from which to generate the final frame, while at 1920x1080 we only have around 1.0-1.5 million pixels. The less source data there is, the greater the challenge for DLSS to detect features in the input frame and predict the final frame.
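For reference, the pixel counts work out as follows. The internal render resolutions shown are illustrative guesses that fall in the stated ranges; the exact per-game input sizes aren't published:

```python
def megapixels(width, height):
    """Pixel count of a resolution, in millions."""
    return width * height / 1e6

print(megapixels(3840, 2160))  # 4K output: 8.2944 MP
print(megapixels(2560, 1440))  # plausible 4K-DLSS input: 3.6864 MP (within 3.5-5.5M)
print(megapixels(1920, 1080))  # 1080p output: 2.0736 MP
print(megapixels(1440, 810))   # plausible 1080p-DLSS input: 1.1664 MP (within 1.0-1.5M)
```

The roughly 3x gap in input data between the 4K and 1080p cases is the source of the quality difference discussed above.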

We have seen the screenshots and are listening to the community’s feedback about DLSS at lower resolutions, and are focusing on it as a top priority. We are adding more training data and some new techniques to improve quality, and will continue to train the deep neural network so that it improves over time.

Q: Why don’t I just use upscaled TAA instead?

A: Depending on the resolution, quality settings, and game implementation, some may prefer TAA in one game and DLSS in another.

The game industry has used TAA for many years and we know that it can fall down in certain ways. TAA is generated from multiple frames and can suffer from high-motion ghosting and flickering that DLSS tends to handle better.

Q: When’s the next DLSS update for Battlefield V and Metro Exodus?

A: We are constantly working to improve image quality. Recently we updated the core of DLSS so that you get the latest model updates the moment you launch your game. So make sure you have our latest Game Ready Driver (418.91 or higher) installed.

For Battlefield V, we think DLSS delivers a big improvement in 4K and 2560x1440 performance -- up to 40% -- at comparable quality, but we also hear the community. For the next push, we are focusing our testing and training on improving image quality at 1920x1080 and on ultrawide monitors (e.g. 3440x1440). The current experience at these resolutions is not where we want it to be.

For Metro Exodus, we’ve got an update coming that didn’t make it in for day of launch; it improves DLSS sharpness and overall image quality across all resolutions. We’re also training DLSS on a larger cross-section of the game, and once these updates are ready you will see another increase in quality. Lastly, we are looking into a few other reported issues, such as with HDR, and will update you as soon as we have fixes.

#1
Posted 02/16/2019 08:22 PM   
Interesting reading!

The very powerful and the very stupid have one thing in common. Instead of altering their views to fit the facts, they alter the facts to fit their views ... which can be very uncomfortable if you happen to be one of the facts that needs altering.

-- Doctor Who, "Face of Evil"

#2
Posted 02/16/2019 08:50 PM   
Zloth said:The game industry has used TAA for many years


Really? It doesn't feel that long. Maybe it has been around 4-5 years already. I never liked it.
About DLSS, I've seen it compared to TAA, but I need to see more comparisons to a non antialiased image, at both native resolution and equivalent lower resolution.

CPU: Intel Core i7 7700K @ 4.9GHz
Motherboard: Gigabyte Aorus GA-Z270X-Gaming 5
RAM: GSKILL Ripjaws Z 16GB 3866MHz CL18
GPU: MSI GeForce RTX 2080Ti Gaming X Trio
Monitor: Asus PG278QR
Speakers: Logitech Z506
Donations account: masterotakusuko@gmail.com

#3
Posted 02/16/2019 09:40 PM   
DLSS: GIMMICK! Why?

For simple reasons: Look at BFV and Metro Exodus. UNLESS I MATCH their exact resolution it can't be enabled!
WTF is this? SERIOUSLY? IT'S MEANT TO BE a better performance ANTI-ALIASING! (Which should be supported by any resolutions including if I run 641x481 - Yes, the +1 there is not a TYPO), like any AA works.

I gave it a go in both BF5 and Metro Exodus in (2D, DX12 - yuck, I know) and was really NOT IMPRESSED by this Blur FILTER that is supposed to be AA.

RTX I can respect, but this?!?! This is just crap. I expected DLSS to be more like DSR ( a generic approach, but it seems is even more limited than anything before).

I am really disappointed with the RTX 2000 series (in terms of these new "innovations"). For the record I didn't buy an RTX 2080Ti for the RTX and Tensor Cores, but for it's RAW power (CUDA cores).

THANK GOD, they didn't say RTX ON only if you have this EXACT FREAKING RESOLUTION, like this utterly useless DLSS!
(Or you can use all your CUDA cores if X == Y)

HEY, NVIDIA! STOP dreaming about utter-AI-garbage in Video Games graphics and GET US 3D VISION 3 !!!
(With 3D Vision you actually DID SOMETHING NEW and very visible! With RTX is semi-visible, with DLSS.... I prefer TAA, thank you, or better yet - NO AA if I run at 4k! I thought DLSS was meant to be used for LOWER resolutions - you know where you NEED AA, not at 4k, but what-so-ever! )

1x Palit RTX 2080Ti Pro Gaming OC(watercooled and overclocked to hell)
3x 3D Vision Ready Asus VG278HE monitors (5760x1080).
Intel i9 9900K (overclocked to 5.3 and watercooled ofc).
Asus Maximus XI Hero Mobo.
16 GB Team Group T-Force Dark Pro DDR4 @ 3600.
Lots of Disks:
- Raid 0 - 256GB Sandisk Extreme SSD.
- Raid 0 - WD Black - 2TB.
- SanDisk SSD PLUS 480 GB.
- Intel 760p 256GB M.2 PCIe NVMe SSD.
Creative Sound Blaster Z.
Windows 10 x64 Pro.
etc


My website with my fixes and OpenGL to 3D Vision wrapper:
http://3dsurroundgaming.com

(If you like some of the stuff that I've done and want to donate something, you can do it with PayPal at tavyhome@gmail.com)

#4
Posted 02/16/2019 10:41 PM   
Helifax said:DLSS: GIMMICK! Why?

For simple reasons: Look at BFV and Metro Exodus. UNLESS I MATCH their exact resolution it can't be enabled!
WTF is this? SERIOUSLY? IT'S MEANT TO BE a better performance ANTI-ALIASING! (Which should be supported by any resolutions including if I run 641x481 - Yes, the +1 there is not a TYPO), like any AA works.

I gave it a go in both BF5 and Metro Exodus in (2D, DX12 - yuck, I know) and was really NOT IMPRESSED by this Blur FILTER that is supposed to be AA.

RTX I can respect, but this?!?! This is just crap. I expected DLSS to be more like DSR ( a generic approach, but it seems is even more limited than anything before).

I am really disappointed with the RTX 2000 series (in terms of these new "innovations"). For the record I didn't buy an RTX 2080Ti for the RTX and Tensor Cores, but for it's RAW power (CUDA cores).

THANK GOD, they didn't say RTX ON only if you have this EXACT FREAKING RESOLUTION, like this utterly useless DLSS!

HEY, NVIDIA! STOP dreaming about utter-AI-garbage in Video Games graphics and GET US 3D VISION 3 !!!
(With 3D Vision you actually DID SOMETHING NEW and very visible! With RTX is semi-visible, with DLSS.... I prefer TAA, thank you, or better yet - NO AA if I run at 4k! I thought DLSS was meant to be used for LOWER resolutions - you know where you NEED AA, not at 4k, but what-so-ever! )


+1 I hope your words are heard by Nvidia! I also hoped DLSS would be like DSR but not so demanding :(.

3D Vision is currently the only thing which takes games to another level of experience! Raytracing, however, is indeed the only way to get photorealistic results in games. So generally it's a good and innovative step Nvidia took here. But it's still too early for doing this in real time. AFAIK BF5 limits raytracing to reflections; GI / shadows / refractions are not computed this way, and it still costs a lot of FPS. Nvidia should have waited another 3-4 years until graphics cards are strong and cheap enough for this. I really hope realtime raytracing will not share the destiny of PhysX.

ASUS ROG Strix GeForce GTX 1080 | Core I7-7700K | 16GB RAM | Win10 Pro x64
Asus ROG Swift PG278Q 3D Vision Monitor
Optoma UHD 40 3D Vision Projector
Paypal donations for 3D Fix Manager: duselpaul86@gmx.de

#5
Posted 02/16/2019 10:57 PM   
Pauldusler said:I really hope realtime raytracing will not have the same destiny as Physx.


Based on my current feeling and by what I've seen (FPS aside) it is awesome when it delivers!
But, there are also portions where nothing visible is ray-traced, yet I still get the same FPS loss. I really hope, it will not go down as Physx, but I can't help but wonder...


#6
Posted 02/16/2019 11:01 PM   
Pauldusler said:Nvidia should have waited another 3-4 years until graphics cards are strong and cheap enough for this.


About RTX being used early:

Pros:
- You can go back to these games in a few years, get good performance and make use of RTX.
- Several years of R+D for developers and Nvidia so when the time comes for better performance with new hardware things will be better than if they start using RTX at that time.

Cons:
- "[s]PS3[/s] RTX has no games".
- Too expensive hardware because of a feature that is barely used for now.


#7
Posted 02/16/2019 11:19 PM   
I thought DLSS used frames rendered at a lower resolution than the display's native resolution and then more or less up-scaled them, thus giving better performance than rendering the frames at native res. While the image isn't rendered at "native", the quality is supposed to be close, with a trade-off of better performance.

DSR uses rendering that is higher than native res and then downscales it.

Note: The above statements are likely entirely wrong.
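The poster's summary does match the public descriptions: DLSS shades below display resolution and reconstructs upward, while DSR shades above display resolution and filters downward. A minimal sketch of just that relationship (the function name and scale factors are invented for illustration):

```python
def internal_render_size(display_w, display_h, technique, scale):
    """Return the resolution the GPU actually shades, per technique.

    DLSS: scale < 1 (render low, AI-reconstruct up to display resolution).
    DSR:  scale > 1 (render high, filter down to display resolution).
    """
    if technique == "DLSS" and not scale < 1.0:
        raise ValueError("DLSS renders below display resolution")
    if technique == "DSR" and not scale > 1.0:
        raise ValueError("DSR renders above display resolution")
    return round(display_w * scale), round(display_h * scale)

print(internal_render_size(3840, 2160, "DLSS", 0.667))  # (2561, 1441)
print(internal_render_size(1920, 1080, "DSR", 2.0))     # (3840, 2160)
```

Both techniques present the display's native resolution to the user; only the shaded pixel count differs.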





________________________________________________________________________________________________




Anyhow, it seems that DLSS should be available at any resolution. The head engineer specifically says that it is not locked to certain multiples of the display size:

Andrew Edelsten ...Director of Developer Technologies at NVIDIA said:DLSS provides a new way of achieving higher display resolutions without the cost of rendering every pixel in the traditional sense. DLSS is also flexible enough to allow developers to choose the level of performance and resolution scaling they wish rather than being locked to certain multiples of the physical monitor or display size.


Quoted from https://news.developer.nvidia.com/dlss-what-does-it-mean-for-game-developers/


BTW, this is part of Nvidia's NGX SDK

#8
Posted 02/16/2019 11:21 PM   
D-Man11 said:I thought DLSS used frames rendered at a lower resolution than the display's native resolution and then more or less up-scaled it. Thus giving better performance than rendering the frames at native res. While the image isn't rendered at "native", the quality is supposed to be close with a trade off of better performance.

DSR uses rendering that is higher than native res and then downscales it.

Note: The above statements are likely entirely wrong.

Well... The few games that support it have it locked to some "resolution", like it is EXCLUSIVE to that resolution. At least for the time being.



________________________________________________________________________________________________




Anyhow, it seems that DLSS should be available at any resolution. The head engineer specifically says that it is not locked to certain multiples of the display size:

Andrew Edelsten ...Director of Developer Technologies at NVIDIA said:DLSS provides a new way of achieving higher display resolutions without the cost of rendering every pixel in the traditional sense. DLSS is also flexible enough to allow developers to choose the level of performance and resolution scaling they wish rather than being locked to certain multiples of the physical monitor or display size.


Quoted from https://news.developer.nvidia.com/dlss-what-does-it-mean-for-game-developers/



BTW, this is part of Nvidia's NGX SDK


#9
Posted 02/17/2019 12:54 AM   
masterotaku said:
Pros:
- You can go back to these games in a few years, get good performance and make use of RTX.


That will only work if the technology still exists then ^^. If it's not open source and AMD isn't able to support raytracing, game developers also won't be willing to support it in their games. Of course there will always be some rare developers like DICE who are willing to push graphics to the next level, but mainstream developers will not be interested in it by then.

Currently the price for the RTX 2080 / RTX 2080 Ti is far beyond good and evil. The GTX 980 cost 400-500€. The GTX 1080 was 700€. The RTX 2080 is 900€ ... the GTX 2180 will be 1100€? So in 5-10 years an Nvidia graphics card will cost 2000€? I doubt that everyone's wage will have been multiplied by 5 in that time ^^. Maybe that's also the reason why the Nvidia stock broke down (crypto mining aside). People cannot pay these insane prices any more.

ASUS ROG Strix GeForce GTX 1080 | Core I7-7700K | 16GB RAM | Win10 Pro x64
Asus ROG Swift PG278Q 3D Vision Monitor
Optoma UHD 40 3D Vision Projector
Paypal donations for 3D Fix Manager: duselpaul86@gmx.de

#10
Posted 02/17/2019 02:31 AM   
Helifax said: Well.. The few games that support it, have it locked at some "resolution", like it is EXCLUSIVE for that resolution. At least for the time being.


Well, he said that it's the developers' choice, so at least it is not locked driver-side. Perhaps it could be hacked?

"DLSS is also flexible enough to allow developers to choose the level of performance and resolution scaling they wish rather than being locked to certain multiples of the physical monitor or display size."

#11
Posted 02/19/2019 02:39 AM   
Some recent videos posted by Nvidia:

Battlefield V: Now With NVIDIA DLSS – Up to 40% Performance Boost!
https://www.youtube.com/watch?v=nshbUzdBlq8

NVIDIA DLSS boosts Port Royal Benchmark performance by up to 50%
https://www.youtube.com/watch?v=tLyIu3lMdAc

#12
Posted 03/02/2019 11:01 PM   
DLSS only works with specific cards and at high resolutions...

For example, DLSS at 1440p with a 2070 will not work in Battlefield 5.

That's sad...
Andrew Edelsten ...Director of Developer Technologies at NVIDIA said: DLSS provides a new way of achieving higher display resolutions without the cost of rendering every pixel in the traditional sense. DLSS is also flexible enough to allow developers to choose the level of performance and resolution scaling they wish rather than being locked to certain multiples of the physical monitor or display size.


Well this statement makes me think that it's up to developers as to what limitations are imposed.

But checking real quick, like Dugom said, certain GPUs have certain restrictions that might actually be imposed by Nvidia.


https://www.techpowerup.com/reviews/Performance_Analysis/Battlefield_V_DLSS/


The following quote is from an article written by W1zzard at the link above:


"Just like in Metro Exodus, NVIDIA has chosen to limit the availability of DLSS, but in a slightly different way. Enabling DLSS in Battlefield V requires RTX raytracing to be enabled, which in turn requires DirectX 12 enabled. Additional limitations come in form of supported DLSS resolutions. DLSS at 1080p is only available on RTX 2060 and RTX 2070 (not RTX 2080 and RTX 2080 Ti). DLSS at 1440p is available on all RTX cards except for RTX 2080 Ti, and at 4K, all cards can run DLSS. This is somewhat surprising as we can think of many scenarios where people would want to have higher frame rates at lower resolutions to, for example, drive 144 Hz monitors."
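As a quick illustration, the per-resolution restrictions W1zzard describes can be written out as a small lookup table. This is just a sketch of the rules as quoted above for Battlefield V, not an official NVIDIA API; the names (`DLSS_SUPPORT`, `dlss_available`) are made up for the example.

```python
# Battlefield V DLSS availability as described by TechPowerUp:
# 1080p: RTX 2060 and RTX 2070 only; 1440p: all except RTX 2080 Ti; 4K: all cards.
DLSS_SUPPORT = {
    "RTX 2060":    {"1080p", "1440p", "4K"},
    "RTX 2070":    {"1080p", "1440p", "4K"},
    "RTX 2080":    {"1440p", "4K"},
    "RTX 2080 Ti": {"4K"},
}

def dlss_available(gpu: str, resolution: str) -> bool:
    """Return True if the DLSS option is exposed for this GPU/resolution pair."""
    return resolution in DLSS_SUPPORT.get(gpu, set())
```

Laid out this way, the oddity W1zzard points out is easy to see: the fastest card has the fewest supported resolutions, even though 1080p/144 Hz is exactly where many 2080 Ti owners would want the extra frames.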

#14
Posted 03/03/2019 12:02 AM   
Resolution Scaling at the moment is just better: it works with 3D Vision, doesn't require Tensor Cores, is useful when trying to achieve high frame rates, is way easier to implement... and looks the same or better.

I'm ishiki, forum screwed up my name.

9900K @5.0 GHZ, 16GBDDR4@4233MHZ, 2080 Ti

#15
Posted 03/03/2019 12:39 AM   