DLAA / DLSS image quality blind test

Which of the screenshots has the best image quality?

  • Screenshot number 1: 11 votes (16.4%)
  • Screenshot number 2: 14 votes (20.9%)
  • Screenshot number 3: 13 votes (19.4%)
  • Screenshot number 4: 29 votes (43.3%)

  • Total voters: 67
  • Poll closed.
Guys, let's see if you can tell which screenshot has the best image quality. At the end of the poll I will reveal which is which, and we will see how many people picked DLAA correctly :). If DLAA wins, it means it's still worth playing at 100% resolution scale and worth investing in an RTX 5090 just to play at native resolution.

Very High settings, medium PT, no FG


Screenshot number 1 / DLSS Performance 75 fps


4.jpg


DLSSP.jpg


Screenshot number 2 / DLSS Balanced 62 fps


3.jpg


DLSSB.jpg



Screenshot number 3 / DLSS Quality 51 fps


2.jpg


DLSSQ.jpg



Screenshot number 4 / DLAA 28 fps


1.jpg


DLAA.jpg
 
Static photos are only one side of the equation. Where upscaling technologies really fall apart is in motion, especially fast motion. That's what you should be comparing.
 
Static photos are only one side of the equation. Where upscaling technologies really fall apart is in motion, especially fast motion. That's what you should be comparing.
I can see subtle differences during motion, especially around hair or grass. HOWEVER, the lower internal resolution increases the frame rate (by a factor of 2.7x in this particular game), and as you guys know, more FPS on a sample-and-hold display means not only a smoother image but also a sharper one. So DLAA / 100% resolution scale won't always look the best during motion. It will only look the best if you compare all DLSS modes at the same framerate, let's say 120 fps. Only then will DLAA look the best during motion.
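To put the sample-and-hold argument into rough numbers, here is a minimal sketch (the panning speed is an arbitrary example value; the fps figures are the ones from the screenshots above):

```python
# Rough sketch of eye-tracking smear on a sample-and-hold display.
# Each frame is held static while the eye keeps moving, so the perceived
# smear width is roughly (object speed in px/s) / fps.

def smear_px(speed_px_per_s: float, fps: float) -> float:
    """Approximate smear width in pixels for a tracked moving object."""
    return speed_px_per_s / fps

pan_speed = 1920.0  # example: a pan covering half a 4K screen width per second

for label, fps in [("DLAA", 28), ("DLSS Quality", 51),
                   ("DLSS Balanced", 62), ("DLSS Performance", 75)]:
    print(f"{label:>16}: {fps:>3} fps -> ~{smear_px(pan_speed, fps):.0f} px of smear")
```

Higher framerate shrinks the smear roughly in proportion, which is why the lower-resolution modes can still look sharper in motion on the same display.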
 
Hair is always the tell-tale for me. I voted for number two.

"If DLAA wins, it means it's still worth playing at 100% resolution and worth to invest in RTX5090 just to play at native resolutions."

At 28fps?
 
JPG format, still image with no motion...

uhm...

aside from the hair they look pretty much identical. but due to these being .jpg it's hard to say if some of the stuff I see are compression artifacts or DLSS artifacts.

the character is the only thing moving, which is why there is a slight difference there

also they all look kinda oversharpened to me, does the game have a sharpening filter setting?
 
Hair is always the tell-tale for me. I voted for number two.

"If DLAA wins, it means it's still worth playing at 100% resolution and worth to invest in RTX5090 just to play at native resolutions."

At 28fps?
Maybe DLAA + framegen?
 
Upscaling breaks apart in motion and you know that, man. It's like when people were taking FSR stills and claiming it was as good as DLSS... then you saw it in motion and saw all the problems with the moiré patterns, disocclusion problems, fizzle, particles disappearing, and fine details fading in and out of existence.

Wukong is actually a great example. I was playing with DLSS Performance 4K and was surprised at how good it was, but then I ran through dense foliage and yeah, it was easy to tell it wasn't native 4K+TAA.
 
Heh, it's number 4

QJJ5i0g.jpeg
eKO9uH1.jpeg


The artifacting on the rock's shadow (2nd screenshot) is present in all the other images except the 4th one (1st screenshot).

Same for the bamboo shadows.
 
I took a guess based on the hair, but issues with upscaling are pretty much all about motion.
 
Options 1-4

1. Pretty fucking stuttery

2. Pretty fucking stuttery

3. Pretty fucking stuttery

4. Pretty fucking stuttery

Conclusion: Who gives a shit. Maybe don't choose a game with microstutter issues as your test basis for visuals vs framerates.
 
Hilarious that this is even a question. It just shows how good DLSS is. Not using upscaling with an Nvidia card is just plain stupid.

Personally getting a 5080 upon release.
 
4th (aka 1.jpg) = DLAA (the game disables forced sharpening at 100% scale, separate from the menu setting). The rest could go either way, but this one I am sure about.

3rd (aka 2.jpg) = DLSS Performance.
1st (4.jpg) and 2nd (3.jpg) are Quality and Balanced; hard to tell which is which (there's only about a 7% resolution scale difference between them). The 1st one is probably Quality.
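For context, a quick back-of-the-envelope on the internal render resolutions at a 4K output, assuming the commonly published DLSS scale factors (a sketch; a game can override these ratios):

```python
# Internal render resolution per DLSS mode for a 3840x2160 output, using the
# commonly cited scale factors (assumed here; games can use custom ratios).

OUTPUT_W, OUTPUT_H = 3840, 2160
SCALES = {"DLAA": 1.0, "Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for mode, s in SCALES.items():
    w, h = round(OUTPUT_W * s), round(OUTPUT_H * s)
    share = (w * h) / (OUTPUT_W * OUTPUT_H)
    print(f"{mode:>11}: {w}x{h}  (~{share:.0%} of native pixel count)")
```

The linear scale gap between Quality and Balanced is small, but the gap in shaded pixels is what actually drives the framerate differences.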
 
Says you I guess. Have you played it with framegen?

I have used frame gen but not in Black Myth Wukong.

But I understand the technology and know there has to be significant added latency, especially at such a low base frame rate.

Mathematically it must be there, so can you explain to me how there isn't? What are you basing this on?
 
Number 4 looks slightly better IMO, but as others have pointed out, a lot of the problems with upscaling come from motion. If you are standing still, even the basic FSR solution used in many base PS5/Series X games can look OK... until you start moving and the smearing begins.
 
Movement is where these temporal soup kitchens really shit the bed. Stills tell us nothing. In fact, they're probably responsible for the high opinion so many have of the temporal soup kitchens.

Rise up gamers. Demand a meat and potato dinner. Leave the soup kitchens behind.
 
I have used frame gen but not in Black Myth Wukong.

But I understand the technology and know there has to be significant added latency, especially at such a low base frame rate.

Mathematically it must be there, so can you explain to me how there isn't? What are you basing this on?

frame gen will clearly increase input lag, that's a simple truth.

however, not every game has the same amount of base input latency.
God of War 2018 has worse input lag than Call of Duty Warzone for example. so in this situation, adding frame gen latency to CoD on PS5, with a 60fps target, would probably still result in a game with less lag than God of War at a real 60fps in the PS5 patched version.

I can tell you for certain that Black Myth on PS5 with frame gen has lower input lag than Forspoken has in performance mode with 60 real frames per second.

that isn't saying much, but it's just an example of how input lag isn't that simple. for a non-competitive game, anything below 90ms is fine, 100ms is questionable, and anything above 120ms is varying degrees of bad, until around 200ms, where it imo becomes unplayable in many genres.
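To make the "base latency plus frame-gen overhead" point concrete, here is a toy sketch (the per-game numbers are placeholders, not measurements; the bands just paraphrase the thresholds above):

```python
# Toy illustration: total input lag = a game's base latency + whatever the
# frame-gen pipeline adds. All numbers below are placeholders, not measurements.

def verdict(total_ms: float) -> str:
    """Rough bands paraphrasing the post above."""
    if total_ms < 90:
        return "fine for a non-competitive game"
    if total_ms <= 120:
        return "questionable"
    if total_ms < 200:
        return "varying degrees of bad"
    return "arguably unplayable in many genres"

examples = {
    "low-base-latency game + frame gen": 55 + 30,
    "high-base-latency game, no frame gen": 130,
}

for name, total in examples.items():
    print(f"{name}: ~{total} ms -> {verdict(total)}")
```

The ordering can flip either way: a well-optimised game with frame gen can still end up under a sluggish game running "real" frames.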
 
Hard to tell the differences that easily in still shots. As mentioned, it is better judged in motion; DLSS Quality should beat native TAA in motion, though. Obviously, it cannot quite match DLAA.
 
Not a very good test. You need motion, and you need a system that can barely run the game at native resolution and settings.
 
frame gen will clearly increase input lag, that's a simple truth.

however, not every game has the same amount of base input latency.
God of War 2018 has worse input lag than Call of Duty Warzone for example. so in this situation, adding frame gen latency to CoD with a 60fps target would probably still result in a game with less lag than God of War at 60fps.

I can tell you for certain that Black Myth on PS5 with frame gen has lower input lag than Forspoken has in performance mode with 60 real frames per second.

that isn't saying much, but it's just an example of how input lag isn't that simple. for a non-competitive game, anything below 90ms is fine, 100ms is questionable, and anything above 120ms is varying degrees of bad, until around 200ms, where it imo becomes unplayable in many genres.
If Nvidia's claims hold true, and DF's testing of Cyberpunk is correct, then the added latency with regular FG using the new AI-driven model will be around ~8ms. Which is rather minor, all things considered.
 
Upscaling breaks apart in motion and you know that, man. It's like when people were taking FSR stills and claiming it was as good as DLSS... then you saw it in motion and saw all the problems with the moiré patterns, disocclusion problems, fizzle, particles disappearing, and fine details fading in and out of existence.

Wukong is actually a great example. I was playing with DLSS Performance 4K and was surprised at how good it was, but then I ran through dense foliage and yeah, it was easy to tell it wasn't native 4K+TAA.
DLSS 3.8.1 has minimal shimmering / artefacts even in Performance mode, while FSR 3.1 has very strong shimmering even in Quality mode. In games like RDR2, even DLSS Performance (with the updated 3.8.1 DLL) has fewer artefacts than native TAA during motion (and a way sharper image on top of that), so I really don't think people need to worry about artefacts when using DLSS Performance.

In older games, the difference between DLAA (or even DLSS Quality) and DLSS Performance was much more noticeable to me, because DLSSP looked soft even in a static image. However, the latest 3.8.1 DLSS has improved image quality so much that I started having trouble telling which is which. Of course, if I went looking for imperfections up close, I would find a little noise around hair and grass in DLSSP mode. I can't, however, imagine playing like that (literally with my head in front of the TV). When playing on a TV from a normal viewing distance, I doubt many people would notice DLSSP motion artefacts; they would, however, easily notice a 2.7x higher framerate. Soon, DLSS image reconstruction will be further improved (transformer model) and the gap between DLAA and DLSSP will shrink even more.

I started this thread because I realised that in this day and age, playing at native resolution is the biggest waste of hardware resources, and not only on consoles but also on PC. Even the RTX 5090 can't run the most demanding PT games at 4K native.
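As a rough sanity check on where a gain like 2.7x can come from, here is a minimal sketch comparing raw pixel counts (it ignores CPU cost, fixed-cost passes and the upscale itself, which is why the real gain is well below the pixel ratio):

```python
# DLSS Performance at a 4K output renders internally at roughly 1920x1080,
# i.e. a quarter of the pixels. Real framerate gains are smaller than 4x
# because plenty of frame time does not scale with resolution.

native_pixels = 3840 * 2160
performance_pixels = 1920 * 1080

print(f"pixel ratio: {native_pixels / performance_pixels:.1f}x fewer pixels shaded")
print(f"observed gain in the screenshots above: 75 / 28 = {75 / 28:.1f}x")
```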


Options 1-4

1. Pretty fucking stuttery

2. Pretty fucking stuttery

3. Pretty fucking stuttery

4. Pretty fucking stuttery

Conclusion: Who gives a shit. Maybe don't choose a game with microstutter issues as your test basis for visuals vs framerates.
I saw traversal stutters in BMW, but nothing that would ruin my experience. Frame delivery was smooth on my PC, and I couldn't complain about performance either.

Cinematic settings with DLSSP + FG. In this particular game, DLSS FG provides the lowest possible latency because it enables Nvidia Reflex. I measured 48ms with FG on and 60ms without it.

4-K-DLSSP-FG-Cinematic.jpg



Very high settings


4-K-DLSSP-FG-Very-HIGH-FULLRT.jpg



Optimized settings; medium PT drastically improves performance and still looks a lot better than Lumen.


4-K-DLSSP-FG-Very-High-Medium-PT.jpg



Optimized settings without FG


4-K-DLSSP-Very-High-Medium-PT.jpg


Stuttering bothered me in the SH2 remake (because I saw camera stutter literally all the time), but not in BMW.

But this thread isn't just about BMW; almost every modern game supports DLSS image reconstruction.
It is extremely difficult to tell these apart in static images.

I can tell you that when I'm playing Resident Evil 4, there is a clear difference between DLAA and DLSS.
RE4 does not support DLSS. It only has an extremely blurry FSR implementation that shimmers even when I am not moving the camera.
 
If Nvidia's claims hold true, and DF's testing of Cyberpunk is correct, then the added latency with regular FG using the new AI-driven model will be around ~8ms. Which is rather minor, all things considered.

that is indeed extremely low.
but on a system like the current consoles, they don't have the luxury of Nvidia Reflex, so in those cases optimisation on the devs' side will be very important.

but even there, some games have such low base latency that it wouldn't be an issue, while in others it would make it really bad (like in Forspoken, which has a shitload of lag just by default)
 
It's a toss-up between 2 and 4 for me. The second image resolves more detail and is a tiny bit sharper, but also has a bit more jaggies in the final image (blades of grass, bamboo leaves, and hair) and what looks like artifacts on shadows (rock face). The fourth image has a softer look overall (hair especially) and no shadow artifacts, which is what I would expect of an anti-aliasing solution, but loses out on some of the detail (blades of grass and bamboo leaves) and sharpness of the second picture. Had to break out imgsli for this since the screenshots are very close in quality, but here's how I would rank them: https://imgsli.com/MzM3NjEy
- 2.jpg - DLAA
- 4.jpg - DLSS Quality
- 3.jpg - DLSS Balanced
- 1.jpg - DLSS Performance
 
The OP clearly made a super deceiving and devious comparison where there is zero movement and the pics are relatively small. We don't look at a 4K screen from the same distances we used for HD or Full HD screens; we need to be closer to appreciate it.
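A rough way to put the viewing-distance point into numbers, assuming an example 65" panel (~1.43 m wide) and arbitrary couch distances; around 60 pixels per degree is the usual ballpark for 20/20 vision:

```python
import math

# Angular resolution (pixels per degree) of a 3840-pixel-wide panel at a
# given viewing distance. The screen width and distances are example values.

def pixels_per_degree(h_pixels: int, screen_width_m: float, distance_m: float) -> float:
    pixel_width = screen_width_m / h_pixels
    deg_per_pixel = math.degrees(2 * math.atan(pixel_width / (2 * distance_m)))
    return 1.0 / deg_per_pixel

for d in (1.0, 2.0, 3.0):
    print(f"{d:.0f} m: ~{pixels_per_degree(3840, 1.43, d):.0f} pixels per degree")
```

The closer you sit, the fewer pixels per degree you get, and the easier it is to spot upscaling differences.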
Even Nvidia themselves, in their own marketing video for DLSS/frame gen, used Wukong in motion / during fights to illustrate how it looks.

DF mentioned IQ issues in Wukong too:
 
DLSS 3.8.1 has minimal shimmering / artefacts even in Performance mode, while FSR 3.1 has very strong shimmering even in Quality mode. In games like RDR2, even DLSS Performance (with the updated 3.8.1 DLL) has fewer artefacts than native TAA during motion (and a way sharper image on top of that), so I really don't think people need to worry about artefacts when using DLSS Performance.

In older games, the difference between DLAA (or even DLSS Quality) and DLSS Performance was much more noticeable to me, because DLSSP looked soft even in a static image. However, the latest 3.8.1 DLSS has improved image quality so much that I started having trouble telling which is which. Of course, if I went looking for imperfections up close, I would find a little noise around hair and grass in DLSSP mode. I can't, however, imagine playing like that (literally with my head in front of the TV). When playing on a TV from a normal viewing distance, I doubt many people would notice DLSSP motion artefacts; they would, however, easily notice a 2.7x higher framerate. Soon, DLSS image reconstruction will be further improved (transformer model) and the gap between DLAA and DLSSP will shrink even more.

I started this thread because I realised that in this day and age, playing at native resolution is the biggest waste of hardware resources, and not only on consoles but also on PC. Even the RTX 5090 can't run the most demanding PT games at 4K native.



I saw traversal stutters, but nothing that would ruin my experience. Frame delivery was smooth on my PC, and I couldn't complain about performance either. Stuttering bothered me in the SH2 remake (because I saw camera stutter literally all the time), but not in BMW.

But this thread isn't just about BMW; almost every modern game supports DLSS image reconstruction.

RE4 does not support DLSS. It only has an extremely blurry FSR implementation that shimmers even when I am not moving the camera.
You can use a mod to inject DLSS or DLAA.
 
frame gen will clearly increase input lag, that's a simple truth.

however, not every game has the same amount of base input latency.
God of War 2018 has worse input lag than Call of Duty Warzone for example. so in this situation, adding frame gen latency to CoD on PS5, with a 60fps target, would probably still result in a game with less lag than God of War at a real 60fps in the PS5 patched version.

I can tell you for certain that Black Myth on PS5 with frame gen has lower input lag than Forspoken has in performance mode with 60 real frames per second.

that isn't saying much, but it's just an example of how input lag isn't that simple. for a non-competitive game, anything below 90ms is fine, 100ms is questionable, and anything above 120ms is varying degrees of bad, until around 200ms, where it imo becomes unplayable in many genres.

When you say 90ms… you're talking end to end?

Either way, 10ms is significant to me. I work really hard to cut my latency to be as low as possible. And once your body gets used to that near instantaneous response, anything else feels off.
 