Digital Foundry - Nvidia DLSS 4 Deep Dive: Ray Reconstruction Upgrades Show Night & Day Improvements

Gaiff

SBI’s Resident Gaslighter


[attached performance charts]
 
The upscaler video is the one I'm waiting for the most. He said "soon".
Yep! So far the only hard info we have on DLSS4 (transformer) super resolution from DF is the performance cost of using it on the different RTX generations:

RTX 5000: 4.0%
RTX 4000: 4.7%
RTX 3000: 6.5%
RTX 2000: 7.9%

Those are great results, pretty much in line with what one would expect considering the generational tensor core changes over time. Even on the 2000 series, an 8% cost lets you run it in Performance mode for better IQ than Quality mode gave you previously, so it will still end up running faster at better IQ even on those cards.
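Back-of-envelope on why that still nets out as a win: a rough sketch assuming GPU work scales with rendered pixel count, using a hypothetical native frame time and DLSS's usual per-axis render scales (0.667x Quality, 0.5x Performance). All numbers here are illustrative, not measured.

```python
# Rough model (all numbers hypothetical): render cost scales with pixel
# count (render_scale is per-axis, so pixels scale with its square), then
# the upscaler adds a fixed fraction of the native frame time.

def frame_time_ms(native_ms, render_scale, upscaler_cost_frac):
    return native_ms * render_scale ** 2 + native_ms * upscaler_cost_frac

native = 33.3  # hypothetical native 4K frame time (~30 fps)

quality_cnn = frame_time_ms(native, 0.667, 0.04)       # Quality, cheap CNN model
perf_transformer = frame_time_ms(native, 0.50, 0.08)   # Performance, pricier transformer

print(f"Quality + CNN:             {quality_cnn:.1f} ms")      # ~16.1 ms
print(f"Performance + transformer: {perf_transformer:.1f} ms")  # ~11.0 ms
```

Even with double the upscaler overhead, the lower render resolution more than pays for it, which is the whole point of the trade.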
 
The trend where with every new DLSS version folks finally realize the last wasn't as amazing as they've been saying continues. It's SO awesome. Wait, NOW it's awesome. No wait, NOW it finally fixes the HUGE issues we didn't care for before (also omg @ the lame competition not having it)! Etc. 🤷‍♂️
 
The trend where with every new DLSS version folks finally realize the last wasn't as amazing as they've been saying continues. It's SO awesome. Wait, NOW it's awesome. No wait, NOW it finally fixes the HUGE issues we didn't care for before (also omg @ the lame competition not having it)! Etc. 🤷‍♂️
That's not what's happening at all, but you're evidently not interested in anything factual, so have at it.
 
The trend where with every new DLSS version folks finally realize the last wasn't as amazing as they've been saying continues. It's SO awesome. Wait, NOW it's awesome. No wait, NOW it finally fixes the HUGE issues we didn't care for before (also omg @ the lame competition not having it)! Etc. 🤷‍♂️
I hear you, but I don't think that applies in this case. This is only comparing ray reconstruction, the criticisms of which have been popular since it came out. The analysis for RR was always "Wow. This is great! It also has some quirks that cause issues here, here and here, but still better than not using it." And now the DLSS4 version is "Wow, this is way better than DLSS3 RR, but also here's a couple small regressions here and here." This video doesn't include "not using RR at all" in the comparisons, which if it did, would show you why you want to use even the DLSS3 version in the first place.

For upscaling, all of these can be true at the same time:
1) DLSS4 TM makes DLSS3 look bad now (a generational leap)
2) DLSS3 is still better than DLSS2 and all other upscaling competition
3) DLSS2 is a generational leap compared to DLSS1
4) DLSS1 sucks
 
The trend where with every new DLSS version folks finally realize the last wasn't as amazing as they've been saying continues. It's SO awesome. Wait, NOW it's awesome. No wait, NOW it finally fixes the HUGE issues we didn't care for before (also omg @ the lame competition not having it)! Etc. 🤷‍♂️

I mean DLSS 4 looks consistently better than native at this point. It's pretty amazing.
 
Looking at that performance chart, I am SO fucking happy I stuck it out with my 1080 Ti and dodged those garbage 20 and 30 series cards to get the 4090 lmao love this beast.
 
The trend where with every new DLSS version folks finally realize the last wasn't as amazing as they've been saying continues. It's SO awesome. Wait, NOW it's awesome. No wait, NOW it finally fixes the HUGE issues we didn't care for before (also omg @ the lame competition not having it)! Etc. 🤷‍♂️

And when, "before", did we have AAA-scale graphical showcases with path tracing?
 
Ray reconstruction still looks bad when combined with upscaling; DLAA looks really good though. 1440p DLAA looks much better than 4K Quality.
 
Sheesh, that performance penalty.

[attached performance cost chart]

Yup, that checks out. When I tested CP2077 with path tracing on my 3060 Ti, I dropped from 60+ fps down to 39 fps with the transformer model ray reconstruction, while without path tracing the drop was only a few fps.
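As a quick sanity check on those numbers, frame time (not fps) is where a fixed per-frame cost shows up, using the poster's own 60 and 39 fps figures:

```python
# Convert the reported fps drop into frame-time terms, since per-frame
# costs like ray reconstruction add milliseconds, not fps.

fps_before, fps_after = 60.0, 39.0
ms_before = 1000.0 / fps_before            # ~16.7 ms per frame
ms_after = 1000.0 / fps_after              # ~25.6 ms per frame
added_ms = ms_after - ms_before            # ~9.0 ms of extra per-frame work
fps_drop_pct = 100.0 * (1 - fps_after / fps_before)  # 35% fps drop
```

A 35% fps drop sounds dramatic, but it corresponds to roughly 9 ms of added work per frame at that starting frame rate.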
 
Gosh, Nvidia is so hot right now. Why doesn't Sony partner with them for their console instead of that forgotten 2nd class manufacturer? I mean even the Switch runs Nvidia.
 
Ray reconstruction upgrade is nice, but the transformer upscaling is next level. I woulda expected that video first but hey!
 
The regression portion of the DF video says that standing still causes weird artifacts. Not a bad trade-off for a video game, unless you're standing still to stop and smell the roses.
 
Anyone with a 3080 notice stuttering when using the transformer model in Quality mode? I was getting random stuttering in RDR2 and Dead Space. No issues in Balanced mode.
 
Anyone with a 3080 notice stuttering when using the transformer model in Quality mode? I was getting random stuttering in RDR2 and Dead Space. No issues in Balanced mode.
Didn't notice it while playing Tokyo Xtreme Racer in quality mode.
 
They aren't very good partners.

grudges don't work in business

Nvidia's and Sony's relationship was rushed and a panic move by Sony.

Kutaragi was about to launch the PlayStation 3 with their internally made GPU, which was greatly underpowered because the fucking madman thought Cell was everything. Internally, devs said launching the console as is would be a massive mistake. They knocked on Nvidia's door without any time to make anything custom.

If Nintendo can, and if Microsoft is buying billions and billions in Nvidia AI hardware, there's no reason to hold up a 20-year-old story as any indication of how things would work out nowadays.

In fact, I think one of the two console manufacturers should break away from AMD so they stop sitting at near parity; the one with deeper pockets should. Nvidia's software is so good and so far ahead of the competition that there's only so much TFLOPS comparisons and price-margin talk can do against what it brings to the table.

By the way, what we see here is still beta. Nvidia sees a lot more improvement coming with this new model.
 
So I have to properly digest this on my monitor but we're definitely veering into 'can look shittier but more realistic' territory.

Personally, if stylization and drier skin look better than the real thing, I'll take the former.
 
grudges don't work in business

Nvidia's and Sony's relationship was rushed and a panic move by Sony.

Kutaragi was about to launch the PlayStation 3 with their internally made GPU, which was greatly underpowered because the fucking madman thought Cell was everything. Internally, devs said launching the console as is would be a massive mistake. They knocked on Nvidia's door without any time to make anything custom.

If Nintendo can, and if Microsoft is buying billions and billions in Nvidia AI hardware, there's no reason to hold up a 20-year-old story as any indication of how things would work out nowadays.

In fact, I think one of the two console manufacturers should break away from AMD so they stop sitting at near parity; the one with deeper pockets should. Nvidia's software is so good and so far ahead of the competition that there's only so much TFLOPS comparisons and price-margin talk can do against what it brings to the table.

By the way, what we see here is still beta. Nvidia sees a lot more improvement coming with this new model.
EVGA.
 
Crazy times - living in an age where PC gamers are advocating for poor initial renders that then get repaired/reconstructed and padded out with fake frames.

This would be unthinkable to the OGs. And yeah, I think the same about PSSR and FSR too.

I guess ultimately what it shows is that true /pcmr needs are the tiniest portion of the userbase (which is good, I guess), and most 'enthusiasts' are really not that bothered, treating the difference like physical Blu-ray vs streaming quality.
 
4090 performing better than 5090.
I'm reading that as a comparison of each GPU to itself using the different methods, rather than against each other, so I imagine the 5090's so-called 93% performance is still overall better than the 4090 result. Unless another chart actually comparing them in absolute numbers shows otherwise.
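To illustrate that relative-vs-absolute point with made-up baseline numbers (nothing here is a real benchmark figure):

```python
# Hypothetical baselines: a bigger relative % drop on one card doesn't
# mean it ends up slower than another card in absolute terms.

fps_5090_base = 100.0   # hypothetical 5090 baseline fps
fps_4090_base = 75.0    # hypothetical 4090 baseline fps

fps_5090 = fps_5090_base * 0.93   # the "93%" relative result from the chart
fps_4090 = fps_4090_base * 0.95   # a smaller relative drop on the 4090

# The 5090 still ends up faster in absolute fps despite the larger % hit.
```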
 
I think that's in comparison to themselves using the different methods, rather than against each other, so I imagine 5090's so called 93% performance is still overall better than the 4090 result. Unless another chart actually comparing them in absolute numbers shows otherwise.

I mean the % performance drop when you use DLSS4.

The 4090 has a smaller % drop; I think it's because it has the optical flow hardware, but that's just wild speculation. 🤡
 

Oh yeah, an AIB that threw a whole division under the bus because some old man didn't like waiting for final MSRP, and didn't even bother going to AMD to save jobs, is totally down to earth and totally points to Nvidia being unreasonable here :rolleyes:

You think AIBs like building a booth at CES to showcase a new AMD card, only for AMD to cancel the announcement? The press was even pre-briefed that RDNA 4 news would drop at the conference, with some slides on it presented to them. They're stuck on the floor with a card they cannot even talk about, nor say a price for?

The cards that will sit in warehouses for 2 months because AMD shifts the launch?

If we talk about AIB fuckery, there's a reason EVGA didn't even bother to knock on AMD's door.
 
What stands out most for me is that DLSS4 Performance or Balanced mode gives a more or less better image along with a performance boost and lower VRAM usage. Big fucking W.
 
Crazy times - living in an age where PC gamers are advocating for poor initial renders that then get repaired/reconstructed and padded out with fake frames.

This would be unthinkable to the OGs. And yeah, I think the same about PSSR and FSR too.

I guess ultimately what it shows is that true /pcmr needs are the tiniest portion of the userbase (which is good, I guess), and most 'enthusiasts' are really not that bothered, treating the difference like physical Blu-ray vs streaming quality.
How do you propose to play videogames at 4k? A 5090 can't do it without upscaling.
 
Crazy times - living in an age where PC gamers are advocating for poor initial renders that then get repaired/reconstructed

TAA already does this and does it often poorly. DLSS does it better in many cases.

So you have two choices: TAA or DLSS... TAA is native res and shit; DLSS saves performance and is superior, though not perfect, while being sub-native 🤷
 
It is funny how every time there is a new version of DLSS, DF claims huge upgrades in IQ and that the new version is perfect and better than native.

Just wait for DLSS5 so they can show the flaws of the previous version.
 
Nice, subtle improvements. A night-and-day difference it is not. You can barely tell in the video footage, and he has to zoom in massively to show it off. OK, Alex. It's also not without drawbacks; the new model looks worse in some areas.
 
It is funny how every time there is a new version of DLSS, DF claims huge upgrades in IQ and that the new version is perfect and better than native.

Just wait for DLSS5 so they can show the flaws of the previous version.
lol, it's what batusta, the nvidia shill, is there for.
 
It is funny how every time there is a new version of DLSS, DF claims huge upgrades in IQ and that the new version is perfect and better than native.

Just wait for DLSS5 so they can show the flaws of the previous version.
lol, it's what batusta, the nvidia shill, is there for.
It's funny how this is patently incorrect and they talk about the flaws every single time, just like in this video, but sure, let's go with this narrative.
 
It is funny how every time there is a new version of DLSS, DF claims huge upgrades in IQ and that the new version is perfect and better than native.

Just wait for DLSS5 so they can show the flaws of the previous version.

It was pretty much always true. DLSS2 beat native TAA in some games and beat all other reconstruction tech at the time. DLSS 3.5-3.8 beats TAA in many games and all other ML and reconstruction tech (PSSR, XeSS, TSR, FSR, TAAU, etc.).

Now the transformer model beats the 3.8 CNN model. Nvidia is competing with themselves at this point...
 