
Silent Hill 2 Remake Patch 1.06 removed PSSR on PS5 Pro to fix previous issues

kevboard

Member
Bbbut DLSS is better than native! How dare you!

it usually is. the ghosting in that example can probably be fixed by replacing the .dll file with a different DLSS version. sadly, most devs don't really test this shit, and you gotta fix it yourself. but that is thankfully as easy as drag & dropping the desired DLSS version into the game's folder. and people will usually find the best version fairly quickly.

it's not always the newest DLSS version that gives the best results. it's different for each game, and devs can fuck it up by shipping either the wrong version or the wrong preset for what their game works best with.

what the different presets can do to a game can be seen in the video below.
these are all recorded using DLSS 3.1.11; the only difference you see is the preset used for each mode.



notice the very obvious trail in Presets A, D and E in the first example, while the other presets have noticeably less trailing, and still look different from each other as well.
they all use the same quality mode and version number, just a different preset.

and if you look at all the examples he shows in that video, Preset C seems to have the best results across the situations shown.
but if you replaced the DLSS version, a different preset might work better. so if devs don't optimally set this, they can introduce these issues.
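The DLL swap described above is simple enough to script; here's a minimal sketch in Python. The `nvngx_dlss.dll` filename is the usual convention for DLSS games, but exactly where it lives varies per title, so treat the paths here as assumptions:

```python
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir: str, replacement_dll: str) -> Path:
    """Back up a game's bundled nvngx_dlss.dll and drop in another version.

    Returns the path of the backup so the swap can be reverted.
    """
    game_dll = Path(game_dir) / "nvngx_dlss.dll"
    if not game_dll.exists():
        raise FileNotFoundError(f"no DLSS DLL found in {game_dir}")
    backup = game_dll.parent / (game_dll.name + ".bak")
    shutil.copy2(game_dll, backup)           # keep the original so the swap stays reversible
    shutil.copy2(replacement_dll, game_dll)  # overwrite with the desired DLSS version
    return backup
```

Forcing a specific preset on top of a given version is typically done with tools like DLSSTweaks or NVIDIA Profile Inspector rather than a file swap.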
 

poppabk

Cheeks Spread for Digital Only Future
Am I wrong for feeling like there's more than a whiff of gate-keeping/protectionism about the way that PSSR not being perfect is being highlighted?

As if other forms of upscaling don't have their issues also, and that somehow PSSR is making previously "flawless" IQ worse.
I have a 6900XT, so I am stuck with FSR and have no need to gate-keep. This kind of joking is just an 'I told you so' after so many threads where people refused to listen to reason that DLSS wasn't going to be beaten out of the gate, and possibly never will be.
 

SlimySnake

Flashless at the Golden Globes
Just imagine SH2 running at 4K/80fps with pristine image quality, and baked lighting of higher quality than their piss-poor, low-res real-time GI. I blame UE5. It's the ultimate lazy tool.
Nonsense. the game ran fine at launch in the quality mode using Tsr on the base ps5. just because bloober are fucking morons and fucked up the quality version using pssr doesnt mean ue5 is the issue. i played the game on pc using dlss dropping down to 1080p internal resolution at times to mitigate their awful stuttering issues and it didnt have any of these shimmering issues.

btw, the stuttering issues we all attributed to UE5 were patched by bloober and the culprit was something called sky map generation. so yeah, its more bloober than ue5.

I always laugh at people blaming engines for incompetence. should we blame Decima for how poorly Until dawn ran at launch? literally 2 years after kzsf launched? if we are to blame ue4 for star wars jedi survivor just breaking apart thanks to its RT effects then how do we explain callisto protocol which uses even more rt effects and runs at native 4k 30 fps and 1440p 60 fps on the pro?

i think dev competency has played a huge role here. yes, pssr has issues, but GG, lords of the fallen devs, striking distance studios, and several others saw it coming from a mile away, and stayed away from it. Meanwhile morons who released straight up broken games like jedi survivor and bloober dont even have the brains to fix both modes and only ended up patching one of them.

P.S. Epic has improved UE5 GPU performance by 30-50%, allowing devs to run hardware Lumen on the same budget as software Lumen in UE 5.1. CPU performance has improved by 50-80%. We have hard metrics thanks to tests DF ran in the Matrix demo.

The devs just refused to build their games on the latest version, and while they might have their reasons (time, budget, complexity...), let's not blame UE5 since Epic has literally fixed these performance issues.
 

SweetTooth

Gold Member
It’s funny to see people who always shit-talked the Pro come out of the woodwork to thrash it when it has issues. They’re nowhere to be found in good implementations, but the moment it’s bad, they’re like rabid dogs on a carcass.

It's so damn funny, they are nowhere to be seen in the CP thread, but here they are lecturing people on how "PSSR is not ready" LMAO
 

ShakenG

Member
It’s funny to see people who always shit-talked the Pro come out of the woodwork to thrash it when it has issues. They’re nowhere to be found in good implementations, but the moment it’s bad, they’re like rabid dogs on a carcass.
Yeah, get used to that sort of behaviour. And it's far from one-sided... just part of the shit show.

It's obviously a bit premature or rushed in some cases. Developers shouldn't be just slapping it on without testing it and calling it a day.
 

rofif

Can’t Git Gud
Nonsense. the game ran fine at launch in the quality mode using Tsr on the base ps5. just because bloober are fucking morons and fucked up the quality version using pssr doesnt mean ue5 is the issue. i played the game on pc using dlss dropping down to 1080p internal resolution at times to mitigate their awful stuttering issues and it didnt have any of these shimmering issues.

btw, the stuttering issues we all attributed to UE5 were patched by bloober and the culprit was something called sky map generation. so yeah, its more bloober than ue5.

I always laugh at people blaming engines for incompetence. should we blame Decima for how poorly Until dawn ran at launch? literally 2 years after kzsf launched? if we are to blame ue4 for star wars jedi survivor just breaking apart thanks to its RT effects then how do we explain callisto protocol which uses even more rt effects and runs at native 4k 30 fps and 1440p 60 fps on the pro?

i think dev competency has played a huge role here. yes, pssr has issues, but GG, lords of the fallen devs, striking distance studios, and several others saw it coming from a mile away, and stayed away from it. Meanwhile morons who released straight up broken games like jedi survivor and bloober dont even have the brains to fix both modes and only ended up patching one of them.

P.S. Epic has improved UE5 GPU performance by 30-50%, allowing devs to run hardware Lumen on the same budget as software Lumen in UE 5.1. CPU performance has improved by 50-80%. We have hard metrics thanks to tests DF ran in the Matrix demo.

The devs just refused to build their games on the latest version, and while they might have their reasons (time, budget, complexity...), let's not blame UE5 since Epic has literally fixed these performance issues.
I don't disagree, but UE forces devs into using their stuff, which is hardware intensive. Doing it any other way makes it difficult to get where they want.
 

Markio128

Gold Member
Yeah, get used to that sort of behaviour. And it's far from one-sided... just part of the shit show.

It's obviously a bit premature or rushed in some cases. Developers shouldn't be just slapping it on without testing it and calling it a day.
I don’t mind those who provide honest advice/critique without going overboard (it’s genuinely helpful), but it’s those who are overly venomous all the time that make themselves sound like dicks.
 
DLSS 2.0 is from 2020 and it was good from the start. It improved over time, but it didn't have that many issues.

Dlss 1.0 from 2019 works very differently (and it was shit).

Console released 2 weeks ago but it was in development for years and so was PSSR.
But IIRC DLSS 2.0, when it came out, had issues with RT?

But that's beside the point, since people today are comparing PSSR to the latest iteration of DLSS.
 

Panajev2001a

GAF's Pleasant Genius
it usually is. the ghosting in that example can probably be fixed by replacing the .dll file with a different DLSS version. sadly, most devs don't really test this shit, and you gotta fix it yourself. but that is thankfully as easy as drag & dropping the desired DLSS version into the game's folder. and people will usually find the best version fairly quickly.

it's not always the newest DLSS version that gives the best results. it's different for each game, and devs can fuck it up by shipping either the wrong version or the wrong preset for what their game works best with.

what the different presets can do to a game can be seen in the video below.
these are all recorded using DLSS 3.1.11; the only difference you see is the preset used for each mode.



notice the very obvious trail in Presets A, D and E in the first example, while the other presets have noticeably less trailing, and still look different from each other as well.
they all use the same quality mode and version number, just a different preset.

and if you look at all the examples he shows in that video, Preset C seems to have the best results across the situations shown.
but if you replaced the DLSS version, a different preset might work better. so if devs don't optimally set this, they can introduce these issues.

That is not what was being reviewed: it was the PS5 Pro patch using PSSR, not users manually selecting the DLSS DLL depending on the section of the game they are in to choose the best one. PSSR also has different versions, and it is up to the dev, not users, to use the HW and SW tools properly.

It was a comical "… and DLSS is by far the better choice…" statement when the video on screen was showing quite terrible ghosting. This is a good follow-up to the SH2 botched patch (no extra RT, no higher quality settings, just a lower but fixed resolution with PSSR on top, call it a day), where DF claimed that dynamic resolution IS the cause of the frame rate drops and that Bloober made a good choice fixing the resolution target because it eliminates those stutters (sure buddy, the fact that this resolution target is lower than the window used before has nothing to do with it… let alone that DRS shortens stutters rather than creating them… cart before the horse…).
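On the DRS point: a dynamic resolution scaler lowers the render scale when GPU frame times approach the budget, which is why it tends to absorb load spikes rather than cause them. A toy proportional controller sketch, where the target, gain and bounds are made-up illustrative numbers:

```python
# Toy dynamic-resolution controller: nudge the render scale so the
# GPU frame time tracks a target budget. Gain and bounds are invented.
TARGET_MS = 16.6          # 60 fps frame budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def next_scale(scale: float, gpu_frame_ms: float) -> float:
    """One proportional step toward the frame-time budget, clamped to the DRS window."""
    error = (TARGET_MS - gpu_frame_ms) / TARGET_MS
    scale += 0.1 * error  # over budget -> shrink, under budget -> grow
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in (14.0, 20.0, 25.0, 18.0, 15.0):  # simulated GPU frame times
    scale = next_scale(scale, frame_ms)
    print(f"{frame_ms:4.1f} ms -> render scale {scale:.2f}")
```

A wider window (lower `MIN_SCALE`) gives the controller more room to react before a frame actually misses its budget, which is the argument against pinning the resolution.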
 

Bojji

Member
But IIRC DLSS 2.0, when it came out, had issues with RT?

But that's beside the point, since people today are comparing PSSR to the latest iteration of DLSS.

Of course they compare with the latest version.

But this narrative that DLSS 2.0 was poor at launch is untrue; it had minor issues compared to the issues PSSR has in some games. DLSS 1.0 was shit, and they jumped from shit (1.0) to very good (2.0).

As far as we know, PSSR might already be fixed and a version of that is in the hands of developers right now. The question is, why let them ship the bad version?
 
it's not worse than fsr2 was but it's not good in AW2. Remedy screwed something up
Have you played up to the city level where you get to play as Alan Wake for the first time? It's worse in this type of location, with the rain and all the angular geometry of the buildings.

In particular, on grid-like patterns like metal walkways and grates, PSSR reacts wildly with shimmer and pixel crawl as soon as you move your character... while in general it's arguably better now on Pro, the particular shimmer I'm describing is the most distracting yet...

It's not related to PSSR interacting with RT; it's PSSR on patterns and specular highlights... it just defeats the purpose of PSSR. How do we have games like Rebirth looking so beautiful with PSSR and games like AW2 that are still a mess?

BTW, I compared AW2 with another game that has been deemed a bad Pro port, Jedi Survivor, and there's nothing in Survivor close to as bad as the aliasing/artifacting in AW2, and that game also uses multiple RT features... I keep saying it: Jedi Survivor at 30 fps looks phenomenal on the Pro. I'm hoping the devs don't touch its Quality mode after DF's video, as they might fuck it up.

Another question I have... why did the SH2 devs roll back the 60 fps mode to FSR2 but leave the Quality mode on PSSR? They have to be about the laziest and dumbest devs we've yet seen bungle a Pro patch... did they just assume that people don't care about Quality mode? I betcha that's why they didn't bother with it. They knew PSSR was making it look worse in both modes, so it makes zero fucking sense not to make the same changes for both modes!

Devs that just slap on PSSR without making fuck all other improvements and simultaneously f'ing up the iq of their game should be lined up and kicked in the nuts.
 

Bojji

Member
The industry should just forget about AI upscaling and TAA. Everyone would profit. Except maybe NVIDIA.

You can run any modern game without TAA (if it lets you turn it off) and it will look like absolute garbage, even at 4K native. Even last-gen solutions like SMAA won't help much.

You get a sharper image for sure, but it's also a shimmering mess with a shit ton of aliasing. MSAA no longer works in modern engines, so you can forget about it.
 
It's kinda odd that most of the games PSSR does a good job on are old games, and anything newer and more demanding doesn't really work on the Pro. For those who forked out the cash, I hope they make it work; otherwise Sony fucked you over with their old-games advertising for the Pro.
 

DenchDeckard

Moderated wildly
It's kinda odd that most of the games PSSR does a good job on are old games, and anything newer and more demanding doesn't really work on the Pro. For those who forked out the cash, I hope they make it work; otherwise Sony fucked you over with their old-games advertising for the Pro.

It definitely seems like the games that run on older engines, or are cross-gen, look the best.

Not sure whats going on with the more demanding games and UE5 stuff. But UE5 sucks balls so....
 
Remedy is a clown developer.

Alan Wake 1 ran at 540p on Xbox 360.
Quantum Break ran at 720p with TAA, shoddily upscaled to 1080p, on the One (and the PC port was a shitshow).
Control is pretty meh and "shines" on PC because it was a perfect benchmark and marketing piece for DLSS (DLSS 1, then DLSS 2). (Just look at how many DF videos Alex Battaglia did with this game.)
Alan Wake 2 is a game developed purely for Nvidia GPUs.

All of this mixed into a stupid story: "woah, all our games are interconnected, it's the Remedy-Verse".
Control looks good on PC with only a few tweaks. It is pretty well optimized as well so it can be played native.
 

Elios83

Member
It's kinda odd that most games pssr does a good job are old games and anything newer and more demanding doesn't really work on the pro. For those who forked out the cash I hope they make it work, otherwise Sony fucked you over with their old games advertising on the pro.

This makes no technical sense.
It has nothing to do with how modern or new the engine is.
It has to do with the baseline you're using as an input.

These "demanding" games are often sub 1080p material on top of which low res, super noisy RT effects are applied as the starting point.
With this kind of input no upscaler can do magic, it's also unlikely that you can cover all cases with the same training without at least different profiles.
If you train the AI upscaler to not enhance RT noise in general you're then going to lose details in games that do not have these issues.

The issue is that there are tools, these tools are not magic, they can be improved, but it's up to developers to test their stuff and understand what's best for their games and adjust what they're doing.
In the rush of updating their games a few developers produced some bad apples, but overall the experience with the Pro is absolutely positive; at least mine has been.
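To put numbers on that baseline: the common upscaler quality modes render internally at a fixed fraction of the output resolution. The scale factors below are DLSS's published per-axis ratios; treating PSSR's modes as roughly comparable is an assumption, but it shows how quickly "demanding" games end up feeding the upscaler sub-1080p input:

```python
# Internal render resolution per upscaler quality mode.
# Ratios are DLSS's documented per-axis scale factors; assuming other
# upscalers (FSR, PSSR) use similar ratios is an approximation.
SCALE = {
    "Quality": 1 / 1.5,            # ~0.667 per axis
    "Balanced": 1 / 1.72,          # ~0.58 per axis
    "Performance": 1 / 2.0,        # 0.5 per axis
    "Ultra Performance": 1 / 3.0,  # ~0.333 per axis
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple:
    """Internal resolution the upscaler starts from for a given output."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALE:
    w, h = internal_res(3840, 2160, mode)
    print(f"4K {mode}: {w}x{h}")
```

This lines up with the "4K quality = 1440p internal" figures quoted elsewhere in the thread, and a 4K Performance image already starts from just 1080p, before any DRS drops below that.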
 
Of course they compare with the latest version.

But this narrative that DLSS 2.0 was poor at launch is untrue; it had minor issues compared to the issues PSSR has in some games. DLSS 1.0 was shit, and they jumped from shit (1.0) to very good (2.0).

As far as we know, PSSR might already be fixed and a version of that is in the hands of developers right now. The question is, why let them ship the bad version?
Oh I agree with you and I do remember DLSS 1.0 but I also distinctly remember v2 had issues at launch that progressively got better.

PSSR shouldn't have been launched in this state; it feels rushed to market.
 

Elios83

Member
Oh I agree with you and I do remember DLSS 1.0 but I also distinctly remember v2 had issues at launch that progressively got better.

PSSR shouldn't have been launched in this state; it feels rushed to market.

Why, given that it works perfectly in a lot of games, and no one forces developers to use it if they see that it doesn't yield the best possible results for their games?
In those cases the developers can simply take advantage of the extra GPU power and much higher ray tracing performance to do other stuff using other upscalers.

The main issue is some developers not testing their games and not making valid choices because they didn't have enough time, not PSSR itself, which is just an optional tool that will obviously continue to be improved.
 

twilo99

Member
It has to do with the baseline you're using as an input.

Right, but how are they supposed to raise that baseline while having to deal with outdated hardware?

FSR was supposed to "save" the xss and we know how that worked out, but at least there the price reflects the results..
 

Elios83

Member
Right, but how are they supposed to raise that baseline while having to deal with outdated hardware?

By taking advantage of the extra 40% raster power, increasing the DRS window or focusing on higher frame rates, taking advantage of 2x the ray tracing performance and not using PSSR if it doesn't work well with your games? Also asking Sony if they can help with customized/tailored profiles for the PSSR for future games.
 
Why, given that it works perfectly in a lot of games, and no one forces developers to use it if they see that it doesn't yield the best possible results for their games?
In those cases the developers can simply take advantage of the extra GPU power and much higher ray tracing performance to do other stuff using other upscalers.

The main issue is some developers not testing their games and not making valid choices because they didn't have enough time, not PSSR itself, which is just an optional tool that will obviously continue to be improved.
Yes there are cases where PSSR works (SB, F1, TLOU) but evidently there are cases where it doesn’t work at all.

Just like DLSS this needs to mature: model, implementation, tools, drivers all need to be better otherwise what is the point of it all?
 

Elios83

Member
Yes there are cases where PSSR works (SB, F1, TLOU) but evidently there are cases where it doesn’t work at all.

Just like DLSS this needs to mature: model, implementation, tools, drivers all need to be better otherwise what is the point of it all?

Ok but the point for me is not that PSSR needs improvements. It's obvious that no upscaler is magic, and DLSS produces artifacts as well, although some people on PC have tried to hide this reality, partly because they have the luxury of increasing base settings until the algorithm does a good job.

It will get better, and imo Sony will have to introduce different profiles for different needs (you can't handle completely different use cases with the same training: either you train it to make noisy, low-res RT reflections look good, or you tell the algorithm to make everything super crisp on the assumption that your input is decent).

The point is that PSSR is an optional tool and what happened with SH2 is fully on the developers.
It didn't take much to understand their input is too noisy to work well with PSSR.
Remedy at least chose a compromise: they're now running the performance mode with quality mode detail settings, and they think some artifacts are a good tradeoff.
But Bloober just fucked up. There is no benefit in their implementation.
So just focus on increasing the DRS window and removing frame rate issues as much as you can, and avoid this shitshow where PSSR is now disabled in one mode and they didn't care about Quality mode at all.
 

Bojji

Member
Ok but the point for me is not that PSSR needs improvements. It's obvious that no upscaler is magic, and DLSS produces artifacts as well, although some people on PC have tried to hide this reality, partly because they have the luxury of increasing base settings until the algorithm does a good job.

Damn. DF compared PSSR with DLSS using the same abysmal low resolutions (800p for AW2) and DLSS was much better than PSSR.
 

Elios83

Member
Damn. DF compared PSSR with DLSS using the same abysmal low resolutions (800p for AW2) and DLSS was much better than PSSR.

DLSS produces artifacts as well, and the lower the resolution, the worse the results.
There is no escape from that. Magic doesn't exist, even on PC.
 

Elios83

Member
It's not magic or free from issues but it handles even such low resolutions better than PSSR:

[DLSS vs PSSR comparison screenshots]

And people have pointed out situations where PSSR does things better than DLSS, which in turn makes branches on vegetation disappear, introduces ghosting and has other issues.

But PSSR vs DLSS is not my point, for me it's not a contest for people that want to see their preferred tech prevail at all costs.
The point is that people, especially on consoles, are new to this kind of technology and need to understand that AI-based upscalers are just a tool.
There is no magic: AI upscalers are generally good at creating the impression of a high-res, crisp image compared to traditional techniques, but they have their own specific set of shortcomings that leave parts of the image unstable.

Either people are educated on the reality behind a technology or expectations will be unrealistic.
 

Bojji

Member
And people have pointed out situations where PSSR does things better than DLSS, which in turn makes branches on vegetation disappear, introduces ghosting and has other issues.

But PSSR vs DLSS is not my point, for me it's not a contest for people that want to see their preferred tech prevail at all costs.
The point is that people, especially on consoles, are new to this kind of technology and need to understand that AI-based upscalers are just a tool.
There is no magic: AI upscalers are generally good at creating the impression of a high-res, crisp image compared to traditional techniques, but they have their own specific set of shortcomings that leave parts of the image unstable.

Either people are educated on the reality behind a technology or expectations will be unrealistic.

Fully agree. There were unrealistic expectations before launch; PSSR is just another reconstruction tech, but it was treated like something far beyond that.

We have a console that is 30-45% more powerful than the PS5, and that's what people should expect in games. Every game SHOULD be better than on PS5, but thanks to PSSR having issues and developers' brain-dead decisions, this is not always the case.
 

playbignbox

Member
It’s funny to see people who always shit-talked the Pro come out of the woodwork to thrash it when it has issues. They’re nowhere to be found in good implementations, but the moment it’s bad, they’re like rabid dogs on a carcass.
That's because this is happening in games that should specifically benefit from the PS5 Pro, such as games with low resolution or unstable frame rates. The other PS5 Pro updates, like Demon's Souls, already run very well on the base console.
 

Loboxxx

Member
What I do know is that if Guerrilla didn't use PSSR for Horizon, and it's the best-looking game on PS5 Pro, there must be a reason. They're Sony's technical reference and they ruled it out.

PSSR needs work to make it less power intensive; until then, it's probably best to rule it out.
 

MDSLKTR

Member
I can't believe I'm saying this.....Glad I played this before I got a Pro.
Eh, it's enjoyable enough for me now in performance mode; I just have to not look at puddles when outside lol.
Anyway, the way I see it, the game is too good not to be double-dipped on PC now or in the future.
 

SlimySnake

Flashless at the Golden Globes
But IIRC DLSS 2.0, when it came out, had issues with RT?

But that's beside the point, since people today are comparing PSSR to the latest iteration of DLSS.
nothing this bad. Cyberpunk was the first rt game with dlss 2.0 and i dont remember this kind of shimmering.

hell, even dlss 1.0, which was more like fsr and sony's checkerboarding solution, did not have these issues. i played Metro with its rtgi and had no issues with flickering foliage. Control came out a few months later and featured like 6 different rt techniques, and i ran that game at 864p internally, reconstructed to 1440p then upscaled by my 4k tv, and while it was blurry, it did not have the same flickering and shimmering we are seeing here.

i am going to install Anthem and test out dlss 1 because that was the first game i played with dlss and it handled all of that fancy foliage just fine at 1440p dlss. it was just as good as the checkerboarding solution in horizon zero dawn, days gone and death stranding. i dont think it was ever patched to dlss2.0 so it will be a good test.

again, i was there at the start and dlss was never this bad.

although some people on PC have tried to hide this reality also because they have the luxury to increase base settings until the algorithm does a good job.
nah, i have played games at dlss 4k quality (1440p internal) and 1440p dlss performance (720p internal) and it never produces this kind of shimmering.

On the contrary, jedi survivor runs well over 1440p internal resolution in quality mode and has the same shimmering. alan wake 2 in its rt mode runs at 1226p, close to 4k dlss balanced, and has the same shimmering on rt objects.

i posted this footage of cyberpunk running with full path tracing. so literally everything is ray traced; tell me if you are seeing the same shimmering here at 720p, a lower res than sh2, dragons dogma, alan wake 2 and star wars.



now i personally prefer dlss 4k quality because its much sharper than dlss performance, but the problem is that 4k pssr quality is not great either in rt games like jedi survivor. nor is 4k pssr balanced in AW2. it works just as well as dlss in demon souls, tlou1, tlou2 and ff7 rebirth, but those are games without RT. so clearly pssr is a fantastic piece of tech let down by its sony overlords, who didnt bother training it on rt games before launch.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
But IIRC DLSS 2.0, when it came out, had issues with RT?

But that's beside the point, since people today are comparing PSSR to the latest iteration of DLSS.

It's not beside the point. It's a point that everyone should remember. We can compare today's PSSR to DLSS 3.7 and that's fine. But let's not act as if the first generation of PSSR won't get better like DLSS did.

Of course they compare with the latest version.

But this narrative that DLSS 2.0 was poor at launch is untrue; it had minor issues compared to the issues PSSR has in some games. DLSS 1.0 was shit, and they jumped from shit (1.0) to very good (2.0).

As far as we know, PSSR might already be fixed and a version of that is in the hands of developers right now. The question is, why let them ship the bad version?

It's dumb to call today's version of PSSR the "bad" version when it's literally running great in games out right now! Maybe the devs need to learn how to use it. Stop blaming the tech when it's clearly not the issue.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
i posted this footage of cyberpunk running with full path tracing. so literally everything is ray traced; tell me if you are seeing the same shimmering here at 720p, a lower res than sh2, dragons dogma, alan wake 2 and star wars.



now i personally prefer dlss 4k quality because its much sharper than dlss performance, but the problem is that 4k pssr quality is not great either in rt games like jedi survivor. nor is 4k pssr balanced in AW2. it works just as well as dlss in demon souls, tlou1, tlou2 and ff7 rebirth, but those are games without RT. so clearly pssr is a fantastic piece of tech let down by its sony overlords, who didnt bother training it on rt games before launch.


See, if Cyberpunk 2077 looked like this on the PS5 Pro, I'd buy it immediately! And this is SlimySnake running it at 720p internally, then upscaling to 1440p. And from what I could see it was running at 50-60fps. That I'd accept. EASILY!
 

SlimySnake

Flashless at the Golden Globes
It's kinda odd that most of the games PSSR does a good job on are old games, and anything newer and more demanding doesn't really work on the Pro. For those who forked out the cash, I hope they make it work; otherwise Sony fucked you over with their old-games advertising for the Pro.
it's not that odd if you look at what these next gen games are doing. most of these games are pushing next gen rt features while older games just dont. others are pushing way more complex geometry and foliage compared to last gen games.

there is also heavy usage of dynamic elements like volumetric fog, weather effects that result in moving foliage, trees, and other objects. anything dynamic and you are going to have issues with pssr. dlss also struggled with ghosting for a long time with moving objects like crows in avatar but they nailed down the shimmering from the very beginning. especially in rt games since they were well aware of just how much dlss can mess with their rt denoisers and vice versa.

whereas pssr was likely only trained on last gen sony first party games like tlou, which had rt baked into textures... it's way easier to resolve lighting, ao and shadow detail that's basically a texture instead of being calculated in realtime by the gpu.

you can blame sony first party studios for their focus on cross gen games, and lack of rt usage. Insomniac is the only one that utilized rt but they also passed on rtgi and had a very limited rt reflection implementation that doesnt apply to all objects like it does in AW2 and Callisto. sony probably trained pssr on spiderman 2 and ratchet, and figured hey this is good enough, but now they are realizing that even those games had a very limited RT feature set compared to other games by third parties.
 
Every time this thread gets bumped some delusion in me kicks into high gear and expects some miraculous "JK guyz here's the real patch lulz" tweet being posted...

Sad Tears GIF
 

Dorfdad

Gold Member
Newer versions with up-to-date SDKs should hopefully improve the situation. Not to overblow it, but there do seem to be some issues with specific implementations currently; no need to deny it. It can be argued that Sony's algorithm was not quite ready for launch. I miss the days when consoles were consoles, not freaking lite PCs in dire need of continual updates for things to work as intended.
They tried to get that holiday shopping dollar, and in before the Switch 2.
 
This makes no technical sense.
It has nothing to do with how modern or new the engine is.
It has to do with the baseline you're using as an input.

These "demanding" games are often sub 1080p material on top of which low res, super noisy RT effects are applied as the starting point.
With this kind of input no upscaler can do magic; it's also unlikely that you can cover all cases with the same training without at least different profiles.

it's not that odd if you look at what these next gen games are doing. most of these games are pushing next gen rt features while older games just dont. others are pushing way more complex geometry and foliage compared to last gen games.

there is also heavy usage of dynamic elements like volumetric fog, weather effects that result in moving foliage, trees, and other objects. anything dynamic and you are going to have issues with pssr. dlss also struggled with ghosting for a long time with moving objects like crows in avatar but they nailed down the shimmering from the very beginning. especially in rt games since they were well aware of just how much dlss can mess with their rt denoisers and vice versa.

whereas pssr was likely only trained on last gen sony first party games like tlou, which had rt baked into textures... it's way easier to resolve lighting, ao and shadow detail that's basically a texture instead of being calculated in realtime by the gpu.

you can blame sony first party studios for their focus on cross gen games, and lack of rt usage. Insomniac is the only one that utilized rt but they also passed on rtgi and had a very limited rt reflection implementation that doesnt apply to all objects like it does in AW2 and Callisto. sony probably trained pssr on spiderman 2 and ratchet, and figured hey this is good enough, but now they are realizing that even those games had a very limited RT feature set compared to other games by third parties.
I was being nice to spare Pro owners' feelings. You explained the obvious, but hey, blame the devs. I don't expect much improvement on the PS5 Pro, but maybe magic happens; it's a good testing ground for the PS6.
 

marquimvfs

Member
it usually is. the ghosting in that example can probably be fixed by replacing the .dll file with a different DLSS version. sadly, most devs don't really test this shit, and you gotta fix it yourself. but that is thankfully as easy as drag & dropping the desired DLSS version into the game's folder. and people will usually find the best version fairly quickly.

it's not always the newest DLSS version that gives the best results. it's different for each game, and devs can fuck it up by shipping either the wrong version or the wrong preset for what their game works best with.

what the different presets can do to a game can be seen in the video below.
these are all recorded using DLSS 3.1.11; the only difference you see is the preset used for each mode.



notice the very obvious trail in Presets A, D and E in the first example, while the other presets have noticeably less trailing, and still look different from each other as well.
they all use the same quality mode and version number, just a different preset.

and if you look at all the examples he shows in that video, Preset C seems to have the best results across the situations shown.
but if you replaced the DLSS version, a different preset might work better. so if devs don't optimally set this, they can introduce these issues.

I agree that DLSS is a good thing, and it's extremely easy to use. I just don't agree that it is, in any version or use case, better than native; that's just lunacy. There's always some unresolved detail, some smearing, some minor problem that may be imperceptible in fast movement but that does show up under proper investigation. It may fix some aliasing better than the original filter, yes, but that doesn't make it "better than native".
 